Sample records for automated quality control

  1. A Validity-Based Approach to Quality Control and Assurance of Automated Scoring

    ERIC Educational Resources Information Center

    Bejar, Isaac I.

    2011-01-01

    Automated scoring of constructed responses is already operational in several testing programmes. However, as the methodology matures and the demand for the utilisation of constructed responses increases, the volume of automated scoring is likely to increase at a fast pace. Quality assurance and control of the scoring process will likely be more…

  2. Rapid toxicity detection in water quality control utilizing automated multispecies biomonitoring for permanent space stations

    NASA Technical Reports Server (NTRS)

    Morgan, E. L.; Young, R. C.; Smith, M. D.; Eagleson, K. W.

    1986-01-01

    The objective of this study was to evaluate proposed design characteristics and applications of automated biomonitoring devices for real-time toxicity detection in water quality control on-board permanent space stations. Tests of downlinking automated biomonitoring data to Earth-receiving stations were simulated using satellite data transmissions from remote Earth-based stations.

  3. Automation of testing modules of controller ELSY-TMK

    NASA Astrophysics Data System (ADS)

    Dolotov, A. E.; Dolotova, R. G.; Petuhov, D. V.; Potapova, A. P.

    2017-01-01

    Modern automation tools make it possible to maintain high quality standards for released products and to raise labour efficiency. This paper presents data on the automation of the test process for the ELSY-TMK controller [1]. The ELSY-TMK programmable logic controller is an effective modular platform for building automation systems for small and medium-scale industrial production. The controller's modern, functional communication standard and open environment provide a powerful tool for a wide spectrum of industrial automation applications. The algorithm allows controller modules to be tested, by operating the switching system and external devices, faster and with higher quality than a human could achieve without such means.

  4. Quality Work, Quality Control in Technical Services.

    ERIC Educational Resources Information Center

    Horny, Karen L.

    1985-01-01

    Quality in library technical services is explored in light of changes produced by automation. Highlights include a definition of quality; new opportunities and shifting priorities; cataloging (fullness of records, heading consistency, accountability, local standards, automated checking); need for new skills (management, staff); and boons of…

  5. Automated image quality assessment for chest CT scans.

    PubMed

    Reeves, Anthony P; Xie, Yiting; Liu, Shuang

    2018-02-01

    Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
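
    A minimal sketch of the region-based noise and calibration measurement described above, assuming the three homogeneous regions have already been segmented; the expected CT numbers and the tolerance are illustrative, not the published acceptance limits.

      import numpy as np

      # Nominal CT numbers (HU) for the three reference regions named in the
      # abstract; the +/- 25 HU tolerance is an illustrative placeholder.
      EXPECTED_HU = {"external_air": -1000.0, "trachea_air": -1000.0, "aorta_blood": 40.0}

      def region_quality(image, masks, tol_hu=25.0):
          """Per-region mean (calibration), std (noise), and a pass/fail flag."""
          report = {}
          for name, mask in masks.items():
              vals = image[mask]
              mean, noise = float(vals.mean()), float(vals.std())
              report[name] = (mean, noise, abs(mean - EXPECTED_HU[name]) <= tol_hu)
          return report

      # Toy usage with synthetic data standing in for a segmented scan.
      img = np.full((64, 64), -1000.0)
      img[20:30, 20:30] = 40.0 + np.random.normal(0.0, 5.0, (10, 10))
      masks = {"external_air": img < -500, "trachea_air": img < -500,
               "aorta_blood": img > -500}
      print(region_quality(img, masks))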

  6. Analysis And Control System For Automated Welding

    NASA Technical Reports Server (NTRS)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  7. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and Internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to audiences. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden problems in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
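
    The speed experiments above rely on parallel processing; below is a minimal sketch of farming per-file QC checks out to worker processes. The file names and the stub check are placeholders, not part of the described system.

      from multiprocessing import Pool

      def check_file(path):
          # Run the per-file checks (black frames, silence, loudness, etc.) here;
          # this stub simply passes every file.
          return path, "pass"

      if __name__ == "__main__":
          files = ["clip_%03d.mxf" % i for i in range(8)]   # hypothetical media files
          with Pool(processes=4) as pool:                    # distribute across workers
              for path, verdict in pool.map(check_file, files):
                  print(path, verdict)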

  8. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    PubMed

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: Automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
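
    A hedged sketch of the wavelet idea above, using the PyWavelets package: decompose a phantom image and flag small bright features in the finest detail sub-bands, roughly as one would hunt for microcalcification-like specks. The wavelet, decomposition level, and threshold are illustrative, not the paper's validated settings.

      import numpy as np
      import pywt

      def fine_detail_energy(image, wavelet="db2", level=3):
          """Energy of the finest-scale 2-D wavelet detail coefficients."""
          coeffs = pywt.wavedec2(np.asarray(image, float), wavelet, level=level)
          ch, cv, cd = coeffs[-1]                  # finest-scale detail sub-bands
          return np.sqrt(ch**2 + cv**2 + cd**2)

      def detect_specks(image, k=5.0):
          """Boolean map (at sub-band resolution) of candidate bright specks."""
          e = fine_detail_energy(image)
          return e > e.mean() + k * e.std()

      # Toy usage: two synthetic bright specks on an empty background.
      img = np.zeros((128, 128))
      img[40, 40] = img[80, 90] = 200.0
      print(np.argwhere(detect_specks(img)))      # coordinates are half-resolution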

  9. QCloud: A cloud-based quality control system for mass spectrometry-based proteomics laboratories

    PubMed Central

    Chiva, Cristina; Olivella, Roger; Borràs, Eva; Espadas, Guadalupe; Pastor, Olga; Solé, Amanda

    2018-01-01

    The increasing number of biomedical and translational applications in mass spectrometry-based proteomics poses new analytical challenges and raises the need for automated quality control systems. Despite previous efforts to set standard file formats, data processing workflows and key evaluation parameters for quality control, automated quality control systems are not yet widespread among proteomics laboratories, which limits the acquisition of high-quality results, inter-laboratory comparisons and the assessment of variability of instrumental platforms. Here we present QCloud, a cloud-based system to support proteomics laboratories in daily quality assessment using a user-friendly interface, easy setup, automated data processing and archiving, and unbiased instrument evaluation. QCloud supports the most common targeted and untargeted proteomics workflows, accepts data formats from different vendors, and enables the annotation of acquired data and the reporting of incidents. A complete version of the QCloud system has been successfully developed and is now open to the proteomics community (http://qcloud.crg.eu). The QCloud system is an open-source project, publicly available under a Creative Commons Attribution-ShareAlike 4.0 License. PMID:29324744

  10. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    ... that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and ... domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide ... observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  11. Control and automation of the Pegasus multi-point Thomson scattering system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bodner, G. M., E-mail: gbodner@wisc.edu; Bongard, M. W.; Fonck, R. J.

    A new control system for the Pegasus Thomson scattering diagnostic has recently been deployed to automate the laser operation, data collection process, and interface with the system-wide Pegasus control code. Automation has been extended to areas outside of data collection, such as manipulation of beamline cameras and remotely controlled turning mirror actuators to enable intra-shot beam alignment. Additionally, the system has been upgraded with a set of fast (∼1 ms) mechanical shutters to mitigate contamination from background light. Modification and automation of the Thomson system have improved both data quality and diagnostic reliability.

  12. Control and automation of the Pegasus multi-point Thomson scattering system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bodner, Grant M.; Bongard, Michael W.; Fonck, Raymond J.

    A new control system for the Pegasus Thomson scattering diagnostic has recently been deployed to automate the laser operation, data collection process, and interface with the system-wide Pegasus control code. Automation has been extended to areas outside of data collection, such as manipulation of beamline cameras and remotely controlled turning mirror actuators to enable intra-shot beam alignment. In addition, the system has been upgraded with a set of fast (~1 ms) mechanical shutters to mitigate contamination from background light. Modification and automation of the Thomson system have improved both data quality and diagnostic reliability.

  13. Control and automation of the Pegasus multi-point Thomson scattering system

    DOE PAGES

    Bodner, Grant M.; Bongard, Michael W.; Fonck, Raymond J.; ...

    2016-08-12

    A new control system for the Pegasus Thomson scattering diagnostic has recently been deployed to automate the laser operation, data collection process, and interface with the system-wide Pegasus control code. Automation has been extended to areas outside of data collection, such as manipulation of beamline cameras and remotely controlled turning mirror actuators to enable intra-shot beam alignment. In addition, the system has been upgraded with a set of fast (~1 ms) mechanical shutters to mitigate contamination from background light. Modification and automation of the Thomson system have improved both data quality and diagnostic reliability.

  14. The automated system for technological process of spacecraft's waveguide paths soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Murygin, A. V.; Emilova, O. A.; Bocharov, A. N.; Laptenok, V. D.

    2016-11-01

    The paper addresses automated process control of the induction soldering of spacecraft waveguide paths. The peculiarities of the induction soldering process are analyzed and the need to automate the information-control system is identified. The developed automated system controls the product heating process by varying the power supplied to the inductor, based on information about the soldering-zone temperature, and stabilizes the temperature in a narrow range above the melting point of the solder but below the melting point of the waveguide. Automating the soldering process in this way improves the quality of the waveguides and eliminates burn-throughs. The article shows a block diagram of the software system, which consists of five modules, and describes its main algorithm. It also describes the operation of the automated waveguide-path soldering system, explaining the system's basic functions and limitations. The developed software allows configuration of the measurement equipment, setting and changing of soldering process parameters, and viewing of the temperature graphs recorded by the system. Results of experimental studies are presented that demonstrate high-quality control of the soldering process and the system's applicability to automation tasks.
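
    A minimal sketch of the temperature-band control idea described above: hold the soldering-zone temperature above the solder melting point but below the waveguide melting point by adjusting inductor power. The melting points, band limits, and gain are hypothetical placeholders, not values from the paper.

      # Hypothetical material limits and control band between them.
      SOLDER_MELT_C = 220.0        # assumed solder melting point
      WAVEGUIDE_MELT_C = 660.0     # assumed waveguide (aluminium) melting point
      T_LOW, T_HIGH = 230.0, 250.0 # narrow band: above solder, far below waveguide

      def inductor_power(temp_c, p_min=0.0, p_max=1.0, gain=0.02):
          """Proportional control toward the middle of the band, clipped to limits."""
          target = 0.5 * (T_LOW + T_HIGH)
          p = gain * (target - temp_c)
          return max(p_min, min(p_max, p))

      for t in (210.0, 240.0, 260.0):   # cold, in-band, too hot
          print(t, inductor_power(t))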

  15. [Use of the MS2 automated system in the determination of the activity of 5 antiseptic drugs: quality control].

    PubMed

    Vincent, F; Guyomard, S; Goury, V; Darbord, J C

    1987-06-01

    The growth curves of Klebsiella pneumoniae and Staphylococcus aureus in the presence of five antiseptics, established using an Abbott MS2 system, are presented. From our results, the advantages of automation, after adaptation of the method for the determination of bactericidal properties, are examined. This technique may be proposed for the quality control of such drugs.

  16. Can the Roche hemolysis index be used for automated determination of cell-free hemoglobin? A comparison to photometric assays.

    PubMed

    Petrova, Darinka Todorova; Cocisiu, Gabriela Ariadna; Eberle, Christoph; Rhode, Karl-Heinz; Brandhorst, Gunnar; Walson, Philip D; Oellerich, Michael

    2013-09-01

    The aim of this study was to develop a novel method for automated quantification of cell-free hemoglobin (fHb) based on the hemolysis index (HI; Roche Diagnostics). The novel HI-based fHb method was correlated with fHb measured using the triple-wavelength methods of both Harboe [fHb, g/L = (0.915 * HI + 2.634)/100] and Fairbanks et al. [fHb, g/L = (0.917 * HI + 2.131)/100]. fHb concentrations were estimated from the HI using the Roche Modular automated platform in self-made and commercially available quality controls, as well as in samples from a proficiency testing scheme (INSTAND). The fHb results from the Roche automated HI method were then compared to results obtained using the traditional spectrophotometric assays for one hundred plasma samples with varying degrees of hemolysis, lipemia and/or bilirubinemia. The novel method using automated HI quantification on the Roche Modular clinical chemistry platform correlated well with the classical methods in the 100 patient samples (Harboe: r = 0.9284; Fairbanks et al.: r = 0.9689) and recovery was good for self-made controls. However, commercially available quality controls showed poor recovery due to an unidentified matrix problem. The novel method produced reliable determination of fHb in samples without interferences. However, poor recovery with commercially available fHb quality control samples currently greatly limits its usefulness. © 2013.
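
    The two regressions quoted above translate directly into code; a minimal sketch follows, with the coefficients taken from the abstract and the example HI readings purely hypothetical.

      def fhb_harboe(hi):
          """Cell-free hemoglobin (g/L) from the hemolysis index, Harboe fit."""
          return (0.915 * hi + 2.634) / 100.0

      def fhb_fairbanks(hi):
          """Cell-free hemoglobin (g/L) from the hemolysis index, Fairbanks fit."""
          return (0.917 * hi + 2.131) / 100.0

      for hi in (10, 50, 200):   # hypothetical hemolysis index readings
          print(hi, round(fhb_harboe(hi), 3), round(fhb_fairbanks(hi), 3))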

  17. Spaceport Command and Control System Automated Testing

    NASA Technical Reports Server (NTRS)

    Stein, Meriel

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and the Space Launch System, the next-generation manned rocket currently in development. This large system requires high-quality testing that will properly measure the capabilities of the system. Automating the test procedures would save the project time and money. Therefore, the Electrical Engineering Division at Kennedy Space Center (KSC) has recruited interns for the past two years to work alongside full-time engineers to develop these automated tests, as well as innovate upon the current automation process.

  18. Spaceport Command and Control System Automation Testing

    NASA Technical Reports Server (NTRS)

    Hwang, Andrew

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and the Space Launch System, the next-generation manned rocket currently in development. This large system requires high-quality testing that will properly measure the capabilities of the system. Automating the test procedures would save the project time and money. Therefore, the Electrical Engineering Division at Kennedy Space Center (KSC) has recruited interns for the past two years to work alongside full-time engineers to develop these automated tests, as well as innovate upon the current automation process.

  19. Automated standardization technique for an inductively-coupled plasma emission spectrometer

    USGS Publications Warehouse

    Garbarino, John R.; Taylor, Howard E.

    1982-01-01

    The manifold assembly subsystem described permits real-time computer-controlled standardization and quality control of a commercial inductively-coupled plasma atomic emission spectrometer. The manifold assembly consists of a branch-structured glass manifold, a series of microcomputer-controlled solenoid valves, and a reservoir for each standard. Automated standardization involves selective actuation of each solenoid valve, which permits a specific mixed standard solution to be pumped to the nebulizer of the spectrometer. Quality control is based on the evaluation of results obtained for a mixed standard containing 17 analytes that is measured periodically with unknown samples. An inaccurate standard evaluation triggers restandardization of the instrument according to a predetermined protocol. Interaction of the computer-controlled manifold assembly hardware with the spectrometer system is outlined. Evaluation of the automated standardization system with respect to reliability, simplicity, flexibility, and efficiency is compared to the manual procedure. © 1982.
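
    A minimal sketch of the QC-triggered restandardization logic described above; the recovery tolerance and the callables standing in for instrument actions are hypothetical, not the paper's protocol.

      def qc_ok(measured, certified, tol=0.05):
          """True if every analyte recovers within +/- tol (fractional) of its certified value."""
          return all(abs(m - c) <= tol * c for m, c in zip(measured, certified))

      def run_qc_cycle(measure_standard, restandardize, certified):
          """Measure the mixed standard; restandardize if any analyte drifts."""
          if not qc_ok(measure_standard(), certified):
              restandardize()

      # Toy usage: the second analyte has drifted, so restandardization fires.
      certified = [10.0, 2.5, 40.0]
      run_qc_cycle(lambda: [10.1, 2.9, 39.5],                     # pumped via solenoid valves
                   lambda: print("restandardizing per protocol"),
                   certified)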

  20. Quality of closed chest compression on a manikin in ambulance vehicles and flying helicopters with a real time automated feedback.

    PubMed

    Havel, Christof; Schreiber, Wolfgang; Trimmel, Helmut; Malzer, Reinhard; Haugk, Moritz; Richling, Nina; Riedmüller, Eva; Sterz, Fritz; Herkner, Harald

    2010-01-01

    Automated verbal and visual feedback improves the quality of resuscitation in out-of-hospital cardiac arrest and has been shown to increase short-term survival. Quality of resuscitation may be hampered in more difficult situations such as emergency transportation. Currently there is no evidence on whether feedback devices can improve resuscitation quality during different modes of transportation. Objective: To assess the effect of real-time automated feedback on the quality of resuscitation in an emergency transportation setting. Design: Randomised cross-over trial. Setting: Medical University of Vienna, Vienna Municipal Ambulance Service and Helicopter Emergency Medical Service Unit (Christophorus Flugrettungsverein) in September 2007. Participants: European Resuscitation Council (ERC) certified health care professionals performing CPR in a flying helicopter and in a moving ambulance vehicle on a manikin with human-like chest properties. Intervention: CPR sessions with real-time automated feedback, with standard CPR without feedback as control. Main outcome measure: Quality of chest compression during resuscitation. Results: Feedback resulted in less deviation from the ideal compression rate of 100 min⁻¹ (9 ± 9 min⁻¹, p < 0.0001), with this effect becoming steadily larger over time. Applied work was less in the feedback group compared to controls (373 ± 448 cm × compressions; p < 0.001). Feedback did not influence compression depth significantly. There was some indication of a learning effect from the feedback device. Conclusions: Real-time automated feedback improves certain aspects of CPR quality in flying helicopters and moving ambulance vehicles. The effect of feedback guidance was most pronounced for chest compression rate. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
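
    A minimal sketch of rate feedback against the 100 min⁻¹ target used in the study; the tolerance band and coaching messages are illustrative, not those of the device evaluated.

      def rate_feedback(compression_times_s, target=100.0, tol=10.0):
          """Yield a coaching message after each compression interval."""
          for t0, t1 in zip(compression_times_s, compression_times_s[1:]):
              rate = 60.0 / (t1 - t0)            # compressions per minute
              if rate < target - tol:
                  yield "push faster"
              elif rate > target + tol:
                  yield "push slower"
              else:
                  yield "good rate"

      times = [0.0, 0.55, 1.15, 1.65, 2.3]       # hypothetical compression timestamps (s)
      print(list(rate_feedback(times)))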

  1. Development and implementation of an automated quantitative film digitizer quality control program

    NASA Astrophysics Data System (ADS)

    Fetterly, Kenneth A.; Avula, Ramesh T. V.; Hangiandreou, Nicholas J.

    1999-05-01

    A semi-automated, quantitative film digitizer quality control program that is based on the computer analysis of the image data from a single digitized test film was developed. This program includes measurements of the geometric accuracy, optical density performance, signal to noise ratio, and presampled modulation transfer function. The variability of the measurements was less than plus or minus 5%. Measurements were made on a group of two clinical and two laboratory laser film digitizers during a trial period of approximately four months. Quality control limits were established based on clinical necessity, vendor specifications and digitizer performance. During the trial period, one of the digitizers failed the performance requirements and was corrected by calibration.

  2. Designing, Evaluating, and Deploying Automated Scoring Systems with Validity in Mind: Methodological Design Decisions

    ERIC Educational Resources Information Center

    Rupp, André A.

    2018-01-01

    This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…

  3. Classification Trees for Quality Control Processes in Automated Constructed Response Scoring.

    ERIC Educational Resources Information Center

    Williamson, David M.; Hone, Anne S.; Miller, Susan; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, the issue of monitoring the scoring process becomes a primary concern, particularly when the goal is to have automated scoring operate completely unassisted by humans. Using a vignette from the Architectural Registration Examination and data for 326 cases with both human…

  4. Network-based production quality control

    NASA Astrophysics Data System (ADS)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

    This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. The recent emphasis on product quality and waste reduction stems from the dynamic, globalized, customer-driven market, which brings opportunities and threats to companies depending on their response speed and production strategies. Current trends in industry also include the wide spread of distributed manufacturing systems, in which design, production, and management facilities are geographically dispersed. This situation mandates not only accessibility to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete in such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing," or "EQM" for short. By definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such systems let designers located far away from the production facility monitor, control and adjust quality inspection processes as the production design evolves.

  5. A Novel Approach for Enhancement of Automobile Clutch Engagement Quality Using Mechatronics Based Automated Clutch System

    NASA Astrophysics Data System (ADS)

    Tripathi, K.

    2013-01-01

    In an automated manual clutch (AMC), a mechatronic system controls the clutch force trajectory through an actuator governed by a control system. The present study identifies relevant characteristics of this trajectory and their effects on driveline dynamics and engagement quality. A new type of force trajectory is identified which gives good engagement quality. This trajectory is not achievable through a conventional clutch control mechanism, but in an AMC a mechatronic system based on electro-hydraulic or electro-mechanical elements can make it feasible. A mechatronic add-on system is presented that can implement the novel force trajectory without requiring replacement of the traditional diaphragm-spring clutch in a vehicle with manual transmission.

  6. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.
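
    A minimal sketch of collating raw metrics into [0, 1] quality scores for an overview row, in the spirit of PTXQC's heatmap; the metric names and worst/best anchors are invented stand-ins, not PTXQC's actual metrics or scoring functions.

      def score(value, worst, best):
          """Linearly map a raw QC metric onto [0, 1], where 1 is best.

          Works whether 'best' is the high or the low end of the raw scale.
          """
          s = (value - worst) / (best - worst)
          return max(0.0, min(1.0, s))

      # Hypothetical per-file metrics collated into one overview row.
      overview = {
          "ms1_fill_time": score(45.0, worst=250.0, best=10.0),
          "missed_cleavage_fraction": score(0.12, worst=0.5, best=0.0),
      }
      print(overview)   # ~0.85 and 0.76: both reasonably healthy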

  7. Taking Over Control From Highly Automated Vehicles in Complex Traffic Situations: The Role of Traffic Density.

    PubMed

    Gold, Christian; Körber, Moritz; Lechner, David; Bengler, Klaus

    2016-06-01

    The aim of this study was to quantify the impact of traffic density and verbal tasks on takeover performance in highly automated driving. In highly automated vehicles, the driver has to occasionally take over vehicle control when approaching system limits. To ensure safety, the ability of the driver to regain control of the driving task under various driving situations and different driver states needs to be quantified. Seventy-two participants experienced takeover situations requiring an evasive maneuver on a three-lane highway with varying traffic density (zero, 10, and 20 vehicles per kilometer). In a between-subjects design, half of the participants were engaged in a verbal 20-Questions Task, representing speaking on the phone while driving in a highly automated vehicle. The presence of traffic in takeover situations led to longer takeover times and worse takeover quality in the form of shorter time to collision and more collisions. The 20-Questions Task did not influence takeover time but seemed to have minor effects on the takeover quality. For the design and evaluation of human-machine interaction in takeover situations of highly automated vehicles, the traffic state seems to play a major role, compared to the driver state, manipulated by the 20-Questions Task. The present results can be used by developers of highly automated systems to appropriately design human-machine interfaces and to assess the driver's time budget for regaining control. © 2016, Human Factors and Ergonomics Society.

  8. Computer program CDCID: an automated quality control program using CDC update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singer, G.L.; Aguilar, F.

    1984-04-01

    A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.

  9. Upgrades to the NOAA/NESDIS automated Cloud-Motion Vector system

    NASA Technical Reports Server (NTRS)

    Nieman, Steve; Menzel, W. Paul; Hayden, Christopher M.; Wanzong, Steve; Velden, Christopher S.

    1993-01-01

    The latest version of the automated cloud motion vector software has yielded significant improvements in the quality of the GOES cloud-drift winds produced operationally by NESDIS. Cloud motion vectors resulting from the automated system are now equal or superior in quality to those which had the benefit of manual quality control a few years ago. The single most important factor in this improvement has been the upgraded auto-editor. Improved tracer selection procedures eliminate targets in difficult regions and allow a higher target density and therefore enhanced coverage in areas of interest. The incorporation of the H2O-intercept height assignment method allows an adequate representation of the heights of semi-transparent clouds in the absence of a CO2-absorption channel. Finally, GOES-8 water-vapor motion winds resulting from the automated system are superior to any done previously by NESDIS and should now be considered as an operational product.

  10. Quality Control of Structural MRI Images Applied Using FreeSurfer—A Hands-On Workflow to Rate Motion Artifacts

    PubMed Central

    Backhausen, Lea L.; Herting, Megan M.; Buse, Judith; Roessner, Veit; Smolka, Michael N.; Vetter, Nora C.

    2016-01-01

    In structural magnetic resonance imaging motion artifacts are common, especially when not scanning healthy young adults. It has been shown that motion affects the analysis with automated image-processing techniques (e.g., FreeSurfer). This can bias results. Several developmental and adult studies have found reduced volume and thickness of gray matter due to motion artifacts. Thus, quality control is necessary in order to ensure an acceptable level of quality and to define exclusion criteria of images (i.e., determine participants with most severe artifacts). However, information about the quality control workflow and image exclusion procedure is largely lacking in the current literature and the existing rating systems differ. Here, we propose a stringent workflow of quality control steps during and after acquisition of T1-weighted images, which enables researchers dealing with populations that are typically affected by motion artifacts to enhance data quality and maximize sample sizes. As an underlying aim we established a thorough quality control rating system for T1-weighted images and applied it to the analysis of developmental clinical data using the automated processing pipeline FreeSurfer. This hands-on workflow and quality control rating system will aid researchers in minimizing motion artifacts in the final data set, and therefore enhance the quality of structural magnetic resonance imaging studies. PMID:27999528

  11. Intelligent Processing Equipment Projects at DLA

    NASA Technical Reports Server (NTRS)

    Obrien, Donald F.

    1992-01-01

    The Defense Logistics Agency is successfully incorporating Intelligent Processing Equipment (IPE) into each of its Manufacturing Technology thrust areas. Several IPE applications are addressed in the manufacturing of two 'soldier support' items: combat rations and military apparel. In combat rations, in-line sensors for food processing are being developed or modified from other industries. In addition, many process controls are being automated to achieve better quality and to gain higher use (soldier) acceptance. IPE applications in military apparel include: in-process quality controls for identification of sewing defects, use of robots in the manufacture of shirt collars, and automated handling of garments for pressing.

  12. Intelligent processing equipment projects at DLA

    NASA Astrophysics Data System (ADS)

    Obrien, Donald F.

    1992-04-01

    The Defense Logistics Agency is successfully incorporating Intelligent Processing Equipment (IPE) into each of its Manufacturing Technology thrust areas. Several IPE applications are addressed in the manufacturing of two 'soldier support' items: combat rations and military apparel. In combat rations, in-line sensors for food processing are being developed or modified from other industries. In addition, many process controls are being automated to achieve better quality and to gain higher use (soldier) acceptance. IPE applications in military apparel include: in-process quality controls for identification of sewing defects, use of robots in the manufacture of shirt collars, and automated handling of garments for pressing.

  13. Command and Control Common Semantic Core Required to Enable Net-centric Operations

    DTIC Science & Technology

    2008-05-20

    ... automated processing capability. A former US Marine Corps component C4 director during Operation Iraqi Freedom identified the problems of 1) uncertainty ... interoperability improvements to warfighter community processes, thanks to ubiquitous automated processing, are likely high and somewhat easier to quantify. A ... synchronized with the actions of other partners / warfare communities. This requires high-quality information, rapid sharing and automated processing – which

  14. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    PubMed

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  15. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed. PMID:18925018

  16. Managing laboratory automation.

    PubMed

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  17. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    PubMed

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-02-22

    The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or are incomplete. Despite the large number of transcriptome assemblies that have been performed, quality control of the transcript-building process, particularly at the protein level, is rarely if ever performed. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high-quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome-based assembly, as well as EST clustering. After each step a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably. The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set on both the DNA and protein levels, and resulted in an improvement of 20% versus any of the read-based approaches alone. To the best of our knowledge, this is the first time that an automated transcript definition has been subjected to quality control using manually defined and curated genes and the process thereafter improved. We recommend using a set of manually curated genes to troubleshoot transcriptome reconstruction.

  18. Honeywell Technical Order Transfer Tests.

    DTIC Science & Technology

    1987-06-12

    ... of simple corrections, a reasonable reproduction of the original could be generated. The quality was not good enough for a production environment. Lack of automated quality control (AQC) tools could account for the errors.

  19. Quality control in urinalysis.

    PubMed

    Takubo, T; Tatsumi, N

    1999-01-01

    Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, the Japanese Association of Medical Technologists, the Osaka Medical Association, and manufacturers. A QC survey in urinalysis of synthetic urine using a reagent strip and instrument made by the same manufacturer, and using an automated urine cell analyser, provided satisfactory results among laboratories. A QC survey in urinalysis of synthetic urine using reagent strips and instruments made by various manufacturers indicated differences in the determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments have different characteristics. A QC photo survey in urinalysis based on microscopic photos of urine sediment constituents indicated differences in the identification of cells among laboratories. From these results, it is necessary to standardize the reagent strip method, manual and automated methods, and synthetic urine.

  20. Automation and Intensity Modulated Radiation Therapy for Individualized High-Quality Tangent Breast Treatment Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purdie, Thomas G., E-mail: Tom.Purdie@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Techna Institute, University Health Network, Toronto, Ontario

    Purpose: To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Methods and Materials: Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Results: Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Conclusions: Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use.

  1. Automation and intensity modulated radiation therapy for individualized high-quality tangent breast treatment plans.

    PubMed

    Purdie, Thomas G; Dinniwell, Robert E; Fyles, Anthony; Sharpe, Michael B

    2014-11-01

    To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Automated Assessment of Visual Quality of Digital Video

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Ellis, Stephen R. (Technical Monitor)

    1997-01-01

    The advent of widespread distribution of digital video creates a need for automated methods for evaluating visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images[1-4]. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. The challenge of video quality metrics is to extend these simplified models to temporal signals as well. In this presentation I will discuss a number of the issues that must be resolved in the design of effective video quality metrics. Among these are spatial, temporal, and chromatic sensitivity and their interactions, visual masking, and implementation complexity. I will also touch on the question of how to evaluate the performance of these metrics.

  3. Data quality can make or break a research infrastructure

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Gunter, D.; Chu, H.; Christianson, D. S.; Trotta, C.; Canfora, E.; Faybishenko, B.; Cheah, Y. W.; Beekwilder, N.; Chan, S.; Dengel, S.; Keenan, T. F.; O'Brien, F.; Elbashandy, A.; Poindexter, C.; Humphrey, M.; Papale, D.; Agarwal, D.

    2017-12-01

    Research infrastructures (RIs) commonly support observational data provided by multiple, independent sources. Uniformity in the data distributed by such RIs is important in most applications, e.g., in comparative studies using data from two or more sources. Achieving uniformity in terms of data quality is challenging, especially considering that many data issues are unpredictable and cannot be detected until the issue first occurs. As a result, many data quality control activities within RIs require a manual, human-in-the-loop element, making quality control an expensive activity. Our motivating example is the FLUXNET2015 dataset, a collection of ecosystem-level carbon, water, and energy fluxes between land and atmosphere from over 200 sites around the world, some with over 20 years of data. About 90% of the human effort to create the dataset was spent on data quality related activities. Based on this experience, we have been working on solutions to increase the automation of data quality control procedures. Since it is nearly impossible to fully automate all quality related checks, we have been drawing on techniques used in software development, which operates under a few of the same constraints. In both managing scientific data and writing software, human time is a precious resource; code bases, like science datasets, can be large, complex, and full of errors; and both scientific and software endeavors can be pursued by individuals, but collaborative teams can accomplish a lot more. The lucrative and fast-paced nature of the software industry fueled the creation of methods and tools to increase automation and productivity within these constraints. Issue tracking systems, methods for translating problems into automated tests, and powerful version control tools are a few examples. Terrestrial and aquatic ecosystems research relies heavily on many types of observational data. As the volume of data collection increases, ensuring data quality is becoming an unwieldy challenge for RIs, and business-as-usual approaches to data quality do not work at larger data volumes. We believe RIs can benefit greatly from adapting and imitating this body of theory and practice from software quality into data quality, enabling systematic and reproducible safeguards against errors and mistakes in datasets as much as in software.
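
    A minimal sketch of the "data-quality rules as automated tests" idea, using pandas; the column name, units, and limits are hypothetical and are not taken from the FLUXNET2015 pipeline.

      import pandas as pd

      def check_physical_range(df, col, lo, hi):
          """Fail, like a unit test, if any value falls outside plausible bounds."""
          bad = df[(df[col] < lo) | (df[col] > hi)]
          assert bad.empty, f"{len(bad)} rows of {col} outside [{lo}, {hi}]"

      def check_monotonic_timestamps(df):
          """Fail if the time axis is out of order."""
          assert df["timestamp"].is_monotonic_increasing, "timestamps out of order"

      if __name__ == "__main__":
          df = pd.DataFrame({
              "timestamp": pd.date_range("2015-01-01", periods=4, freq="30min"),
              "co2_flux": [2.1, 1.8, 2.4, 1.9],   # umol m-2 s-1, hypothetical
          })
          check_monotonic_timestamps(df)
          check_physical_range(df, "co2_flux", -50.0, 50.0)
          print("all data-quality checks passed")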

  4. Automated batch characterization of inkjet-printed elastomer lenses using a LEGO platform.

    PubMed

    Sung, Yu-Lung; Garan, Jacob; Nguyen, Hoang; Hu, Zhenyu; Shih, Wei-Chuan

    2017-09-10

    Small, self-adhesive, inkjet-printed elastomer lenses have enabled smartphone cameras to image and resolve microscopic objects. However, the performance of different lenses within a batch is affected by hard-to-control environmental variables. We present a cost-effective platform to perform automated batch characterization of 300 lens units simultaneously for quality inspection. The system was designed and configured with LEGO bricks, 3D printed parts, and a digital camera. The scheme presented here may become the basis of a high-throughput, in-line inspection tool for quality control purposes and can also be employed for optimization of the manufacturing process.

  5. Contributions of CCLM to advances in quality control.

    PubMed

    Kazmierczak, Steven C

    2013-01-01

    The discipline of laboratory medicine is relatively young when considered in the context of the history of medicine itself. Quality control, within the context of laboratory medicine, likewise has a relatively brief but rich history. Laboratory quality control continues to evolve along with advances in automation, measurement techniques and information technology. Clinical Chemistry and Laboratory Medicine (CCLM) has played a key role in helping disseminate information about the proper use and utility of quality control. Publication of important advances in quality control techniques and dissemination of guidelines concerned with laboratory quality control has undoubtedly helped readers of this journal keep up to date on the most recent developments in this field.

  6. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
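
    As one concrete example of the verification items listed above, carryover can be estimated from three high-sample replicates followed by three low-sample replicates. The sketch below uses a commonly applied formula of this kind; it is an illustration, not a protocol prescribed by this paper.

      def carryover_percent(h, l):
          """Carryover (%) = (L1 - L3) / (H3 - L3) * 100 for replicates h, l."""
          return 100.0 * (l[0] - l[2]) / (h[2] - l[2])

      # Hypothetical WBC counts (10^9/L): three high runs, then three low runs.
      print(carryover_percent([9.8, 9.9, 10.0], [0.14, 0.11, 0.10]))   # ~0.4 %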

  7. Domain specific software architectures: Command and control

    NASA Technical Reports Server (NTRS)

    Braun, Christine; Hatch, William; Ruegsegger, Theodore; Balzer, Bob; Feather, Martin; Goldman, Neil; Wile, Dave

    1992-01-01

    GTE is the Command and Control contractor for the Domain Specific Software Architectures program. The objective of this program is to develop and demonstrate an architecture-driven, component-based capability for the automated generation of command and control (C2) applications. Such a capability will significantly reduce the cost of C2 applications development and will lead to improved system quality and reliability through the use of proven architectures and components. A major focus of GTE's approach is the automated generation of application components in particular subdomains. Our initial work in this area has concentrated in the message handling subdomain; we have defined and prototyped an approach that can automate one of the most software-intensive parts of C2 systems development. This paper provides an overview of the GTE team's DSSA approach and then presents our work on automated support for message processing.

  8. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step-change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of the American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215
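
    The overall idea — fit a simple discrete model from step-change data, then invert it to drive a CQA toward its set point — can be sketched as follows (a minimal illustration, not the authors' code; the first-order model structure, gains, and noise level are invented for the example):

        # Sketch: system identification of a first-order ARX model from step-change
        # data, then a one-step-ahead predictive controller built on that model.
        # Dynamics, numbers, and noise are invented for illustration.
        import numpy as np

        rng = np.random.default_rng(0)

        # --- simulate serialized step changes in galactose feed u, response y (%gal)
        a_true, b_true = 0.90, 0.50
        u = np.repeat([0.0, 2.0, 1.0, 3.0], 25)         # step-change experiment
        y = np.zeros(len(u))
        for k in range(len(u) - 1):
            y[k + 1] = a_true * y[k] + b_true * u[k] + rng.normal(0, 0.05)

        # --- identify y[k+1] = a*y[k] + b*u[k] by least squares
        X = np.column_stack([y[:-1], u[:-1]])
        a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
        print(f"identified a={a_hat:.3f}, b={b_hat:.3f}")

        # --- one-step-ahead predictive control toward a %galactosylation set point
        def control_move(y_now, y_sp):
            """Choose feed u so the model predicts y_sp at the next step."""
            return max((y_sp - a_hat * y_now) / b_hat, 0.0)   # feed cannot be negative

        y_k, y_sp = 0.0, 8.0
        for _ in range(10):
            u_k = control_move(y_k, y_sp)
            y_k = a_true * y_k + b_true * u_k               # "plant" response
        print(f"final %galactosylation = {y_k:.2f} (set point {y_sp})")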

  9. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Bonnie; Boddy, Mark; Doyle, Frank

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  10. Enhancement of the Automated Quality Control Procedures for the International Soil Moisture Network

    NASA Astrophysics Data System (ADS)

    Heer, Elsa; Xaver, Angelika; Dorigo, Wouter; Messner, Romina

    2017-04-01

    In-situ soil moisture observations are still trusted to be the most reliable data for validating remotely sensed soil moisture products. Thus, the quality of in-situ soil moisture observations is of high importance. The International Soil Moisture Network (ISMN; http://ismn.geo.tuwien.ac.at/) provides in-situ soil moisture data from all around the world. The data is collected from individual networks and data providers, measured by different sensors at various depths. The data sets, which are delivered in different units, time zones and data formats, are then transformed into homogeneous data sets. Erroneous behavior in soil moisture data is very difficult to detect, due to annual and daily changes and, most significantly, the strong influence of precipitation and snow-melting processes. Only a few of the network providers have a quality assessment for their data sets. Therefore, advanced quality control procedures have been developed for the ISMN (Dorigo et al. 2013). Three categories of quality checks were introduced: exceeding boundary values, geophysical consistency checks, and a spectrum-based approach. The spectrum-based quality control algorithms aim to detect erroneous measurements which occur within plausible geophysical ranges, e.g., a sudden drop in soil moisture caused by a sensor malfunction. By defining several conditions which have to be met by the original soil moisture time series and its first and second derivatives, such error types can be detected. Since the development of these sophisticated methods, many more data providers have shared their data with the ISMN and new types of erroneous measurements have been identified. Thus, an enhancement of the automated quality control procedures became necessary. In the present work, we introduce enhancements of the existing quality control algorithms. Additionally, six completely new quality checks have been developed, e.g., detection of suspicious values before or after NaN values, of constant values, and of values lying within a span where the large majority of values before and after is flagged, so that a sensor malfunction is certain. To evaluate the enhanced automated quality control system, many test data sets were selected and manually validated for comparison against both the existing quality control procedures and the new algorithms. The improvements shown assure an appropriate assessment of the ISMN data sets, which are used to validate satellite-retrieved soil moisture and are the foundation of many other scientific publications.
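
    Two of the check types described above can be illustrated with a simplified sketch (the conditions and thresholds are placeholders for illustration, not the operational ISMN values):

        # Simplified illustration of spectrum-based checks: flag sudden drops using
        # first/second differences, and flag implausibly long constant runs.
        # Thresholds are placeholders, not the operational ISMN values.
        import numpy as np

        def flag_sudden_drops(sm, drop=0.05):
            """Flag index k where soil moisture falls sharply: large negative
            first difference combined with strong curvature."""
            d1 = np.diff(sm)                 # first derivative (per time step)
            d2 = np.diff(sm, n=2)            # second derivative
            flags = set()
            for k in range(1, len(sm) - 1):
                if d1[k - 1] < -drop and abs(d2[k - 1]) > drop:
                    flags.add(k)
            return flags

        def flag_constant_runs(sm, min_len=12):
            """Flag values inside long runs of identical readings (stuck sensor)."""
            flags, start = set(), 0
            for k in range(1, len(sm) + 1):
                if k == len(sm) or sm[k] != sm[start]:
                    if k - start >= min_len:
                        flags.update(range(start, k))
                    start = k
            return flags

        sm = np.array([0.31, 0.30, 0.30, 0.12, 0.29, 0.30] + [0.25] * 15)
        print(sorted(flag_sudden_drops(sm)), len(flag_constant_runs(sm)))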

  11. Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts

    USGS Publications Warehouse

    Parker, Gene W.; Pinson, Harlow

    1993-01-01

    A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.

  12. First evaluation of automated specimen inoculation for wound swab samples by use of the Previ Isola system compared to manual inoculation in a routine laboratory: finding a cost-effective and accurate approach.

    PubMed

    Mischnik, Alexander; Mieth, Markus; Busch, Cornelius J; Hofer, Stefan; Zimmermann, Stefan

    2012-08-01

    Automation of plate streaking is ongoing in clinical microbiological laboratories, but its evaluation for routine use remains largely open. In the present study, the recovery of microorganisms from polyurethane (PU) swab samples plated by the Previ Isola system is compared to that from manually plated control viscose swab samples from wounds, according to CLSI procedure M40-A (quality control of microbiological transport systems). One hundred twelve paired samples (224 swabs) were analyzed. In 80/112 samples (71%), concordant culture results were obtained with the two methods. In 32/112 samples (29%), CFU recovery of microorganisms from the two methods was discordant. In 24 (75%) of the 32 paired samples with a discordant result, Previ Isola plated PU swabs were superior. In 8 (25%) of the 32 paired samples with a discordant result, control viscose swabs were superior. The quality of colony growth on culture media for further investigations was superior with Previ Isola inoculated plates compared to manual plating techniques. Gram stain results were concordant between the two methods in 62/112 samples (55%). In 50/112 samples (45%), the results of Gram staining were discordant between the two methods. In 34 (68%) of the 50 paired samples with discordant results, Gram staining of PU swabs was superior to that of control viscose swabs. In 16 (32%) of the 50 paired samples, Gram staining of control viscose swabs was superior to that of PU swabs. We report the first clinical evaluation of Previ Isola automated specimen inoculation for wound swab samples. This study suggests that use of an automated specimen inoculation system yields good results with regard to CFU recovery, quality of Gram staining, and accuracy of diagnosis.

  13. First Evaluation of Automated Specimen Inoculation for Wound Swab Samples by Use of the Previ Isola System Compared to Manual Inoculation in a Routine Laboratory: Finding a Cost-Effective and Accurate Approach

    PubMed Central

    Mieth, Markus; Busch, Cornelius J.; Hofer, Stefan; Zimmermann, Stefan

    2012-01-01

    Automation of plate streaking is ongoing in clinical microbiological laboratories, but its evaluation for routine use remains largely open. In the present study, the recovery of microorganisms from polyurethane (PU) swab samples plated by the Previ Isola system is compared to that from manually plated control viscose swab samples from wounds, according to CLSI procedure M40-A (quality control of microbiological transport systems). One hundred twelve paired samples (224 swabs) were analyzed. In 80/112 samples (71%), concordant culture results were obtained with the two methods. In 32/112 samples (29%), CFU recovery of microorganisms from the two methods was discordant. In 24 (75%) of the 32 paired samples with a discordant result, Previ Isola plated PU swabs were superior. In 8 (25%) of the 32 paired samples with a discordant result, control viscose swabs were superior. The quality of colony growth on culture media for further investigations was superior with Previ Isola inoculated plates compared to manual plating techniques. Gram stain results were concordant between the two methods in 62/112 samples (55%). In 50/112 samples (45%), the results of Gram staining were discordant between the two methods. In 34 (68%) of the 50 paired samples with discordant results, Gram staining of PU swabs was superior to that of control viscose swabs. In 16 (32%) of the 50 paired samples, Gram staining of control viscose swabs was superior to that of PU swabs. We report the first clinical evaluation of Previ Isola automated specimen inoculation for wound swab samples. This study suggests that use of an automated specimen inoculation system yields good results with regard to CFU recovery, quality of Gram staining, and accuracy of diagnosis. PMID:22692745

  14. Automated surface quality inspection with ARGOS: a case study

    NASA Astrophysics Data System (ADS)

    Kiefhaber, Daniel; Etzold, Fabian; Warken, Arno F.; Asfour, Jean-Michel

    2017-06-01

    The commercial availability of automated inspection systems for optical surfaces specified according to ISO 10110-7 promises unsupervised, automated quality control with reproducible results. In this study, the classification results of the ARGOS inspection system are compared to the decisions of well-trained inspectors based on manual-visual inspection. The two are found to agree in 93.6% of the studied cases. Exemplary cases with differing results are studied and shown to be partly caused by shortcomings of the ISO 10110-7 standard, which was written for industry-standard manual-visual inspection. Applying it to high-resolution whole-surface images from objective machine-vision systems raises several challenges, which are discussed.

  15. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    PubMed

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
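
    The role of automated unit tests can be illustrated with a minimal, hypothetical example (the QC function and its limits are invented for illustration, not taken from the application described in the paper):

        # Minimal illustration of unit-testing a QC calculation (hypothetical
        # function and limits; not the application described in the paper).
        # Run with: python -m pytest this_file.py
        def ion_ratio_ok(quantifier_area, qualifier_area, expected_ratio, tol=0.20):
            """QC check: qualifier/quantifier ion ratio within +/- tol of expected."""
            if quantifier_area <= 0:
                raise ValueError("quantifier peak area must be positive")
            ratio = qualifier_area / quantifier_area
            return abs(ratio - expected_ratio) <= tol * expected_ratio

        def test_ratio_within_tolerance():
            assert ion_ratio_ok(1000.0, 510.0, expected_ratio=0.5)

        def test_ratio_out_of_tolerance():
            assert not ion_ratio_ok(1000.0, 700.0, expected_ratio=0.5)

        def test_rejects_bad_peak():
            import pytest
            with pytest.raises(ValueError):
                ion_ratio_ok(0.0, 100.0, expected_ratio=0.5)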

  16. Automating PACS quality control with the Vanderbilt image processing enterprise resource

    NASA Astrophysics Data System (ADS)

    Esparza, Michael L.; Welch, E. Brian; Landman, Bennett A.

    2012-02-01

    Precise image acquisition is an integral part of modern patient care and medical imaging research. Periodic quality control using standardized protocols and phantoms ensures that scanners are operating according to specifications, yet such procedures do not ensure that individual datasets are free from corruption, for example due to patient motion, transient interference, or physiological variability. If unacceptable artifacts are noticed during scanning, a technologist can repeat a procedure. Yet substantial delays may be incurred if a problematic scan is not noticed until a radiologist reads the scans or an automated algorithm fails. Given the scores of slices in typical three-dimensional scans and the wide variety of potential use cases, a technologist cannot practically be expected to inspect all images. In large-scale research, automated pipeline systems have had great success in achieving high throughput. However, clinical and institutional workflows are largely based on DICOM and PACS technologies; these systems are not readily compatible with research systems due to security and privacy restrictions. Hence, quantitative quality control has been relegated to individual investigators and too often neglected. Herein, we propose a scalable system, the Vanderbilt Image Processing Enterprise Resource (VIPER), to integrate modular quality control and image analysis routines with a standard PACS configuration. This server unifies image processing routines across an institutional level and provides a simple interface so that investigators can collaborate to deploy new analysis technologies. VIPER integrates with high-performance computing environments and has successfully analyzed all standard scans from our institutional research center over the course of the last 18 months.

  17. Automated update, revision, and quality control of the maize genome annotations using MAKER-P improves the B73 RefGen_v3 gene models and identifies new genes

    USDA-ARS?s Scientific Manuscript database

    The large size and relative complexity of many plant genomes make creation, quality control, and dissemination of high-quality gene structure annotations challenging. In response, we have developed MAKER-P, a fast and easy-to-use genome annotation engine for plants. Here, we report the use of MAKER-...

  18. Flexible automated approach for quantitative liquid handling of complex biological samples.

    PubMed

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation; this included the preparation of standards and controls from a work list generated by a Watson laboratory information management system, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs. manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  19. Automated Microflow NMR: Routine Analysis of Five-Microliter Samples

    PubMed Central

    Jansma, Ariane; Chuan, Tiffany; Geierstanger, Bernhard H.; Albrecht, Robert W.; Olson, Dean L.; Peck, Timothy L.

    2006-01-01

    A microflow CapNMR probe double-tuned for 1H and 13C was installed on a 400-MHz NMR spectrometer and interfaced to an automated liquid handler. Individual samples dissolved in DMSO-d6 are submitted for NMR analysis in vials containing as little as 10 μL of sample. Sets of samples are submitted in a low-volume 384-well plate. Of the 10 μL of sample per well, as with vials, 5 μL is injected into the microflow NMR probe for analysis. For quality control of chemical libraries, 1D NMR spectra are acquired under full automation from 384-well plates on as many as 130 compounds within 24 h using 128 scans per spectrum and a sample-to-sample cycle time of ∼11 min. Because of the low volume requirements and high mass sensitivity of the microflow NMR system, 30 nmol of a typical small molecule is sufficient to obtain high-quality, well-resolved, 1D proton or 2D COSY NMR spectra in ∼6 or 20 min of data acquisition time per experiment, respectively. Implementation of pulse programs with automated solvent peak identification and suppression allow for reliable data collection, even for samples submitted in fully protonated DMSO. The automated microflow NMR system is controlled and monitored using web-based software. PMID:16194121

  20. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    PubMed

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

    Automation systems are well established in industry and in life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in analytical measurement for chemical synthesis, quality control, and the medical and pharmaceutical fields, as well as in research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be analyzed regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  1. Advanced automation in space shuttle mission control

    NASA Technical Reports Server (NTRS)

    Heindel, Troy A.; Rasmussen, Arthur N.; Mcfarland, Robert Z.

    1991-01-01

    The Real Time Data System (RTDS) Project was undertaken in 1987 to introduce new concepts and technologies for advanced automation into the Mission Control Center environment at NASA's Johnson Space Center. The project's emphasis is on producing advanced near-operational prototype systems that are developed using a rapid, interactive method and are used by flight controllers during actual Shuttle missions. In most cases the prototype applications have been of such quality and utility that they have been converted to production status. A key ingredient has been an integrated team of software engineers and flight controllers working together to quickly evolve the demonstration systems.

  2. Piloted Evaluation of the H-Mode, a Variable Autonomy Control System, in Motion-Based Simulation

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Schutte, Paul C.; Williams, Ralph A.

    2008-01-01

    As aircraft become able to autonomously respond to a range of situations with performance surpassing human operators, we are compelled to look for new methods that help us understand their use and guide the design of new, more effective forms of automation and interaction. The "H-mode" is one such method and is based on the metaphor of a well-trained horse. The concept allows the pilot to manage a broad range of control automation functionality, from augmented manual control to FMS-like coupling and automation-initiated actions, using a common interface system and an easily learned set of interaction skills. The interface leverages familiar manual control interfaces (e.g., the control stick) and flight displays through the addition of contextually dependent haptic-multimodal elements. The concept is relevant to manned and remotely piloted vehicles. This paper provides an overview of the H-mode concept followed by a presentation of the results from a recent evaluation conducted in a motion-based simulator. The evaluation focused on assessing the overall usability and flying qualities of the concept, with an emphasis on the effects of turbulence and cockpit motion. Because the H-mode results in interactions between traditional flying qualities and management of higher-level flight path automation, these effects are of particular interest. The results indicate that the concept may provide a useful complement or replacement to conventional interfaces, and retains its usefulness in the presence of turbulence and motion.

  3. Robotic and automatic welding development at the Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Jones, C. S.; Jackson, M. E.; Flanigan, L. A.

    1988-01-01

    Welding automation is the key to two major development programs to improve quality and reduce the cost of manufacturing space hardware currently undertaken by the Materials and Processes Laboratory of the NASA Marshall Space Flight Center. Variable polarity plasma arc welding has demonstrated its effectiveness on class 1 aluminum welding in external tank production. More than three miles of welds were completed without an internal defect. Much of this success can be credited to automation developments which stabilize the process. Robotic manipulation technology is under development for automation of welds on the Space Shuttle's main engines utilizing pathfinder systems in development of tooling and sensors for the production applications. The overall approach to welding automation development undertaken is outlined. Advanced sensors and control systems methodologies are described that combine to make aerospace quality welds with a minimum of dependence on operator skill.

  4. Conceptual design of an aircraft automated coating removal system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, J.E.; Draper, J.V.; Pin, F.G.

    1996-05-01

    Paint stripping of the U.S. Air Force's large transport aircraft is currently a labor-intensive, manual process. Significant reductions in costs, personnel and turnaround time can be accomplished by the judicious use of automation in some process tasks. This paper presents the conceptual design of a coating removal system for the tail surfaces of the C-5 plane. Emphasis is placed on technology selection to optimize human-automation synergy with respect to overall costs, throughput, quality, safety, and reliability. Trade-offs between field-proven and research-requiring technologies, and between expected gain and cost and complexity, have led to a conceptual design which is semi-autonomous (relying on the human for task specification and disturbance handling) yet incorporates sensor-based automation (for sweep path generation and tracking, surface following, stripping quality control and tape/breach handling).

  5. Automatic structured grid generation using Gridgen (some restrictions apply)

    NASA Technical Reports Server (NTRS)

    Chawner, John R.; Steinbrenner, John P.

    1995-01-01

    The authors have noticed in the recent grid generation literature an emphasis on the automation of structured grid generation. The motivation behind such work is clear; grid generation is easily the most despised task in the grid-analyze-visualize triad of computational analysis (CA). However, because grid generation is closely coupled to both the design and analysis software and because quantitative measures of grid quality are lacking, 'push button' grid generation usually results in a compromise between speed, control, and quality. Overt emphasis on automation obscures the substantive issues of providing users with flexible tools for generating and modifying high quality grids in a design environment. In support of this paper's tongue-in-cheek title, many features of the Gridgen software are described. Gridgen is by no stretch of the imagination an automatic grid generator. Despite this fact, the code does utilize many automation techniques that permit interesting regenerative features.

  6. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    PubMed

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step-change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of the American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647-1661, 2017.

  7. Flexible optical metrology strategies for the control and quality assurance of small series production

    NASA Astrophysics Data System (ADS)

    Schmitt, R.; Pavim, A.

    2009-06-01

    The demand for smaller and more flexible production series with considerable product diversity complicates the control of manufacturing tasks, leading to major challenges for quality assurance systems. The quality assurance strategy used today for mass production cannot cope with the inspection flexibility needed in automated small-series production, because its measuring strategy depends entirely on the fixed features of the few manufactured object variants and on process parameters that can be controlled or compensated during production. The major challenge faced by a quality assurance system applied to small-series production facilities is to guarantee the needed quality level already at the first run; the quality assurance system therefore has to adapt itself constantly to new manufacturing conditions. The small-series production culture requires a change of paradigms, because its strategies are totally different from those of mass production. This work discusses the tight inspection requirements of small-series production and presents flexible metrology strategies based on optical sensor data fusion techniques, agent-based systems, and cognitive, self-optimizing systems for assuring the needed quality level of flexible small series. Application scenarios include the automated assembly of solid-state lasers and the flexible inspection of automotive headlights.

  8. Quality control in urodynamics and the role of software support in the QC procedure.

    PubMed

    Hogan, S; Jarvis, P; Gammie, A; Abrams, P

    2011-11-01

    This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.

  9. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic stations operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
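
    The baseline idea can be roughly sketched as follows (a simplification with synthetic data, not the PQLX implementation): per-frequency percentiles of many power spectral density estimates form the station's baseline band, and a new window is flagged when its PSD leaves that band.

        # Rough sketch of station noise baselines (simplified; synthetic data):
        # per-frequency percentiles of many PSD estimates form the baseline, and
        # a new window is flagged when its PSD leaves the baseline band.
        import numpy as np
        from scipy.signal import welch

        rng = np.random.default_rng(1)
        fs = 40.0                                   # sample rate (Hz), assumed

        def psd_db(x):
            f, p = welch(x, fs=fs, nperseg=1024)
            return f, 10 * np.log10(p + 1e-20)      # power in dB

        # Build the baseline from many "quiet" windows of station noise.
        psds = []
        for _ in range(200):
            f, p = psd_db(rng.normal(0, 1.0, 4096))
            psds.append(p)
        psds = np.array(psds)
        lo, hi = np.percentile(psds, [5, 95], axis=0)   # baseline band per frequency

        # Assess a new window: fraction of frequencies outside the baseline band.
        f, p_new = psd_db(rng.normal(0, 3.0, 4096))     # noisier than nominal
        outside = np.mean((p_new < lo) | (p_new > hi))
        print(f"{100 * outside:.0f}% of frequencies out of baseline -> "
              f"{'flag station' if outside > 0.2 else 'nominal'}")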

  10. Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.

    PubMed

    Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W

    2014-02-01

    The Sterile Insect Technique (SIT) requires vast numbers of consistently high-quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. Here we present a potential new QC assay for the mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT: locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs), which simply detect how often a fly passes an infrared sensor in a glass tube, might provide similar insight into fly quality but with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.

  11. Information systems as a quality management tool in clinical laboratories

    NASA Astrophysics Data System (ADS)

    Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta

    2007-11-01

    This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
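
    As an illustration of the multirule logic such a system automates, here is a compact sketch of three common Westgard rules applied to control results expressed as z-scores (the rule definitions follow Westgard's published multirules; the implementation itself is illustrative):

        # Compact sketch of three common Westgard multirules applied to control
        # results expressed as z-scores ((result - mean) / SD). Simplified.
        def westgard_violations(z):
            v = []
            if any(abs(x) > 3 for x in z):
                v.append("1_3s: one control beyond +/-3 SD")
            if any(z[i] > 2 and z[i+1] > 2 or z[i] < -2 and z[i+1] < -2
                   for i in range(len(z) - 1)):
                v.append("2_2s: two consecutive beyond 2 SD, same side")
            if any(len(set(1 if x > 0 else -1 for x in z[i:i+10])) == 1
                   for i in range(len(z) - 9)):
                v.append("10_x: ten consecutive on the same side of the mean")
            return v

        # Levey-Jennings data for one control material, as z-scores:
        z = [0.4, -1.1, 2.3, 2.6, 0.2, -0.3, 1.0, -3.2, 0.5, 0.1]
        for rule in westgard_violations(z):
            print("REJECT RUN:", rule)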

  12. Manufacturing Laboratory for Next Generation Engineers

    DTIC Science & Technology

    2013-12-16

    automated CNC machines, rapid prototype systems, robotic assembly systems, metrology, and non-traditional systems such as a waterjet cutter, EDM machine, plasma... Metrology and Quality Control Equipment - This area already had a CMM and other well-known quality control instrumentation. It has been enhanced

  13. Printing quality control automation

    NASA Astrophysics Data System (ADS)

    Trapeznikova, O. V.

    2018-04-01

    One of the most important problems in standardizing the offset printing process is controlling the quality of printing and automating that control. To solve the problem, software has been developed that takes into account the specifics of the printing system components and their behavior during the printing process. To characterize the distribution of the ink layer on the printed substrate, the deviation of the ink layer thickness on the sheet from a nominal surface is proposed. Constructing surface projections of the color gamut bodies from the geometric data makes it possible to visualize the color reproduction gamut of printing systems across brightness ranges and specific color sectors, which provides a qualitative comparison of systems by their reproduction of individual colors over varying ranges of brightness.

  14. Report: Fiscal 2004 and 2003 Financial Statements for the Pesticides Reregistration and Expedited Processing Fund

    EPA Pesticide Factsheets

    Report #2005-1-00081, May 4, 2005. We identified the following reportable conditions: We could not assess the adequacy of automated controls. EPA needs to improve financial statement preparation and quality control.

  15. Recent trends in laboratory automation in the pharmaceutical industry.

    PubMed

    Rutherford, M L; Stinger, T

    2001-05-01

    The impact of robotics and automation on the pharmaceutical industry over the last two decades has been significant. In the last ten years, the emphasis of laboratory automation has shifted from the support of manufactured products and quality-control laboratory applications to research and development. This shift has been the direct result of an increased emphasis on the identification, development and eventual marketing of innovative new products. In this article, we briefly identify and discuss some of the current trends in laboratory automation in the pharmaceutical industry as they apply to research and development, including screening, sample management, combinatorial chemistry, ADME/Tox and pharmacokinetics.

  16. Quantity is nothing without quality: automated QA/QC for streaming sensor networks

    Treesearch

    John L. Campbell; Lindsey E. Rustad; John H. Porter; Jeffrey R. Taylor; Ethan W. Dereszynski; James B. Shanley; Corinna Gries; Donald L. Henshaw; Mary E. Martin; Wade. M. Sheldon; Emery R. Boose

    2013-01-01

    Sensor networks are revolutionizing environmental monitoring by producing massive quantities of data that are being made publically available in near real time. These data streams pose a challenge for ecologists because traditional approaches to quality assurance and quality control are no longer practical when confronted with the size of these data sets and the...

  17. Combining Image and Non-Image Data for Automatic Detection of Retina Disease in a Telemedicine Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aykac, Deniz; Chaum, Edward; Fox, Karen

    A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion/anomaly detection is a low-cost way of achieving broad-based screening for diabetic retinopathy (DR) and other eye diseases. In the process of a routine eye-screening examination, other non-image data is often available which may be useful in automated diagnosis of disease. In this work, we report on the results of combining this non-image data with image data, using the protocol and processing steps of a prototype system for automated disease diagnosis of retina examinations from a telemedicine network. The system includes quality assessments, automated physiology detection, and automated lesion detection to create an archive of known cases. Non-image data such as diabetes onset date and hemoglobin A1c (HgA1c) for each patient examination are included as well, and the system is used to create a content-based image retrieval engine capable of automated diagnosis of disease into 'normal' and 'abnormal' categories. The system achieves a sensitivity and specificity of 91.2% and 71.6% using hold-one-out validation testing.

  18. Pulse-Flow Microencapsulation System

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R.

    2006-01-01

    The pulse-flow microencapsulation system (PFMS) is an automated system that continuously produces a stream of liquid-filled microcapsules for delivery of therapeutic agents to target tissues. Prior microencapsulation systems have relied on batch processes that involve transfer of batches between different apparatuses for different stages of production followed by sampling for acquisition of quality-control data, including measurements of size. In contrast, the PFMS is a single, microprocessor-controlled system that performs all processing steps, including acquisition of quality-control data. The quality-control data can be used as real-time feedback to ensure the production of large quantities of uniform microcapsules.

  19. Automated Quality Control of in Situ Soil Moisture from the North American Soil Moisture Database Using NLDAS-2 Products

    NASA Astrophysics Data System (ADS)

    Ek, M. B.; Xia, Y.; Ford, T.; Wu, Y.; Quiring, S. M.

    2015-12-01

    The North American Soil Moisture Database (NASMD) was initiated in 2011 to provide support for developing climate forecasting tools, calibrating land surface models and validating satellite-derived soil moisture algorithms. The NASMD has collected data from over 30 soil moisture observation networks providing millions of in situ soil moisture observations in all 50 states as well as Canada and Mexico. It is recognized that the quality of measured soil moisture in NASMD is highly variable due to the diversity of climatological conditions, land cover, soil texture, and topographies of the stations and differences in measurement devices (e.g., sensors) and installation. It is also recognized that error, inaccuracy and imprecision in the data set can have significant impacts on practical operations and scientific studies. Therefore, developing an appropriate quality control procedure is essential to ensure the data is of the best quality. In this study, an automated quality control approach is developed using the North American Land Data Assimilation System phase 2 (NLDAS-2) Noah soil porosity, soil temperature, and fraction of liquid and total soil moisture to flag erroneous and/or spurious measurements. Overall results show that this approach is able to flag unreasonable values when the soil is partially frozen. A validation example using NLDAS-2 multiple model soil moisture products at the 20 cm soil layer showed that the quality control procedure had a significant positive impact in Alabama, North Carolina, and West Texas. It had a greater impact in colder regions, particularly during spring and autumn. Over 433 NASMD stations have been quality controlled using the methodology proposed in this study, and the algorithm will be implemented to control data quality from the other ~1,200 NASMD stations in the near future.
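
    The core of such a flagging step can be sketched as follows (a simplification with assumed field names and thresholds; the actual procedure uses the NLDAS-2 Noah porosity, soil temperature, and liquid/total moisture fractions):

        # Simplified sketch of the flagging logic: an in situ soil moisture value is
        # flagged when the modeled soil is (partially) frozen or the value exceeds
        # the model's porosity. Field names and thresholds are assumptions.
        def qc_flags(obs_sm, nldas_soil_temp_c, nldas_porosity,
                     nldas_liquid_frac, frozen_frac=0.95):
            """Return a list of QC flags for one in situ observation."""
            flags = []
            if not 0.0 <= obs_sm <= nldas_porosity:
                flags.append("R: outside 0..porosity range")
            if nldas_soil_temp_c <= 0.0:
                flags.append("T: modeled soil temperature at/below freezing")
            if nldas_liquid_frac < frozen_frac:
                flags.append("F: modeled soil partially frozen")
            return flags or ["G: good"]

        # One winter observation at the 20 cm layer (illustrative values):
        print(qc_flags(obs_sm=0.38, nldas_soil_temp_c=-1.2,
                       nldas_porosity=0.45, nldas_liquid_frac=0.60))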

  20. Applications of optical sensing for laser cutting and drilling.

    PubMed

    Fox, Mahlen D T; French, Paul; Peters, Chris; Hand, Duncan P; Jones, Julian D C

    2002-08-20

    Any reliable automated production system must include process control and monitoring techniques. Two laser processing techniques potentially lending themselves to automation are percussion drilling and cutting. For drilling we investigate the performance of a modification of a nonintrusive optical focus control system we previously developed for laser welding, which exploits the chromatic aberrations of the processing optics to determine focal error. We further developed this focus control system for closed-loop control of laser cutting. We show that an extension of the technique can detect deterioration in cut quality, and we describe practical trials carried out on different materials using both oxygen and nitrogen assist gas. We base our techniques on monitoring the light generated by the process, captured nonintrusively by the effector optics and processed remotely from the workpiece. We describe the relationship between the temporal and the chromatic modulation of the detected light and process quality and show how the information can be used as the basis of a process control system.

  1. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activity has centered on the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of rising direct and indirect labor costs is only the most obvious motivation. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product becomes a significantly larger part of the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. Advances and improvements in components, assemblies and systems such as microprocessors, microcomputers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost-effective and justifiable.

  2. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing

    PubMed Central

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-01-01

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economical and enhanced automated optical guidance system, based on optimization research of a light-emitting diode (LED) light target and five automated image-processing bore-path deviation algorithms. An LED target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system was applied in a hot water pipeline installation project, with accuracy controlled to within 2 mm over a 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system. PMID:29462855
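
    The deflection-detection step can be illustrated with a toy sketch (NumPy only; the paper's five algorithms are more elaborate): threshold the frame to isolate the LED target, take its intensity-weighted centroid, and report the pixel offset from the designed bore-path center.

        # Toy sketch of deflection detection: threshold a grayscale frame to isolate
        # the LED target, compute its intensity-weighted centroid, and report the
        # pixel offset from the designed bore-path center. Illustrative only.
        import numpy as np

        def led_deflection(frame, center, threshold=200):
            """Return (dx, dy) of the LED centroid relative to `center` in pixels."""
            mask = frame >= threshold                  # bright LED pixels
            if not mask.any():
                raise ValueError("LED target not found in frame")
            ys, xs = np.nonzero(mask)
            w = frame[ys, xs].astype(float)            # intensity-weighted centroid
            cx, cy = np.average(xs, weights=w), np.average(ys, weights=w)
            return cx - center[0], cy - center[1]

        frame = np.zeros((480, 640), dtype=np.uint8)   # synthetic dark frame
        frame[236:244, 330:338] = 255                  # bright LED blob, off-center
        dx, dy = led_deflection(frame, center=(320, 240))
        print(f"deflection: {dx:+.1f} px horizontal, {dy:+.1f} px vertical")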

  3. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    PubMed

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economical and enhanced automated optical guidance system, based on optimization research of a light-emitting diode (LED) light target and five automated image-processing bore-path deviation algorithms. An LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system was applied in a hot water pipeline installation project, with accuracy controlled to within 2 mm over a 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.

  4. Sunglint Detection for Unmanned and Automated Platforms

    PubMed Central

    Garaba, Shungudzemwoyo Pascal; Schulz, Jan; Wernand, Marcel Robert; Zielinski, Oliver

    2012-01-01

    We present an empirical quality control protocol for above-water radiometric sampling, focussing on identifying sunglint situations. Using hyperspectral radiometers, measurements were taken on an automated and unmanned seaborne platform in northwest European shelf seas. In parallel, a camera system was used to capture sea surface and sky images of the investigated points. The quality control consists of meteorological flags, to mask dusk, dawn, precipitation and low light conditions, utilizing incoming solar irradiance (ES) spectra. Using 629 of a total of 3,121 spectral measurements that passed the test conditions of the meteorological flagging, a new sunglint flag was developed. To detect sunglint conspicuous in the simultaneously available sea surface images, a sunglint image detection algorithm was developed and implemented. Applying this algorithm, two data sets were derived: one with sunglint (images containing detectable white pixels) and one without (images with minimal or no detectable white pixels). To identify the most effective sunglint flagging criteria we evaluated the spectral characteristics of these two data sets using water leaving radiance (LW) and remote sensing reflectance (RRS). Spectral conditions satisfying 'mean LW (700–950 nm) < 2 mW·m−2·nm−1·sr−1' or alternatively 'minimum RRS (700–950 nm) < 0.010 sr−1' mask most measurements affected by sunglint, providing an efficient empirical flagging of sunglint in automated quality control.
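
    In code, the final flagging criteria reduce to a simple test over the 700-950 nm range (a sketch under an assumed array layout and with invented example spectra; the thresholds are those quoted above):

        # Sketch of the sunglint flag over 700-950 nm, using the paper's thresholds
        # (mean LW < 2 mW m^-2 nm^-1 sr^-1, or min RRS < 0.010 sr^-1). Array layout
        # and example values are assumptions.
        import numpy as np

        def sunglint_free(wavelength_nm, lw, rrs):
            """True if the spectrum passes the empirical sunglint flag."""
            nir = (wavelength_nm >= 700) & (wavelength_nm <= 950)
            return lw[nir].mean() < 2.0 or rrs[nir].min() < 0.010

        wl = np.arange(400, 951, 10)
        lw = np.where(wl < 700, 5.0, 1.2)      # mW m^-2 nm^-1 sr^-1, toy spectrum
        rrs = np.where(wl < 700, 0.02, 0.004)  # sr^-1
        print("pass" if sunglint_free(wl, lw, rrs) else "flag: sunglint suspected")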

  5. A system-level approach to automation research

    NASA Technical Reports Server (NTRS)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  6. Driving photomask supplier quality through automation

    NASA Astrophysics Data System (ADS)

    Russell, Drew; Espenscheid, Andrew

    2007-10-01

    In 2005, Freescale Semiconductor's newly centralized mask data prep organization (MSO) initiated a project to develop an automated global quality validation system for photomasks delivered to Freescale Semiconductor fabs. The system handles Certificate of Conformance (CofC) quality metric collection, validation, reporting and an alert system for all photomasks shipped to Freescale fabs from all qualified global suppliers. The completed system automatically collects 30+ quality metrics for each photomask shipped. Other quality metrics are generated from the collected data and quality metric conformance is automatically validated to specifications or control limits with failure alerts emailed to fab photomask and mask data prep engineering. A quality data warehouse stores the data for future analysis, which is performed quarterly. The improved access to data provided by the system has improved Freescale engineers' ability to spot trends and opportunities for improvement with our suppliers' processes. This paper will review each phase of the project, current system capabilities and quality system benefits for both our photomask suppliers and Freescale.
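
    Generically, the validation step amounts to checking each collected metric against its specification or control limits and alerting on failures (a sketch with hypothetical metric names and limits; the alert is a stub standing in for the e-mail notification):

        # Generic sketch of CofC metric validation against spec limits, with an
        # alert stub. Metric names and limits are hypothetical placeholders.
        SPEC_LIMITS = {                      # metric: (lower, upper) control limits
            "registration_error_nm": (0.0, 12.0),
            "cd_mean_to_target_nm": (-4.0, 4.0),
            "defect_count": (0, 0),
        }

        def send_alert(mask_id, failures):       # stub standing in for e-mail
            print(f"ALERT for photomask {mask_id}:", "; ".join(failures))

        def validate_cofc(mask_id, metrics):
            failures = []
            for name, value in metrics.items():
                lo, hi = SPEC_LIMITS[name]
                if not lo <= value <= hi:
                    failures.append(f"{name}={value} outside [{lo}, {hi}]")
            if failures:
                send_alert(mask_id, failures)    # notify fab and MDP engineering
            return not failures

        validate_cofc("FSL-1234", {"registration_error_nm": 9.1,
                                   "cd_mean_to_target_nm": 5.2,
                                   "defect_count": 0})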

  7. A rule-based smart automated fertilization and irrigation systems

    NASA Astrophysics Data System (ADS)

    Yousif, Musab El-Rashid; Ghafar, Khairuddin; Zahari, Rahimi; Lim, Tiong Hoo

    2018-04-01

    Smart automation in industries has become very important as it can improve the reliability and efficiency of systems. The use of smart technologies in agriculture has increased over the years to control crop production and address food security. However, it is important to use proper irrigation systems to avoid water wastage and overfeeding of the plants. In this paper, a Smart Rule-based Automated Fertilization and Irrigation System is proposed and evaluated. We propose a rule-based decision-making algorithm to monitor and control the food supply to the plant and the soil quality. A built-in alert system is also used to update the farmer by text message. The system is developed and evaluated using real hardware.
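
    A minimal sketch of such a rule set (the thresholds, sensor names, and SMS stub are assumptions for illustration, not the paper's implementation):

        # Minimal sketch of a rule-based irrigation/fertilization controller with an
        # SMS alert stub. Thresholds and sensor names are assumed for illustration.
        RULES = [  # (condition on readings, action)
            (lambda r: r["soil_moisture"] < 0.20, "open irrigation valve"),
            (lambda r: r["soil_moisture"] > 0.45, "close irrigation valve"),
            (lambda r: r["soil_n_ppm"] < 15.0,    "dose fertilizer"),
            (lambda r: r["soil_ph"] < 5.5 or r["soil_ph"] > 7.5,
                                                  "alert farmer: soil pH out of range"),
        ]

        def send_sms(message):                    # stub for the text-message alert
            print("SMS to farmer:", message)

        def evaluate(readings):
            for condition, action in RULES:
                if condition(readings):
                    if action.startswith("alert"):
                        send_sms(action)
                    else:
                        print("actuate:", action)

        evaluate({"soil_moisture": 0.16, "soil_n_ppm": 22.0, "soil_ph": 5.1})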

  8. Measuring Up: Implementing a Dental Quality Measure in the Electronic Health Record Context

    PubMed Central

    Bhardwaj, Aarti; Ramoni, Rachel; Kalenderian, Elsbeth; Neumann, Ana; Hebballi, Nutan B; White, Joel M; McClellan, Lyle; Walji, Muhammad F

    2015-01-01

    Background Quality improvement requires quality measures that are validly implementable. In this work, we assessed the feasibility and performance of an automated electronic Meaningful Use dental clinical quality measure (percentage of children who received fluoride varnish). Methods We defined how to implement the automated measure queries in a dental electronic health record (EHR). Within records identified through automated query, we manually reviewed a subsample to assess the performance of the query. Results The automated query found 71.0% of patients to have had fluoride varnish compared to 77.6% found using the manual chart review. The automated quality measure performance was 90.5% sensitivity, 90.8% specificity, 96.9% positive predictive value, and 75.2% negative predictive value. Conclusions Our findings support the feasibility of automated dental quality measure queries in the context of sufficient structured data. Information noted only in the free text rather than in structured data would require natural language processing approaches to effectively query. Practical Implications To participate in self-directed quality improvement, dental clinicians must embrace the accountability era. Commitment to quality will require enhanced documentation in order to support near-term automated calculation of quality measures. PMID:26562736
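
    The reported performance figures are standard confusion-matrix arithmetic, as in this worked sketch (the counts are illustrative, not the study's raw numbers):

        # Standard confusion-matrix arithmetic behind the reported performance
        # figures. Counts here are illustrative, not the study's raw numbers.
        def measure_performance(tp, fp, tn, fn):
            return {
                "sensitivity": tp / (tp + fn),   # fraction of true cases found
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),           # positive predictive value
                "npv": tn / (tn + fn),           # negative predictive value
            }

        for name, value in measure_performance(tp=85, fp=5, tn=45, fn=15).items():
            print(f"{name}: {100 * value:.1f}%")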

  9. Systematic Assessment of the Hemolysis Index: Pros and Cons.

    PubMed

    Lippi, Giuseppe

    2015-01-01

    Preanalytical quality is as important as the analytical and postanalytical quality in laboratory diagnostics. After decades of visual inspection to establish whether or not a diagnostic sample may be suitable for testing, automated assessment of hemolysis index (HI) has now become available in a large number of laboratory analyzers. Although most national and international guidelines support systematic assessment of sample quality via HI, there is widespread perception that this indication has not been thoughtfully acknowledged. Potential explanations include concern of increased specimen rejection rate, poor harmonization of analytical techniques, lack of standardized units of measure, differences in instrument-specific cutoff, negative impact on throughput, organization and laboratory economics, and lack of a reliable quality control system. Many of these concerns have been addressed. Evidence now supports automated HI in improving quality and patient safety. These will be discussed. © 2015 Elsevier Inc. All rights reserved.

  10. The Surface Ocean CO2 Atlas: Stewarding Underway Carbon Data from Collection to Archival

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Smith, K. M.; Pfeil, B.; Landa, C.; Bakker, D. C. E.; Olsen, A.; Jones, S.; Shrestha, B.; Kozyr, A.; Manke, A. B.; Schweitzer, R.; Burger, E. F.

    2016-02-01

    The Surface Ocean CO2 Atlas (SOCAT, www.socat.info) is a quality-controlled, global surface ocean carbon dioxide (CO2) data set gathered on research vessels, ships of opportunity (SOOP), and buoys. To the degree feasible, SOCAT is comprehensive; it draws together and applies uniform QC procedures to all such observations made across the international community. The first version of SOCAT (version 1.5) was publicly released in September 2011 (Bakker et al., 2011) with 6.3 million observations. This was followed by the release of SOCAT version 2, expanded to over 10 million observations, in June 2013 (Bakker et al., 2013). Most recently, in September 2015, SOCAT version 3 was released, containing over 14 million observations spanning almost 60 years. The process of assembling, QC'ing, and publishing versions 1.5 and 2 of SOCAT required an unsustainable level of manual effort. To ease the burden on data managers and data providers, the SOCAT community agreed to embark on an automated data ingestion process that would create a streamlined workflow to improve data stewardship from ingestion to quality control and from publishing to archival. To that end, for version 3 and beyond, the SOCAT automation team created a framework based upon standards and conventions that nevertheless allows scientists to work in the data formats they feel most comfortable with (i.e., CSV files). This automated workflow provides several advantages: 1) data ingestion into uniform and standards-based file formats; 2) ease of data integration into a standard quality control system; 3) the ability to perform data ingestion and quality control in parallel; and 4) a uniform method of archiving carbon data and generating digital object identifiers (DOIs). In this presentation, we will discuss and demonstrate the SOCAT data ingestion dashboard and the quality control system. We will also discuss the standards, conventions, and tools that were leveraged to create a workflow that allows scientists to work in their own formats, yet provides a framework for creating high-quality data products on an annual basis, while meeting or exceeding data requirements for access, documentation, and archival.
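
    The key enabling step in such a workflow is normalizing heterogeneous contributor files into a standard vocabulary before they enter the QC system. The sketch below shows the idea for a toy CSV; the column mapping and names are invented for illustration and are not SOCAT's actual conventions.

```python
import csv, io

# Hypothetical mapping from contributor column names to a standard
# vocabulary; the real SOCAT dashboard uses richer, convention-based
# metadata rather than a fixed table like this.
COLUMN_MAP = {"fco2": "fCO2_recommended_uatm",
              "sst": "temperature_degC",
              "sal": "salinity_pss"}

def ingest(csv_text: str) -> list[dict]:
    """Read a contributor CSV and rename its columns to standard names."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append({COLUMN_MAP.get(k.strip().lower(), k): float(v)
                     for k, v in row.items()})
    return rows

sample = "fco2,sst,sal\n380.2,18.4,35.1\n"
print(ingest(sample))
```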

  11. [Applications of the hospital statistics management system].

    PubMed

    Zhai, Hong; Ren, Yong; Liu, Jing; Li, You-Zhang; Ma, Xiao-Long; Jiao, Tao-Tao

    2008-01-01

    The Hospital Statistics Management System is built on an office automation platform of the Shandong provincial hospital system. Its workflow, role, and permission-management technologies are used to standardize and optimize the management program of statistics in the total quality control of hospital statistics. The system's applications have combined the office automation platform with statistics management in a hospital, and this provides a practical example of a modern hospital statistics management model.

  12. Integrated microreactor for enzymatic reaction automation: An easy step toward the quality control of monoclonal antibodies.

    PubMed

    Ladner, Yoann; Mas, Silvia; Coussot, Gaelle; Bartley, Killian; Montels, Jérôme; Morel, Jacques; Perrin, Catherine

    2017-12-15

    The main purpose of the present work is to provide a fully integrated, miniaturized electrophoretic methodology to facilitate the quality control of monoclonal antibodies (mAbs). This methodology, called D-PES (Diffusion-mediated Proteolysis combined with an Electrophoretic Separation), performs mAb tryptic digestion followed by electrophoretic separation of the proteolysis products in an automated manner. Tryptic digestion conditions were optimized with respect to enzyme concentration and incubation time in order to achieve enzymatic digestion efficiency similar to that obtained with the classical (off-line) methodology. The electrophoretic separation conditions were then optimized with respect to the nature of the background electrolyte (BGE), ionic strength, and pH. Successful and repeatable electrophoretic profiles of three mAb digests (Trastuzumab, Infliximab and Tocilizumab), comparable to the off-line digestion profiles, were obtained, demonstrating the feasibility and robustness of the proposed methodology. In summary, the proposed and optimized in-line approach opens a new, fast, and easy way for the quality control of mAbs. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    NASA Astrophysics Data System (ADS)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products entering the output process stream. Therefore, continuous sampling control of product completeness, based on statistical methods, remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A distinctive feature of continuous sampling control of product completeness during assembly is its destructive character: component parts cannot be returned to the process stream after sampling control, which reduces the actual productivity of the assembly equipment. The use of statistical procedures for continuous sampling control of product completeness on automatic rotor lines therefore requires sampling plans that ensure a minimum control sample size. Comparison of the limit values of the average outgoing defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 can provide lower limit values for the average outgoing defect level. The average sample size under the ACSP-1 plan is also smaller than under the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, based on the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and to ensure the required quality level of assembled products while minimizing sample size.
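
    For reference, CSP-1 (Dodge's classic continuous sampling plan, which the ACSP-1 discussed above modifies) alternates between 100% inspection and fractional sampling. Below is a minimal simulation; the clearance number and sampling fraction are chosen arbitrarily for illustration.

```python
import random

def csp1(units, i=50, f=0.1, seed=0):
    """Simulate Dodge's CSP-1 plan: inspect 100% of units until i
    consecutive conforming units are seen, then inspect only a random
    fraction f; any defect found during sampling restarts 100% inspection.
    Returns the number of units actually inspected."""
    rng = random.Random(seed)
    run, inspected, sampling = 0, 0, False
    for ok in units:                      # ok is True for a conforming unit
        if sampling:
            if rng.random() < f:          # unit selected for inspection
                inspected += 1
                if not ok:
                    sampling, run = False, 0
        else:
            inspected += 1
            run = run + 1 if ok else 0
            if run >= i:
                sampling = True
    return inspected

rng = random.Random(1)
stream = [rng.random() > 0.02 for _ in range(10_000)]   # ~2% defect rate
print(csp1(stream), "of", len(stream), "units inspected")
```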

  14. QUALITY ASSURANCE AND QUALITY CONTROL IN THE DEVELOPMENT AND APPLICATION OF THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA) TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving from simple, local-scale problems toward complex, spatially explicit regional ones. Such problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and t...

  15. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
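
    As a sketch of what a gravimetric QC calculation looks like in practice, the snippet below derives dispenser accuracy (relative error of the mean) and precision (%CV) from replicate weighings; the numbers are hypothetical.

```python
import statistics

def gravimetric_qc(masses_mg, target_mg):
    """Accuracy (relative error of the mean, %) and precision (%CV) of a
    liquid dispenser from replicate gravimetric weighings."""
    mean = statistics.mean(masses_mg)
    cv = 100 * statistics.stdev(masses_mg) / mean
    rel_err = 100 * (mean - target_mg) / target_mg
    return rel_err, cv

masses = [99.6, 100.4, 100.1, 99.8, 100.2]   # mg, hypothetical replicates
err, cv = gravimetric_qc(masses, target_mg=100.0)
print(f"relative error {err:+.2f}%  CV {cv:.2f}%")
```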

  16. Automated respiratory cycles selection is highly specific and improves respiratory mechanics analysis.

    PubMed

    Rigo, Vincent; Graas, Estelle; Rigo, Jacques

    2012-07-01

    Selected optimal respiratory cycles should allow calculation of respiratory mechanic parameters that focus on patient-ventilator interaction. New computer software that automatically selects optimal breaths, and the respiratory mechanics derived from those cycles, are evaluated. Retrospective study. University level III neonatal intensive care unit. Ten-minute synchronized intermittent mandatory ventilation and assist/control ventilation recordings from ten newborns. The ventilator provided respiratory mechanic data (ventilator respiratory cycles) every 10 secs. Pressure, flow, and volume waves and pressure-volume, pressure-flow, and volume-flow loops were reconstructed from continuous pressure-volume recordings. Visual assessment determined assisted, leak-free optimal respiratory cycles (selected respiratory cycles). New software graded the quality of cycles (automated respiratory cycles). Respiratory mechanic values were derived from both sets of optimal cycles. We evaluated quality selection and compared mean values and their variability according to ventilatory mode and respiratory mechanic provenance. To assess discriminating power, all 45 "t" values obtained from interpatient comparisons were compared for each respiratory mechanic parameter. A total of 11,724 breaths were evaluated. Agreement between automated and visually selected respiratory cycles is high: 88% of maximal κ with linear weighting. Specificity and positive predictive values are 0.98 and 0.96, respectively. Averaged values are similar between automated respiratory cycles and ventilator respiratory cycles. Only C20/C is markedly decreased in automated respiratory cycles (1.27 ± 0.37 vs. 1.81 ± 0.67). The apparent similarity of tidal volume disappears in assist/control: automated respiratory cycle tidal volume (4.8 ± 1.0 mL/kg) is significantly lower than for ventilator respiratory cycles (5.6 ± 1.8 mL/kg). Coefficients of variation decrease for all automated respiratory cycle parameters in all infants. "t" values from automated respiratory cycle data are two to three times higher than those from ventilator respiratory cycles. Automated selection is highly specific. Automated respiratory cycles best reflect the interaction between ventilator and patient. Improving the discriminating power of ventilator monitoring will likely help in assessing disease status and following trends. Averaged parameters derived from automated respiratory cycles are more precise and could be displayed by ventilators to improve real-time fine-tuning of ventilator settings.

  17. An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.

    PubMed

    Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao

    2016-09-01

    The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed that auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
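
    A common building block for handling interplate systematic offsets is putting every plate on a robust common scale before hit calling. The sketch below shows a per-plate robust z-score (median/MAD) normalization; this is a generic illustration of the idea, not the authors' pipeline.

```python
import numpy as np

def robust_z(plate):
    """Per-plate robust z-scores: subtract the plate median and divide by
    1.4826*MAD, a standard way to put plates on a common scale before hit
    calling (a generic step, not the paper's full auto-QC pipeline)."""
    med = np.median(plate)
    mad = np.median(np.abs(plate - med))
    return (plate - med) / (1.4826 * mad)

rng = np.random.default_rng(0)
plate = rng.normal(loc=1000, scale=50, size=(16, 24))   # 384-well plate
plate[5, 7] = 400                                       # a strong "hit"
z = robust_z(plate)
print("wells with |z| > 5:", np.argwhere(np.abs(z) > 5))
```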

  18. Method and system for automated on-chip material and structural certification of MEMS devices

    DOEpatents

    Sinclair, Michael B.; DeBoer, Maarten P.; Smith, Norman F.; Jensen, Brian D.; Miller, Samuel L.

    2003-05-20

    A new approach toward MEMS quality control and materials characterization is provided by a combined test structure measurement and mechanical response modeling approach. Simple test structures are cofabricated with the MEMS devices being produced. These test structures are designed to isolate certain types of physical response, so that measurement of their behavior under applied stress can be easily interpreted as quality control and material properties information.

  19. Real-time audiovisual feedback system in a physician-staffed helicopter emergency medical service in Finland: the quality results and barriers to implementation.

    PubMed

    Sainio, Marko; Kämäräinen, Antti; Huhtala, Heini; Aaltonen, Petri; Tenhunen, Jyrki; Olkkola, Klaus T; Hoppu, Sanna

    2013-07-01

    To evaluate the quality of cardiopulmonary resuscitation (CPR) in a physician-staffed helicopter emergency medical service (HEMS) using a monitor-defibrillator with a quality analysis feature. As a post hoc analysis, the potential barriers to implementation were surveyed. The quality of CPR performed by the HEMS from November 2008 to April 2010 was analysed. To evaluate the implementation rate of quality analysis, the HEMS database was screened for all cardiac arrest missions during the study period. As a consequence of the observed low implementation rate, a survey was sent to physicians working in the HEMS to evaluate the possible reasons for not utilizing the automated quality analysis feature. During the study period, the quality analysis was used for 52 out of 187 patients (28%). In these cases the mean compression depth was < 40 mm in 46% and < 50 mm in 96% of the 1-min analysis intervals, but otherwise CPR quality corresponded with the 2005 resuscitation guidelines. In particular, the no-flow fraction was remarkably low at 0.10 (0.07, 0.16). The most common reasons for not using quality-controlled CPR were that the device itself was not taken to the scene, or was not applied to the patient because another EMS unit was already treating the patient with another defibrillator. When quality-controlled CPR technology was used, the indicators of good-quality CPR as described in the 2005 resuscitation guidelines were mostly achieved, albeit with insufficient compression depth. The actual use of this well-described technology for improving patient care was low. Wider implementation of the automated quality control and feedback feature in defibrillators could further improve the quality of CPR in the field. ClinicalTrials.gov (NCT00951704).

  20. [The comprehensive approach to ensure the quality of forensic medical examination of a cadaver].

    PubMed

    Mel'nikov, O V; Mal'tsev, A E; Petrov, S B; Petrov, B A

    2015-01-01

    The objective of the present work was to estimate the effectiveness of a comprehensive monitoring system designed to enhance the quality of forensic medical expertise in determining the cause of death in hanging cases. It was shown that the practical application of the algorithmization and automated quality control system improves the effectiveness of forensic medical examination of cadavers in hanging cases. The system performs control, directing, and teaching functions. Moreover, it makes it possible to assess the completeness of the examination of the cadaver.

  1. Beware of agents when flying aircraft: Basic principles behind a generic methodology for the evaluation and certification of advanced aviation systems

    NASA Technical Reports Server (NTRS)

    Javaux, Denis; Masson, Michel; Dekeyser, Veronique

    1994-01-01

    There is currently a growing interest in the aeronautical community in assessing the effects of increasing levels of automation on pilots' performance and overall safety. The first effect of automation is the change in the nature of the pilot's role on the flight deck. Pilots have become supervisors who monitor aircraft systems in usual situations and intervene only when unanticipated events occur. Instead of 'hand flying' the airplane, pilots contribute to the control of the aircraft by acting as mediators, giving instructions to the automation. By eliminating the need for manual control in normal situations, such a role division has reduced the opportunities for the pilot to acquire the experience and skills necessary to safely cope with abnormal events. Difficulties in assessing the state and behavior of automation arise mainly from four factors: (1) the complexity of current systems and the consequent mode-related problems; (2) the intrinsic autonomy of automation, which is able to fire mode transitions without explicit commands from the pilots; (3) the poor quality of feedback from the control systems' displays and interfaces to the pilots; and (4) the fact that the automation currently has no explicit representation of the pilots' current intentions and strategy. Assuming certification has among its major goals to guarantee the passengers' and pilots' safety and the airplane's integrity under normal and abnormal operational conditions, the authors suggest it would be particularly fruitful to come up with a conceptual reference system providing the certification authorities with both a theoretical framework and a list of principles usable for assessing the quality of the equipment and designs under examination. This is precisely the scope of this paper. However, the authors recognize that the conceptual framework presented is still under development and would thus be best considered as a source of reflection for the design, evaluation, and certification processes of advanced aviation technologies.

  2. User’s manual for the Automated Data Assurance and Management application developed for quality control of Everglades Depth Estimation Network water-level data

    USGS Publications Warehouse

    Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul

    2016-09-29

    The generation of Everglades Depth Estimation Network (EDEN) daily water-level and water-depth maps is dependent on high-quality real-time data from over 240 water-level stations. To increase the accuracy of the daily water-surface maps, the Automated Data Assurance and Management (ADAM) tool was created by the U.S. Geological Survey as part of Greater Everglades Priority Ecosystems Science. The ADAM tool is used to provide accurate quality-assurance review of the real-time data from the EDEN network and allows estimation or replacement of missing or erroneous data. This user’s manual describes how to install and operate the ADAM software. The file structure and operation of the ADAM software are explained using examples.

  3. A Fully Automated Drosophila Olfactory Classical Conditioning and Testing System for Behavioral Learning and Memory Assessment

    PubMed Central

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal

    2016-01-01

    Background: Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method: The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results: The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods: The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions: This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. PMID:26703418

  4. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    PubMed

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Quality specification in haematology: the automated blood cell count.

    PubMed

    Buttarello, Mauro

    2004-08-02

    Quality specifications for automated blood cell counts include topics that go beyond the traditional analytic stage (imprecision, inaccuracy, quality control) and extend to the pre- and post-analytic phases. In this review, pre-analytic aspects concerning the choice of anticoagulants, maximum storage times, and differences between storage at room temperature and at 4 degrees C are considered. For the analytic phase, goals for imprecision and bias obtained with various approaches (ratio to biologic variation, state of the art, specific clinical situations) are evaluated. For the post-analytic phase, medical review criteria (algorithm, decision limit and delta check) and the structure of the report (general part and comments), which constitutes the formal act through which a laboratory communicates with clinicians, are considered. K2EDTA is considered the anticoagulant of choice for automated cell counts. Regarding storage, specimens should be analyzed as soon as possible. Storage at 4 degrees C may stabilize specimens for 24 to 72 h when a complete blood count (CBC) and differential leucocyte count (DLC) are performed. For precision, analytical goals based on the state of the art are acceptable, while for bias this is satisfactory only for some parameters. In haematology, quality specifications for the pre-analytic and analytic phases are important, but the review criteria and the quality of the report play a central role in assuring a definite clinical value.

  6. SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Bri-Mathias; Palmintier, Bryan

    This presentation provides an overview of full-scale, high-quality, synthetic distribution system data set(s) for testing distribution automation algorithms, distributed control approaches, ADMS capabilities, and other emerging distribution technologies.

  7. [Features of the maintenance of automated developing machines].

    PubMed

    Koveshnikov, A I

    1999-01-01

    Based on his own long-term experience, the author gives recommendations on the assembly, adjustment, operation, and preventive maintenance of automatic developing machines. Procedures are presented for evaluating the quality of X-ray films and for monitoring the activity of the developer during the processing of photographic materials. Typical equipment malfunctions that affect the quality of film development, and the procedures for eliminating them, are described.

  8. SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.

    PubMed

    Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao

    2014-08-08

    Next-generation sequencers (NGSs) have become one of the main tools of current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in the sequencing fluidics. We developed SUGAR (subtile-based GUI-assisted refiner), software that can handle ultra-high-throughput data with a user-friendly graphical user interface (GUI) and interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signals of technical errors during the sequencing. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function based on sequence read quality (Phred) scores was applied to public whole human genome sequencing data, and we showed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR would reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. The software will therefore be especially useful for controlling the quality of variant calls for low-abundance cell populations, e.g., cancer cells, in samples affected by technical errors in the sequencing procedure.
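
    For orientation, Phred-score-based cleaning rests on a simple decoding of FASTQ quality strings. A minimal sketch using the standard Phred+33 (Sanger/Illumina 1.8+) encoding; the threshold is an arbitrary placeholder, not SUGAR's setting.

```python
def mean_phred(qual: str, offset: int = 33) -> float:
    """Mean Phred score of one read's quality string, where each
    character's ASCII value equals Phred score + 33."""
    return sum(ord(c) - offset for c in qual) / len(qual)

def keep_read(qual: str, threshold: float = 25.0) -> bool:
    """Keep the read only if its mean Phred score clears the threshold."""
    return mean_phred(qual) >= threshold

print(mean_phred("IIIIHHGG##"))   # low-quality tail drags the mean down
print(keep_read("IIIIHHGG##"))
```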

  9. Automation effects in a stereotypical multiloop manual control system. [for aircraft

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Mcnally, B. D.

    1984-01-01

    The increasing reliance of state-of-the-art, high-performance aircraft on high-authority stability and command augmentation systems to obtain satisfactory performance and handling qualities has made it critical to achieve a better understanding of human capabilities, limitations, and preferences during interactions with complex dynamic systems that involve task allocation between man and machine. An analytical and experimental study has been undertaken to investigate human interaction with a simple, multiloop dynamic system in which human activity was systematically varied by changing the levels of automation. The task definition led to a control loop structure which parallels that of any multiloop manual control system, and may therefore be considered a stereotype.

  10. Automated generation of weld path trajectories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sizemore, John M.; Hinman-Sweeney, Elaine Marie; Ames, Arlo Leroy

    2003-06-01

    AUTOmated GENeration of Control Programs for Robotic Welding of Ship Structure (AUTOGEN) is software that automates the planning and compiling of control programs for robotic welding of ship structure. The software works by evaluating computer representations of the ship design and the manufacturing plan. Based on this evaluation, AUTOGEN internally identifies and appropriately characterizes each weld. Then it constructs the robot motions necessary to accomplish the welds and determines for each the correct assignment of process control values. AUTOGEN generates these robot control programs completely without manual intervention or edits except to correct wrong or missing input data. Most ship structure assemblies are unique or at best manufactured only a few times. Accordingly, the high cost inherent in all previous methods of preparing complex control programs has made robot welding of ship structures economically unattractive to the U.S. shipbuilding industry. AUTOGEN eliminates the cost of creating robot control programs. With programming costs eliminated, capitalization of robots to weld ship structures becomes economically viable. Robot welding of ship structures will result in reduced ship costs, uniform product quality, and enhanced worker safety. Sandia National Laboratories and Northrop Grumman Ship Systems worked with the National Shipbuilding Research Program to develop a means of automated path and process generation for robotic welding. This effort resulted in the AUTOGEN program, which has successfully demonstrated automated path generation and robot control. Although the current implementation of AUTOGEN is optimized for welding applications, the path and process planning capability has applicability to a number of industrial applications, including painting, riveting, and adhesive delivery.

  11. System for verifiable CT radiation dose optimization based on image quality. part II. process control system.

    PubMed

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU (P < .01) and the standard deviation decreased from 3.9 to 1.6 HU (P < .01). Mean SSDE decreased from 11.9 to 7.5 mGy, a 37% reduction (P < .01). The process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
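
    The statistical process control chart mentioned above is typically an individuals (X-mR) chart, whose control limits derive from the average moving range. A minimal sketch with hypothetical SSDE values (the data and limits are invented for illustration, not the study's):

```python
import statistics

def individuals_limits(values):
    """Control limits for an individuals (X-mR) chart: mean +/- 2.66 times
    the average moving range (2.66 is the standard Shewhart constant)."""
    mr = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = statistics.mean(mr)
    centre = statistics.mean(values)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

ssde = [11.8, 12.3, 11.5, 12.0, 7.6, 7.4, 7.8, 7.5]   # hypothetical mGy
lcl, centre, ucl = individuals_limits(ssde)
print(f"LCL={lcl:.1f}  centre={centre:.1f}  UCL={ucl:.1f}")
print("special-cause points:", [x for x in ssde if not lcl <= x <= ucl])
```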

  12. Systems and methods for data quality control and cleansing

    DOEpatents

    Wenzel, Michael; Boettcher, Andrew; Drees, Kirk; Kummer, James

    2016-05-31

    A method for detecting and cleansing suspect building automation system data is shown and described. The method includes using processing electronics to automatically determine which of a plurality of error detectors and which of a plurality of data cleansers to use with building automation system data. The method further includes using processing electronics to automatically detect errors in the data and cleanse the data using a subset of the error detectors and a subset of the cleansers.
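
    Below is a minimal sketch of the detect-then-cleanse pattern the patent describes, using one simple range-check detector and one linear-interpolation cleanser. Both are chosen arbitrarily here; the patent covers automatically selecting among multiple detectors and cleansers.

```python
import math

def detect_errors(series, lo=-40.0, hi=60.0):
    """Flag indices that are missing or fail a simple range check.
    This is just one possible 'error detector' for building data."""
    return [i for i, v in enumerate(series)
            if v is None or math.isnan(v) or not lo <= v <= hi]

def cleanse(series, bad):
    """Replace flagged points by linear interpolation between the nearest
    good neighbours (assumes good values exist on both sides)."""
    out = list(series)
    for i in bad:
        j = next(k for k in range(i - 1, -1, -1) if k not in bad)
        k = next(k for k in range(i + 1, len(out)) if k not in bad)
        out[i] = out[j] + (out[k] - out[j]) * (i - j) / (k - j)
    return out

temps = [21.0, 21.2, float("nan"), 21.6, 999.0, 21.9]   # sensor readings
bad = detect_errors(temps)
print(bad, cleanse(temps, bad))   # -> [2, 4] and interpolated values
```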

  13. A real-time automated quality control of rain gauge data based on multiple sensors

    NASA Astrophysics Data System (ADS)

    qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, so their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regime, and freezing-level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a physically more realistic spatial distribution than the old product. The new product also agrees much better statistically with the independent gauges.
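
    The core of a radar-gauge consistency check can be sketched as flagging a gauge whose hourly value disagrees with the collocated radar QPE beyond combined absolute and relative margins. The thresholds below are invented placeholders; the actual NMQ/Q2 scheme also conditions on radar sampling geometry, precipitation regime, and freezing-level height.

```python
def qc_gauge(gauge_mm, radar_mm, abs_tol=2.0, rel_tol=0.5):
    """Return True if the hourly gauge value is suspect, i.e. it differs
    from the collocated radar QPE by more than both an absolute and a
    relative margin (illustrative thresholds only)."""
    diff = abs(gauge_mm - radar_mm)
    rel = diff / max(gauge_mm, radar_mm, 0.1)   # guard against 0/0
    return diff > abs_tol and rel > rel_tol

pairs = [(0.0, 5.2),    # radar sees rain, gauge reads zero: stuck gauge?
         (10.1, 9.4),   # small disagreement: acceptable
         (3.0, 0.2)]    # gauge reports rain the radar cannot see
for g, r in pairs:
    print(g, r, "suspect" if qc_gauge(g, r) else "ok")
```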

  14. [Adaptation of the (18)FDG module for the preparation of a sodium fluoride [(18)F] injection solution in agreement with the United States (USP 32) and European Pharmacopeia (PhEur 6)].

    PubMed

    Martínez, T; Cordero, B; Medín, S; Sánchez Salmón, A

    2011-01-01

    To establish an automated procedure for the preparation of sodium fluoride ((18)F) injection using the resources available in our laboratory for the preparation of (18)FDG, and to analyze the effect of the conditioning of the fluoride-ion trapping column on the characteristics of the final product. The sequence of an (18)FDG synthesis module was adapted so that the fluoride ion from the cyclotron is trapped in an ion-exchange resin and diluted with 0.9% sodium chloride. The final solution was dispensed and sterilized in a final vial in an automated dispensing module. Three different column conditioning protocols within the process were tested. Quality controls were run according to USP 32 and EurPh 6, adding control of residual ethanol solvent levels and quality controls of the solution at 8 h post-preparation. Activation of the resin cartridges with ethanol and water was the chosen procedure, with fluoride-ion trapping > 95% and pH around 7. Ethanol levels were < 5,000 ppm. Quality controls at 8 h indicated that the solution was in compliance with the USP 32 and EurPh 6 specifications. This is an easy, low-cost, reliable automated method for sodium fluoride preparation in PET facilities with existing equipment for (18)FDG synthesis and quality control. Copyright © 2010 Elsevier España, S.L. y SEMNIM. All rights reserved.

  15. Does Automated Feedback Improve Writing Quality?

    ERIC Educational Resources Information Center

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  16. Automated reference-free detection of motion artifacts in magnetic resonance images.

    PubMed

    Küstner, Thomas; Liebgott, Annika; Mauch, Lukas; Martirosian, Petros; Bamberg, Fabian; Nikolaou, Konstantin; Yang, Bin; Schick, Fritz; Gatidis, Sergios

    2018-04-01

    Our objectives were to provide an automated method for spatially resolved detection and quantification of motion artifacts in MR images of the head and abdomen as well as a quality control of the trained architecture. T1-weighted MR images of the head and the upper abdomen were acquired in 16 healthy volunteers under rest and under motion. Images were divided into overlapping patches of different sizes achieving spatial separation. Using these patches as input data, a convolutional neural network (CNN) was trained to derive probability maps for the presence of motion artifacts. A deep visualization offers a human-interpretable quality control of the trained CNN. Results were visually assessed on probability maps and as classification accuracy on a per-patch, per-slice and per-volunteer basis. On visual assessment, a clear difference of probability maps was observed between data sets with and without motion. The overall accuracy of motion detection on a per-patch/per-volunteer basis reached 97%/100% in the head and 75%/100% in the abdomen, respectively. Automated detection of motion artifacts in MRI is feasible with good accuracy in the head and abdomen. The proposed method provides quantification and localization of artifacts as well as a visualization of the learned content. It may be extended to other anatomic areas and used for quality assurance of MR images.
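
    The patch-based strategy described above reduces to a standard sliding-window decomposition of the image, with each patch scored by the CNN and the scores reassembled into a spatial probability map. A minimal sketch follows; the patch size and stride are illustrative, not the paper's settings.

```python
import numpy as np

def patches(img, size=32, stride=16):
    """Divide a 2-D image into overlapping square patches (stride < size
    gives the overlap). Returns (top-left corner, patch) pairs so the
    per-patch scores can later be mapped back to image coordinates."""
    h, w = img.shape
    out = []
    for y in range(0, h - size + 1, stride):
        for x in range(0, w - size + 1, stride):
            out.append(((y, x), img[y:y + size, x:x + size]))
    return out

img = np.zeros((128, 128), dtype=np.float32)   # stand-in for an MR slice
print(len(patches(img)), "patches of 32x32 with 50% overlap")   # -> 49
```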

  17. Technology Transfer Opportunities: Automated Ground-Water Monitoring

    USGS Publications Warehouse

    Smith, Kirk P.; Granato, Gregory E.

    1997-01-01

    Introduction: A new automated ground-water monitoring system developed by the U.S. Geological Survey (USGS) measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automation of water-quality monitoring systems in the field, in laboratories, and in industry has increased data density and utility while reducing operating costs. Uses for an automated ground-water monitoring system include (but are not limited to) monitoring ground-water quality for research; monitoring known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored; and serving as an early-warning system monitoring ground-water quality near public water-supply wells.

  18. AutoLock: a semiautomated system for radiotherapy treatment plan quality control

    PubMed Central

    Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.

    2015-01-01

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock. PACS number: 87.55.Qr PMID:26103498

  19. AutoLock: a semiautomated system for radiotherapy treatment plan quality control.

    PubMed

    Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G

    2015-05-08

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.

  20. Automated Data Quality Assurance using OGC Sensor Web Enablement Frameworks for Marine Observatories

    NASA Astrophysics Data System (ADS)

    Toma, Daniel; Bghiel, Ikram; del Rio, Joaquin; Hidalgo, Alberto; Carreras, Normandino; Manuel, Antoni

    2014-05-01

    Over the past years, environmental sensors have continuously improved by becoming smaller, cheaper, and more intelligent, and many sensor networks are now deployed to monitor our environment. But due to the large number of sensor manufacturers and their accompanying protocols and data encodings, automated integration and data quality assurance of diverse sensors in an observing system is not straightforward, requiring the development of data-management code and tedious manual configuration. Over the past few years, however, it has been demonstrated that Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) frameworks can enable web services with fully described sensor systems, including data processing, sensor characteristics, and quality control tests and results. So far, the SWE framework does not describe how to integrate sensors on the fly with minimal human intervention: the data-management software that enables access to sensors, data processing, and quality control tests has to be implemented, and the results have to be manually mapped to the SWE models. In this contribution, we describe a Sensor Plug & Play infrastructure for the Sensor Web combining (1) the OGC PUCK protocol, a simple standard embedded instrument protocol to store on and retrieve directly from the devices a declarative description of sensor characteristics and quality control tests; (2) an automatic mechanism for data processing and quality control tests underlying the Sensor Web, the Sensor Interface Descriptor (SID) concept; and (3) a model for the declarative description of a sensor that serves as a generic data-management mechanism, designed as a profile and extension of OGC SWE's SensorML standard. We implement and evaluate our approach by applying it to the OBSEA Observatory, demonstrating the ability to assess data quality for temperature, salinity, air pressure, and wind speed and direction observations off the coast of Garraf in north-eastern Spain.

  1. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast, and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check, and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator, it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment has increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. One regular customer is already the developer group of the DiFX software correlator project.

  2. Measuring up: Implementing a dental quality measure in the electronic health record context.

    PubMed

    Bhardwaj, Aarti; Ramoni, Rachel; Kalenderian, Elsbeth; Neumann, Ana; Hebballi, Nutan B; White, Joel M; McClellan, Lyle; Walji, Muhammad F

    2016-01-01

    Quality improvement requires using quality measures that can be implemented in a valid manner. Using guidelines set forth by the Meaningful Use portion of the Health Information Technology for Economic and Clinical Health Act, the authors assessed the feasibility and performance of an automated electronic Meaningful Use dental clinical quality measure to determine the percentage of children who received fluoride varnish. The authors defined how to implement the automated measure queries in a dental electronic health record. Within records identified through automated query, the authors manually reviewed a subsample to assess the performance of the query. The automated query results revealed that 71.0% of patients had fluoride varnish compared with the manual chart review results that indicated 77.6% of patients had fluoride varnish. The automated quality measure performance results indicated 90.5% sensitivity, 90.8% specificity, 96.9% positive predictive value, and 75.2% negative predictive value. The authors' findings support the feasibility of using automated dental quality measure queries in the context of sufficient structured data. Information noted only in free text rather than in structured data would require using natural language processing approaches to effectively query electronic health records. To participate in self-directed quality improvement, dental clinicians must embrace the accountability era. Commitment to quality will require enhanced documentation to support near-term automated calculation of quality measures. Copyright © 2016 American Dental Association. Published by Elsevier Inc. All rights reserved.

  3. The Stanford Automated Mounter: Enabling High-Throughput Protein Crystal Screening at SSRL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, C.A.; Cohen, A.E.

    2009-05-26

    The macromolecular crystallography experiment lends itself perfectly to high-throughput technologies. The initial steps including the expression, purification, and crystallization of protein crystals, along with some of the later steps involving data processing and structure determination have all been automated to the point where some of the last remaining bottlenecks in the process have been crystal mounting, crystal screening, and data collection. At the Stanford Synchrotron Radiation Laboratory, a National User Facility that provides extremely brilliant X-ray photon beams for use in materials science, environmental science, and structural biology research, the incorporation of advanced robotics has enabled crystals to be screened in a true high-throughput fashion, thus dramatically accelerating the final steps. Up to 288 frozen crystals can be mounted by the beamline robot (the Stanford Auto-Mounting System) and screened for diffraction quality in a matter of hours without intervention. The best quality crystals can then be remounted for the collection of complete X-ray diffraction data sets. Furthermore, the entire screening and data collection experiment can be controlled from the experimenter's home laboratory by means of advanced software tools that enable network-based control of the highly automated beamlines.

  4. Online, offline, realtime: recent developments in industrial photogrammetry

    NASA Astrophysics Data System (ADS)

    Boesemann, Werner

    2003-01-01

    In recent years industrial photogrammetry has emerged from a highly specialized niche technology to a well-established tool in industrial coordinate measurement applications, with numerous installations in a significantly growing market of flexible and portable optical measurement systems. This is due to the development of powerful but affordable video and computer technology. The increasing industrial requirements for accuracy, speed, robustness, and ease of use of these systems, together with a demand for the highest possible degree of automation, have forced universities and system manufacturers to develop hardware and software solutions to meet these requirements. The presentation will show the latest trends in hardware development, especially new-generation digital and/or intelligent cameras; aspects of image engineering, such as the use of controlled illumination or projection technologies; and algorithmic and software aspects, such as automation strategies or new camera models. The basic qualities of digital photogrammetry - portability and flexibility on the one hand and fully automated quality control on the other - sometimes lead to certain conflicts in the design of measurement systems for different online, offline, or real-time solutions. The presentation will further show how these tools and methods are combined in different configurations to cover the still-growing demands of industrial end-users.

  5. Photogrammetry in the line: recent developments in industrial photogrammetry

    NASA Astrophysics Data System (ADS)

    Boesemann, Werner

    2003-05-01

    In recent years industrial photogrammetry has emerged from a highly specialized niche technology to a well-established tool in industrial coordinate measurement applications, with numerous installations in a significantly growing market of flexible and portable optical measurement systems. This is due to the development of powerful but affordable video and computer technology. The increasing industrial requirements for accuracy, speed, robustness, and ease of use of these systems, together with a demand for the highest possible degree of automation, have forced universities and system manufacturers to develop hardware and software solutions to meet these requirements. The presentation will show the latest trends in hardware development, especially new-generation digital and/or intelligent cameras; aspects of image engineering, such as the use of controlled illumination or projection technologies; and algorithmic and software aspects, such as automation strategies or new camera models. The basic qualities of digital photogrammetry - portability and flexibility on the one hand and fully automated quality control on the other - sometimes lead to certain conflicts in the design of measurement systems for different online, offline, or real-time solutions. The presentation will further show how these tools and methods are combined in different configurations to cover the still-growing demands of industrial end-users.

  6. An automated workflow for patient-specific quality control of contour propagation

    NASA Astrophysics Data System (ADS)

    Beasley, William J.; McWilliam, Alan; Slevin, Nicholas J.; Mackay, Ranald I.; van Herk, Marcel

    2016-12-01

    Contour propagation is an essential component of adaptive radiotherapy, but current contour propagation algorithms are not yet sufficiently accurate to be used without manual supervision. Manual review of propagated contours is time-consuming, making routine implementation of real-time adaptive radiotherapy unrealistic. Automated methods of monitoring the performance of contour propagation algorithms are therefore required. We have developed an automated workflow for patient-specific quality control of contour propagation and validated it on a cohort of head and neck patients, in which the parotids were outlined by two observers. Two types of error were simulated: mislabelling of contours and introducing noise in the scans before propagation. The ability of the workflow to correctly predict the occurrence of errors was tested, taking both sets of observer contours as ground truth, using receiver operating characteristic (ROC) analysis. The area under the curve was 0.90 and 0.85 for the two observers, indicating good ability to predict the occurrence of errors. This tool could potentially be used to identify propagated contours that are likely to be incorrect, acting as a flag for manual review of these contours. This would make contour propagation more efficient, facilitating the routine implementation of adaptive radiotherapy.
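
    For readers less familiar with ROC analysis in this QC setting, the snippet below computes an area under the curve from a per-contour error score against observer ground truth. The scores and labels are invented placeholders, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical data: 1 = propagated contour actually erroneous (per the
# observer ground truth), paired with the workflow's per-contour error score.
truth = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 0])
score = np.array([0.1, 0.3, 0.8, 0.2, 0.9, 0.35, 0.4, 0.7, 0.2, 0.1])

# AUC = probability that a randomly chosen erroneous contour scores
# higher than a randomly chosen correct one.
print(f"AUC = {roc_auc_score(truth, score):.2f}")
```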

  7. Does bacteriology laboratory automation reduce time to results and increase quality management?

    PubMed

    Dauwalder, O; Landrieve, L; Laurent, F; de Montclos, M; Vandenesch, F; Lina, G

    2016-03-01

    Due to reductions in financial and human resources, many microbiological laboratories have merged to build very large clinical microbiology laboratories, which allow the use of fully automated laboratory instruments. For clinical chemistry and haematology, automation has reduced the time to results and improved the management of laboratory quality. The aim of this review was to examine whether fully automated laboratory instruments for microbiology can reduce time to results and impact quality management. This study focused on solutions that are currently available, including the BD Kiestra™ Work Cell Automation and Total Lab Automation and the Copan WASPLab(®). Copyright © 2015 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  8. Managing laboratory automation in a changing pharmaceutical industry

    PubMed Central

    Rutherford, Michael L.

    1995-01-01

    The health care reform movement in the USA and increased requirements by regulatory agencies continue to have a major impact on the pharmaceutical industry and the laboratory. Laboratory management is expected to improve efficiency by providing more analytical results at a lower cost, increasing customer service, and reducing cycle time, while ensuring accurate results and more effective use of staff. To achieve these expectations, many laboratories are using robotics and automated workstations. Establishing automated systems presents many challenges for laboratory management, including project and hardware selection, budget justification, implementation, validation, training, and support. To address these management challenges, the rationale for project selection and implementation, the obstacles encountered, the project outcomes, and learning points for several automated systems recently implemented in the Quality Control Laboratories at Eli Lilly are presented. PMID:18925014

  9. Advanced Manufacturing Systems in Food Processing and Packaging Industry

    NASA Astrophysics Data System (ADS)

    Shafie Sani, Mohd; Aziz, Faieza Abdul

    2013-06-01

    In this paper, several advanced manufacturing systems in the food processing and packaging industry are reviewed, including biodegradable smart packaging and nanocomposites, and advanced automation control systems comprising fieldbus technology, distributed control systems, and food safety inspection features. The main purpose of current technology in the food processing and packaging industry is discussed in light of the major concerns of plant process efficiency, productivity, quality, and safety. These applications were chosen because they are robust, flexible, reconfigurable, and efficient, and they preserve the quality of the food.

  10. Automated quality control for stitching of textile articles

    NASA Technical Reports Server (NTRS)

    Miller, Jeffrey L. (Inventor); Markus, Alan (Inventor)

    1999-01-01

    Quality control for stitching of a textile article is performed by measuring thread tension in the stitches as the stitches are being made, determining locations of the stitches, and generating a map including the locations and stitching data derived from the measured thread tensions. The stitching data can be analyzed, off-line or in real time, to identify defective stitches. Defective stitches can then be repaired. Real time analysis of the thread tensions allows problems such as broken needle threads to be corrected immediately.
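
    As a rough illustration of the scheme the patent describes (measure tension per stitch, map it to stitch locations, flag defects), the sketch below applies an acceptance band plus a broken-thread heuristic; the tension limits and run length are invented for illustration.

        import numpy as np

        def flag_defective_stitches(tensions, low=0.5, high=2.0):
            """Flag stitches whose thread tension (N) leaves the acceptance band;
            a run of three near-zero readings is treated as a broken thread."""
            tensions = np.asarray(tensions, float)
            out_of_band = (tensions < low) | (tensions > high)
            near_zero = (tensions < 0.05).astype(int)
            broken = np.convolve(near_zero, np.ones(3, int), "same") >= 3
            return out_of_band | broken

        # Build the map of stitch locations with derived stitching data.
        locations = [(i, 0.0) for i in range(8)]
        tensions = [1.2, 1.1, 2.6, 1.0, 0.01, 0.0, 0.02, 1.3]
        stitch_map = [
            {"xy": loc, "tension": t, "defective": bool(d)}
            for loc, t, d in zip(locations, tensions, flag_defective_stitches(tensions))
        ]
        print([s["xy"] for s in stitch_map if s["defective"]])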

  11. Crossword: A Fully Automated Algorithm for the Segmentation and Quality Control of Protein Microarray Images

    PubMed Central

    2015-01-01

    Biological assays formatted as microarrays have become a critical tool for the generation of the comprehensive data sets required for systems-level understanding of biological processes. Manual annotation of data extracted from images of microarrays, however, remains a significant bottleneck, particularly for protein microarrays due to the sensitivity of this technology to weak artifact signal. In order to automate the extraction and curation of data from protein microarrays, we describe an algorithm called Crossword that logically combines information from multiple approaches to fully automate microarray segmentation. Automated artifact removal is also accomplished by segregating structured pixels from the background noise using iterative clustering and pixel connectivity. Correlation of the location of structured pixels across image channels is used to identify and remove artifact pixels from the image prior to data extraction. This component improves the accuracy of data sets while reducing the requirement for time-consuming visual inspection of the data. Crossword enables a fully automated protocol that is robust to significant spatial and intensity aberrations. Overall, the average amount of user intervention is reduced by an order of magnitude and the data quality is increased through artifact removal and reduced user variability. The increase in throughput should aid the further implementation of microarray technologies in clinical studies. PMID:24417579
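
    The cross-channel artifact-removal idea can be sketched as follows. This is not the Crossword algorithm itself, only a minimal illustration of the underlying logic: structured (above-background, connected) pixels that co-occur in two channels are treated as artifacts; the threshold and minimum clump size are arbitrary choices.

        import numpy as np
        from scipy import ndimage

        def remove_cross_channel_artifacts(ch1, ch2, k=3.0, min_size=4):
            """Mask pixels that are structured in *both* channels. True signal is
            assumed channel-specific; co-occurring structure is an artifact."""
            masks = []
            for ch in (ch1, ch2):
                structured = ch > ch.mean() + k * ch.std()
                lab, n = ndimage.label(structured)            # connected components
                sizes = ndimage.sum(structured, lab, range(1, n + 1))
                keep = np.isin(lab, np.nonzero(sizes >= min_size)[0] + 1)
                masks.append(keep)
            artifact = masks[0] & masks[1]
            cleaned = ch1.copy()
            cleaned[artifact] = ch1[~artifact].mean()         # background estimate
            return cleaned, artifact

        rng = np.random.default_rng(1)
        a = rng.normal(0, 1, (64, 64))
        b = rng.normal(0, 1, (64, 64))
        a[10:14, 10:14] += 8; b[10:14, 10:14] += 8            # shared artifact
        cleaned, artifact = remove_cross_channel_artifacts(a, b)
        print(int(artifact.sum()), "artifact pixels removed")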

  12. Cost Effective Prototyping

    NASA Technical Reports Server (NTRS)

    Wickman, Jerry L.; Kundu, Nikhil K.

    1996-01-01

    This laboratory exercise develops a cost-effective approach to prototyping. The exercise has the potential to link part design, CAD, mold development, quality control, metrology, mold flow, materials testing, fixture design, automation, limited parts production, and other issues related to plastics manufacturing.

  13. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries

    PubMed Central

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303

  14. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to the high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality for reasons related to plant location, such as inadequate worker skills or motivation. Hence, the automation strategy should be formulated on the basis of an analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility, in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  15. Intelligent robot trends for factory automation

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    1997-09-01

    An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent economic and technical trends. The robotics industry now has a billion-dollar market in the U.S. and is growing. Feasibility studies are presented which also show unaudited healthy rates of return for a variety of robotic applications. Technically, the machines are faster, cheaper, more repeatable, more reliable and safer. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision and motion controls. New areas of usage are emerging for service robots, remote manipulators and automated guided vehicles. However, the road from inspiration to successful application is still long and difficult, often taking decades to achieve a new product. More cooperation between government, industry and universities is needed to speed the development of intelligent robots that will benefit both industry and society.

  16. Quality Control of Laser-Beam-Melted Parts by a Correlation Between Their Mechanical Properties and a Three-Dimensional Surface Analysis

    NASA Astrophysics Data System (ADS)

    Grimm, T.; Wiora, G.; Witt, G.

    2017-03-01

    Good correlations were found between three-dimensional surface analyses of laser-beam-melted parts of the nickel alloy HX and their mechanical properties. The surface analyses were performed with a confocal microscope, which offers a more comprehensive surface data basis than conventional two-dimensional tactile profilometry. This approach yields a wide range of three-dimensional surface parameters, each of which was evaluated with respect to its feasibility for quality control in additive manufacturing. Because the surface analysis was automated using the confocal microscope and an industrial six-axis robot, the method represents an innovative approach to quality control in additive manufacturing.
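
    A minimal sketch of the kind of correlation analysis the study performs, assuming one areal surface parameter and one mechanical property per specimen; the numbers are synthetic and the parameter choice (Sa) is an assumption, not the paper's reported result.

        import numpy as np
        from scipy import stats

        # Illustrative data: areal roughness Sa (micrometres) and fatigue
        # strength (MPa) for ten laser-beam-melted specimens.
        sa = np.array([8.2, 9.1, 10.5, 11.0, 12.3, 13.1, 14.0, 15.2, 16.8, 18.0])
        fatigue = np.array([410, 402, 395, 390, 378, 371, 360, 352, 340, 331])

        r, p = stats.pearsonr(sa, fatigue)
        slope, intercept, *_ = stats.linregress(sa, fatigue)
        print(f"r = {r:.2f} (p = {p:.1e}); fatigue ~ {slope:.1f} * Sa + {intercept:.0f}")
        # A strong correlation would let the automated surface scan serve as a
        # non-destructive proxy for mechanical quality, as the study proposes.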

  17. Manual versus Automated Carotid Artery Plaque Component Segmentation in High and Lower Quality 3.0 Tesla MRI Scans

    PubMed Central

    Smits, Loek P.; van Wijk, Diederik F.; Duivenvoorden, Raphael; Xu, Dongxiang; Yuan, Chun; Stroes, Erik S.; Nederveen, Aart J.

    2016-01-01

    Purpose To study the interscan reproducibility of manual versus automated segmentation of carotid artery plaque components, and the agreement between both methods, in high and lower quality MRI scans. Methods 24 patients with 30–70% carotid artery stenosis were scheduled for 3T carotid MRI, followed by a rescan within 1 month. A multicontrast protocol (T1w, T2w, PDw and TOF sequences) was used. After co-registration and delineation of the lumen and outer wall, segmentation of plaque components (lipid-rich necrotic cores (LRNC) and calcifications) was performed both manually and automatically. Scan quality was assessed using a visual quality scale. Results Agreement between the manual and automated segmentation methods for the detection of LRNC (Cohen's kappa (κ) = 0.04) and calcification (κ = 0.41) was poor. In the high-quality scans (visual quality score ≥ 3), the agreement between manual and automated segmentation increased to κ = 0.55 and κ = 0.58 for the detection of LRNC and calcifications larger than 1 mm², respectively. Both manual and automated analysis showed good interscan reproducibility for the quantification of LRNC (intraclass correlation coefficient (ICC) of 0.94 and 0.80, respectively) and calcified plaque area (ICC of 0.95 and 0.77, respectively). Conclusion Agreement between manual and automated segmentation of LRNC and calcifications was poor, despite the good interscan reproducibility of both methods. The agreement between both methods increased to moderate in high-quality scans. These findings indicate that image quality is a critical determinant of the performance of both manual and automated segmentation of carotid artery plaque components. PMID:27930665
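
    The two statistics reported above, Cohen's kappa for method agreement and the intraclass correlation coefficient for interscan reproducibility, can be computed as sketched below. The detection labels and plaque areas are synthetic, and the ICC(2,1) form is one common choice, not necessarily the paper's.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        def icc_2_1(x):
            """Two-way random, single-measure ICC(2,1); x is subjects x raters."""
            n, k = x.shape
            gm = x.mean()
            ssr = k * ((x.mean(axis=1) - gm) ** 2).sum()      # between subjects
            ssc = n * ((x.mean(axis=0) - gm) ** 2).sum()      # between scans
            sse = ((x - gm) ** 2).sum() - ssr - ssc
            msr, msc = ssr / (n - 1), ssc / (k - 1)
            mse = sse / ((n - 1) * (k - 1))
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Per-plaque detection (1 = detected) by the manual and automated methods.
        manual = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
        auto =   [1, 0, 0, 1, 0, 1, 1, 0, 0, 1]
        print("kappa:", round(cohen_kappa_score(manual, auto), 2))

        # Interscan reproducibility of plaque area (mm^2): scan vs. rescan.
        areas = np.array([[12.1, 11.8], [5.3, 5.9], [20.4, 19.7], [8.8, 9.4], [15.0, 14.2]])
        print("ICC(2,1):", round(icc_2_1(areas), 2))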

  18. Neuropsychologic assessment of a population-based sample of Gulf War veterans.

    PubMed

    Wallin, Mitchell T; Wilken, Jeffrey; Alfaro, Mercedes H; Rogers, Catherine; Mahan, Clare; Chapman, Julie C; Fratto, Timothy; Sullivan, Cynthia; Kang, Han; Kane, Robert

    2009-09-01

    The objective of this project was to compare neuropsychologic performance and quality of life in a population-based sample of deployed Gulf War (GW) veterans with and without multisymptom complaints. The study participants were obtained from the 30,000-member population-based National Health Survey of GW-era veterans conducted in 1995. Cases (N=25) were deployed to the 1990-1991 GW and met Centers for Disease Control and Prevention criteria for multisymptom GW illness (GWI). Controls (N=16) were deployed to the 1990-1991 GW but did not meet Centers for Disease Control and Prevention criteria for GWI. There were no significant differences in composite scores on the traditional and computerized neuropsychologic battery (automated neuropsychologic assessment metrics) between GW cases and controls using bivariate techniques. Multiple linear regression analyses controlling for demographic and clinical variables revealed that composite automated neuropsychologic assessment metrics scores were associated with age (b=-7.8; P=0.084) and education (b=22.9; P=0.0012), but not GW case or control status (b=-63.9; P=0.22). Compared with controls, GW cases had significantly more impairment on the Personality Assessment Inventory and the Short Form-36. Compared with GW controls, GW cases meeting criteria for GWI had preserved cognitive function but significant psychiatric symptoms and lower quality of life.

  19. e-Measures: insight into the challenges and opportunities of automating publicly reported quality measures

    PubMed Central

    Garrido, Terhilda; Kumar, Sudheen; Lekas, John; Lindberg, Mark; Kadiyala, Dhanyaja; Whippy, Alan; Crawford, Barbara; Weissberg, Jed

    2014-01-01

    Using electronic health records (EHR) to automate publicly reported quality measures is receiving increasing attention and is one of the promises of EHR implementation. Kaiser Permanente has fully or partly automated six of the 13 Joint Commission measure sets. We describe our experience with automation and the resulting time savings: a reduction of approximately 50% in the abstractor time required for one measure set alone (the surgical care improvement project). However, our experience illustrates the gap between the current and desired states of automated public quality reporting, which has important implications for measure developers, accrediting entities, EHR vendors, public/private payers, and government. PMID:23831833

  20. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
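
    As a sketch of the pattern described (an API brokering data transfer and job control between a scanner and an off-site HPC resource), the snippet below posts a job request to a placeholder REST endpoint. The endpoint, payload schema, and app identifier are hypothetical and do not reproduce the actual Agave API.

        import requests  # hypothetical endpoint and payload, not the Agave schema

        JOB = {
            "name": "grape-mri-analysis",
            "appId": "grape-pipeline-1.0",  # illustrative app identifier
            "inputs": {"scan": "storage://incoming/scan-001.nii.gz"},
            "notifications": [{"url": "https://example.org/callback", "event": "FINISHED"}],
        }

        resp = requests.post(
            "https://api.example.org/jobs",  # placeholder gateway endpoint
            json=JOB,
            headers={"Authorization": "Bearer <token>"},
            timeout=30,
        )
        resp.raise_for_status()
        print("job id:", resp.json().get("id"))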

  1. Emerging Technologies in the Workplace.

    ERIC Educational Resources Information Center

    Ammon, Adelaide; Robertson, Lyle

    1985-01-01

    Presents survey responses of 100 Michigan firms regarding the use of advanced technologies, employment growth projections in skilled occupations, and views about community college education. Examines the impact of the introduction of office automation, telecommunications, computer-aided design, laser, quality control, materials management, and…

  2. Automated estimation of image quality for coronary computed tomographic angiography using machine learning.

    PubMed

    Nakanishi, Rine; Sankaran, Sethuraman; Grady, Leo; Malpeso, Jenifer; Yousfi, Razik; Osawa, Kazuhiro; Ceponiene, Indre; Nazarat, Negin; Rahmani, Sina; Kissel, Kendall; Jayawardena, Eranthi; Dailing, Christopher; Zarins, Christopher; Koo, Bon-Kwon; Min, James K; Taylor, Charles A; Budoff, Matthew J

    2018-03-23

    Our goal was to evaluate the efficacy of a fully automated method for assessing the image quality (IQ) of coronary computed tomography angiography (CCTA). The machine learning method was trained using 75 CCTA studies by mapping features (noise, contrast, misregistration scores, and an un-interpretability index) to an IQ score based on manual ground truth data. The automated method was validated on a set of 50 CCTA studies and subsequently tested on a new set of 172 CCTA studies against visual IQ scores on a 5-point Likert scale. The area under the curve in the validation set was 0.96. In the 172 CCTA studies, our method yielded a Cohen's kappa statistic of 0.67 (p < 0.01) for the agreement between automated and visual IQ assessment. Of the studies graded as good to excellent (n = 163), fair (n = 6), and poor (n = 3) on visual IQ, 155, 5, and 2, respectively, received an automated IQ score > 50%. Fully automated assessment of the IQ of CCTA data sets by machine learning was reproducible and provided results similar to visual analysis within the limits of inter-operator variability. • The proposed method enables automated and reproducible image quality assessment. • Machine learning and visual assessments yielded comparable estimates of image quality. • Automated assessment potentially allows for more standardised image quality. • Image quality assessment enables standardisation of clinical trial results across different datasets.
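
    A minimal sketch of the feature-to-score mapping described above, assuming the four features named in the abstract and a logistic regression producing a 0-100 IQ score; the data, the model choice, and the labels are all synthetic and illustrative.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n = 75
        # Per-study features: noise, contrast, misregistration score,
        # un-interpretability index (column order is an assumption).
        X = rng.normal(size=(n, 4))
        # Synthetic ground truth: 1 = diagnostic image quality.
        y = (X[:, 1] - X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.5, n) > 0).astype(int)

        model = LogisticRegression().fit(X, y)
        iq_score = 100 * model.predict_proba(X)[:, 1]   # continuous 0-100 IQ score
        print("training AUC:", round(roc_auc_score(y, iq_score), 2))
        print("studies with IQ score > 50%:", int((iq_score > 50).sum()))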

  3. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  4. From Field Notes to Data Portal - A Scalable Data QA/QC Framework for Tower Networks: Progress and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.

    2017-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as AmeriFlux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying heavily on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes together with longer-term holistic evaluations. The framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites by a small group of QA/QC staff. Several components of this framework are open-source, including an R Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
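
    Point-based automated quality flagging of the kind mentioned above can be illustrated with three classic plausibility tests; the variable (air temperature), thresholds, and window length below are illustrative assumptions, not NEON's published limits.

        import numpy as np

        def point_flags(x, valid=(-40.0, 60.0), max_step=5.0, min_var=1e-3, window=12):
            """Range, step (spike), and persistence (dead-sensor) tests."""
            x = np.asarray(x, float)
            range_f = (x < valid[0]) | (x > valid[1])
            step_f = np.zeros_like(x, bool)
            step_f[1:] = np.abs(np.diff(x)) > max_step
            persist_f = np.zeros_like(x, bool)
            for i in range(window, len(x) + 1):
                if np.nanvar(x[i - window:i]) < min_var:   # flat line -> stuck sensor
                    persist_f[i - window:i] = True
            return {"range": range_f, "step": step_f, "persistence": persist_f}

        temps = [21.0, 21.2, 21.1, 35.0, 21.3] + [21.3] * 14 + [22.0]
        flags = point_flags(temps)
        print({name: int(f.sum()) for name, f in flags.items()})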

  5. Technology Transfer Opportunities: Automated Ground-Water Monitoring, A Proven Technology

    USGS Publications Warehouse

    Smith, Kirk P.; Granato, Gregory E.

    1998-01-01

    The U.S. Geological Survey (USGS) has developed and tested an automated ground-water monitoring system that measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automated ground-water monitoring systems can be used to monitor known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored; to serve as early warning systems monitoring ground-water quality near public water-supply wells; and for ground-water quality research.

  6. A Transparent and Transferable Framework for Tracking Quality Information in Large Datasets

    PubMed Central

    Smith, Derek E.; Metzger, Stefan; Taylor, Jeffrey R.

    2014-01-01

    The ability to evaluate the validity of data is essential to any investigation, and manual “eyes on” assessments of data quality have dominated in the past. Yet, as the size of collected data continues to increase, so does the effort required to assess their quality. This challenge is of particular concern for networks that automate their data collection, and has resulted in the automation of many quality assurance and quality control analyses. Unfortunately, the interpretation of the resulting data quality flags can become quite challenging with large data sets. We have developed a framework to summarize data quality information and facilitate interpretation by the user. Our framework consists of first compiling data quality information and then presenting it through two separate mechanisms: a quality report and a quality summary. The quality report presents the results of specific quality analyses as they relate to individual observations, while the quality summary takes a spatial or temporal aggregate of each quality analysis and provides a summary of the results. Included in the quality summary is a final quality flag, which further condenses data quality information to assess whether a data product is valid or not. This framework has the added flexibility to allow “eyes on” information on data quality to be incorporated for many data types. Furthermore, this framework can aid problem tracking and resolution, should sensor or system malfunctions arise. PMID:25379884
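
    The report/summary/final-flag structure can be sketched directly. The tests, the aggregation (fraction flagged), and the 10% validity threshold below are illustrative choices, not the framework's actual rules.

        import pandas as pd

        # Quality report: one row per observation, one column per automated test.
        report = pd.DataFrame({
            "range_flag":  [0, 0, 1, 0, 0, 0, 0, 1],
            "spike_flag":  [0, 0, 0, 0, 1, 0, 0, 0],
            "sensor_flag": [0, 0, 0, 0, 0, 0, 0, 0],
        })

        # Quality summary: temporal aggregate of each test (fraction flagged).
        summary = report.mean().rename("fraction_flagged")

        # Final quality flag: invalid if more than 10% of observations failed
        # any test (threshold is an illustrative choice).
        final_flag = "valid" if report.any(axis=1).mean() <= 0.10 else "invalid"
        print(summary)
        print("final flag:", final_flag)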

  7. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities.

    PubMed

    Oldham, Athenia L; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S; Duncan, Kathleen E

    2012-11-20

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources.

  8. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities

    PubMed Central

    2012-01-01

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  9. Practical Considerations for Optic Nerve Estimation in Telemedicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karnowski, Thomas Paul; Aykac, Deniz; Chaum, Edward

    The projected increase in diabetes in the United States and worldwide has created a need for broad-based, inexpensive screening for diabetic retinopathy (DR), an eye disease which can lead to vision impairment. A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion/anomaly detection is a low-cost way of achieving broad-based screening. In this work we report on the effect of quality estimation on an optic nerve (ON) detection method with a confidence metric. We report on an improvement of the fusion technique using a data set from an ophthalmologist's practice, then show the results of the method as a function of image quality on a set of images from an on-line telemedicine network collected in Spring 2009 and another broad-based screening program. We show that the fusion method, combined with quality estimation processing, can improve detection performance and also provide a method for utilizing a physician-in-the-loop for images that may exceed the capabilities of automated processing.

  10. Conventional versus automated measurement of blood pressure in primary care patients with systolic hypertension: randomised parallel design controlled trial

    PubMed Central

    Godwin, Marshall; Dawes, Martin; Kiss, Alexander; Tobe, Sheldon W; Grant, F Curry; Kaczorowski, Janusz

    2011-01-01

    Objective To compare the quality and accuracy of manual office blood pressure and automated office blood pressure using the awake ambulatory blood pressure as a gold standard. Design Multi-site cluster randomised controlled trial. Setting Primary care practices in five cities in eastern Canada. Participants 555 patients with systolic hypertension and no serious comorbidities under the care of 88 primary care physicians in 67 practices in the community. Interventions Practices were randomly allocated to either ongoing use of manual office blood pressure (control group) or automated office blood pressure (intervention group) using the BpTRU device. The last routine manual office blood pressure (mm Hg) was obtained from each patient’s medical record before enrolment. Office blood pressure readings were compared before and after enrolment in the intervention and control groups; all readings were also compared with the awake ambulatory blood pressure. Main outcome measure Difference in systolic blood pressure between awake ambulatory blood pressure minus automated office blood pressure and awake ambulatory blood pressure minus manual office blood pressure. Results Cluster randomisation allocated 31 practices (252 patients) to manual office blood pressure and 36 practices (303 patients) to automated office blood pressure measurement. The most recent routine manual office blood pressure (149.5 (SD 10.8)/81.4 (8.3)) was higher than automated office blood pressure (135.6 (17.3)/77.7 (10.9)) (P<0.001). In the control group, routine manual office blood pressure before enrolment (149.9 (10.7)/81.8 (8.5)) was reduced to 141.4 (14.6)/80.2 (9.5) after enrolment (P<0.001/P=0.01), but the reduction in the intervention group from manual office to automated office blood pressure was significantly greater (P<0.001/P=0.02). On the first study visit after enrolment, the estimated mean difference for the intervention group between the awake ambulatory systolic/diastolic blood pressure and automated office blood pressure (−2.3 (95% confidence interval −0.31 to −4.3)/−3.3 (−2.7 to −4.4)) was less (P=0.006/P=0.26) than the difference in the control group between the awake ambulatory blood pressure and the manual office blood pressure (−6.5 (−4.3 to −8.6)/−4.3 (−2.9 to −5.8)). Systolic/diastolic automated office blood pressure showed a stronger (P<0.001) within group correlation (r=0.34/r=0.56) with awake ambulatory blood pressure after enrolment compared with manual office blood pressure versus awake ambulatory blood pressure before enrolment (r=0.10/r= 0.40); the mean difference in r was 0.24 (0.12 to 0.36)/0.16 (0.07 to 0.25)). The between group correlation comparing diastolic automated office blood pressure and awake ambulatory blood pressure (r=0.56) was stronger (P<0.001) than that for manual office blood pressure versus awake ambulatory blood pressure (r=0.30); the mean difference in r was 0.26 (0.09 to 0.41). Digit preference with readings ending in zero was substantially reduced by use of automated office blood pressure. Conclusion In compliant, otherwise healthy, primary care patients with systolic hypertension, introduction of automated office blood pressure into routine primary care significantly reduced the white coat response compared with the ongoing use of manual office blood pressure measurement. The quality and accuracy of automated office blood pressure in relation to the awake ambulatory blood pressure was also significantly better when compared with manual office blood pressure. 
Trial registration Clinical trials NCT 00214053. PMID:21300709

  11. Production and quality assurance automation in the Goddard Space Flight Center Flight Dynamics Facility

    NASA Technical Reports Server (NTRS)

    Chapman, K. B.; Cox, C. M.; Thomas, C. W.; Cuevas, O. O.; Beckman, R. M.

    1994-01-01

    The Flight Dynamics Facility (FDF) at the NASA Goddard Space Flight Center (GSFC) generates numerous products for NASA-supported spacecraft, including the Tracking and Data Relay Satellites (TDRS's), the Hubble Space Telescope (HST), the Extreme Ultraviolet Explorer (EUVE), and the space shuttle. These products include orbit determination data, acquisition data, event scheduling data, and attitude data. In most cases, product generation involves repetitive execution of many programs. The increasing number of missions supported by the FDF has necessitated the use of automated systems to schedule, execute, and quality assure these products. This automation allows the delivery of accurate products in a timely and cost-efficient manner. To be effective, these systems must automate as many repetitive operations as possible and must be flexible enough to meet changing support requirements. The FDF Orbit Determination Task (ODT) has implemented several systems that automate product generation and quality assurance (QA). These systems include the Orbit Production Automation System (OPAS), the New Enhanced Operations Log (NEOLOG), and the Quality Assurance Automation Software (QA Tool). Implementation of these systems has resulted in a significant reduction in required manpower, elimination of shift work and most weekend support, and improved support quality, while incurring minimal development cost. This paper will present an overview of the concepts used and experiences gained from the implementation of these automation systems.

  12. A solvent-extraction module for cyclotron production of high-purity technetium-99m.

    PubMed

    Martini, Petra; Boschi, Alessandra; Cicoria, Gianfranco; Uccelli, Licia; Pasquali, Micòl; Duatti, Adriano; Pupillo, Gaia; Marengo, Mario; Loriggiola, Massimo; Esposito, Juan

    2016-12-01

    The design and fabrication of a fully automated, remotely controlled module for the extraction and purification of technetium-99m (Tc-99m), produced by proton bombardment of enriched Mo-100 metallic targets in a low-energy medical cyclotron, is described here. After dissolution of the irradiated solid target in hydrogen peroxide, Tc-99m was obtained in the chemical form of 99mTcO4-, in high radionuclidic and radiochemical purity, by solvent extraction with methyl ethyl ketone (MEK). The extraction process was accomplished inside a glass column-shaped vial especially designed to allow easy automation of the whole procedure. Recovery yields were always >90% of the loaded activity. The final pertechnetate saline solution, Na99mTcO4, purified using the automated module described here, is within the Pharmacopoeia quality control parameters and is therefore a valid alternative to generator-produced 99mTc. The resulting automated module is cost-effective and easily replicable for in-house production of high-purity Tc-99m by cyclotrons. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Composite-Metal-Matrix Arc-Spray Process

    NASA Technical Reports Server (NTRS)

    Westfall, Leonard J.

    1987-01-01

    Arc-spray "monotape" process automated, low in cost, and produces at high rate. Ideal for development of new metal-matrix composites. "Monotape" reproducible and of high quality. Process carried out in controlled gas environment with programmable matrix-deposition rates, resulting in significant cost savings.

  14. Translations on Environmental Quality, Number 148

    DTIC Science & Technology

    1977-10-03

    Article 9. Designs for gas filtering installations must include the proper control and automation facilities as per articles 16 and 17. Article 10... mandatorily equipped with: 1. Locking armature at the gas entrance and exit with manual or remote control flanges for the installation of end-caps in... instruments shall be mounted on the control panel of the gas filtering system or the control panel for the technological process. Article 17. (1) The gas

  15. Analysis of the Parameters Required for Performance Monitoring and Assessment of Military Communications Systems by Military Technical Controller

    DTIC Science & Technology

    1975-12-01

    APPENDIX A: BASIC CONCEPT OF MILITARY TECHNICAL CONTROL... APPENDIX E: TEST EQUIPMENT REQUIRED FOR MEASUREMENT OF PARAMETERS... Control (SATEC) Automatic Facilities Report; Army Automated Quality Monitoring Reporting System (AQMPS); Army Automated Technical Control-Semi (ATC-Semi)... technical control then becomes equipment status monitoring. All the major equipment in a system would have internal sensors with properly selected parameters

  16. Poster - Thur Eve - 76: A quality control to achieve planning consistency in arc radiotherapy of the prostate.

    PubMed

    Zeng, G; Murphy, J; Annis, S-L; Wu, X; Wang, Y; McGowan, T; Macpherson, M

    2012-07-01

    To report a quality control program in prostate radiation therapy at our center that includes a semi-automated planning process to generate high-quality plans and in-house software to track plan quality in subsequent clinical application. Arc planning in Eclipse v10.0 was performed for both intact prostate and post-prostatectomy treatments. The planning focuses on DVH requirements and on dose distributions being able to tolerate daily setup variations. A modified structure set is used to standardize the optimization, including a short rectum and bladder in the fields to effectively tighten dose to the target, and a rectum expansion with 1cm cropped from the PTV to block dose and shape posterior isodose lines. Structure, plan and optimization templates are used to streamline plan generation. DVH files are exported from Eclipse to quality-tracking software with a GUI written in Matlab that can report the dose-volume data either for an individual patient or over a patient population. For 100 intact prostate patients treated with 78Gy, rectal D50, D25, D15 and D5 are 30.1±6.2Gy, 50.6±7.9Gy, 65.9±6.0Gy and 76.6±1.4Gy respectively, well below the respective limits of 50Gy, 65Gy, 75Gy and 78Gy. For prostate bed treatments with a prescription of 66Gy, rectal D50 is 35.9±6.9Gy. In both sites, the PTV is covered by 95% of the prescription and hotspots are less than 5%. The semi-automated planning method can efficiently create high-quality plans, while the tracking software monitors feedback from clinical application. Together they form a comprehensive and robust quality control program in radiation therapy. © 2012 American Association of Physicists in Medicine.
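
    The dose-volume tracking described above reduces to interpolating each plan's cumulative DVH at fixed volume levels and comparing against limits. The sketch below uses the rectal limits quoted in the abstract for 78 Gy intact-prostate plans, but a synthetic DVH curve.

        import numpy as np

        # Rectal limits from the abstract: dose (Gy) at a given volume level (%).
        LIMITS = {50: 50.0, 25: 65.0, 15: 75.0, 5: 78.0}

        def dose_at_volume(dose_gy, volume_pct, v):
            """Interpolate a cumulative DVH (volume % vs. dose) to obtain D_v.
            volume_pct is assumed monotonically decreasing with dose."""
            return float(np.interp(v, volume_pct[::-1], dose_gy[::-1]))

        # Illustrative cumulative rectal DVH for one plan.
        dose = np.linspace(0, 80, 81)
        volume = 100 * np.exp(-dose / 25)   # synthetic, monotonically decreasing

        for v_pct, limit in LIMITS.items():
            d = dose_at_volume(dose, volume, v_pct)
            print(f"D{v_pct} = {d:5.1f} Gy (limit {limit} Gy)", "PASS" if d <= limit else "FAIL")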

  17. Automated Dissolution for Enteric-Coated Aspirin Tablets: A Case Study for Method Transfer to a RoboDis II.

    PubMed

    Ibrahim, Sarah A; Martini, Luigi

    2014-08-01

    Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increasing trend toward automation in dissolution testing, particularly for large pharmaceutical companies seeking to reduce variability and increase personnel efficiency. There is no official guideline for transferring a dissolution testing method from a manual or semi-automated to a fully automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, the RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of the dissolution method transfer from a manual dissolution tester. This study provides a systematic outline for the transfer of a manual dissolution testing protocol to an automated dissolution tester, and it further supports that automated dissolution testers compliant with regulatory requirements and comparable to manual dissolution testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.

  18. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures

    NASA Astrophysics Data System (ADS)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin

    2014-04-01

    Increasing economic, political and ecological pressure leads to a steadily rising share of modern processing and manufacturing processes for fibre-reinforced polymers in industrial batch production. Component weights below a level achievable with classic construction materials, which lead to a reduced energy and cost balance during product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down the substitution by composite materials. High-resolution fibre-optic sensors (FOS), owing to their small diameter, high measuring-point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for the online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor manufacturing processes, part tests and the component structure during the product life cycle, which allows quality control during production and the optimization of individual manufacturing processes.[1;2] Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes occurring in composite materials. This results in less scrap and in products of higher value and longer product life cycle, whereby costs, material and energy are saved. This work presents an automation approach for FOS integration in the braiding process. For that purpose, a braiding wheel was supplemented with an appliance for automatic sensor application, which was used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, fibre-optic sensors, Rayleigh, Luna Technologies

  19. Laboratory automation of high-quality and efficient ligand-binding assays for biotherapeutic drug development.

    PubMed

    Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean

    2013-07-01

    Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation in total walk-away and flexible modular modes. We share our sustained experience of vendor collaboration and teamwork in educating users and in promoting and tracking the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CROs in a regulated bioanalytical laboratory environment.

  20. Robotic solid phase extraction and high performance liquid chromatographic analysis of ranitidine in serum or plasma.

    PubMed

    Lloyd, T L; Perschy, T B; Gooding, A E; Tomlinson, J J

    1992-01-01

    A fully automated assay for the analysis of ranitidine in serum and plasma, with and without an internal standard, was validated. It utilizes robotic solid phase extraction with on-line high performance liquid chromatographic (HPLC) analysis. The ruggedness of the assay was demonstrated over a three-year period. A Zymark Py Technology II robotic system was used for serial processing from initial aspiration of samples from original collection containers, to final direct injection onto the on-line HPLC system. Automated serial processing with on-line analysis provided uniform sample history and increased productivity by freeing the chemist to analyse data and perform other tasks. The solid phase extraction efficiency was 94% throughout the assay range of 10-250 ng/mL. The coefficients of variation for within- and between-day quality control samples ranged from 1 to 6% and 1 to 5%, respectively. Mean accuracy for between-day standards and quality control results ranged from 97 to 102% of the respective theoretical concentrations.
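
    The precision figures quoted above are coefficients of variation; one common way to compute within-day and between-day CVs for quality control samples is sketched below, with invented concentrations.

        import numpy as np

        def cv_percent(x):
            """Coefficient of variation (%), the precision metric for QC samples."""
            x = np.asarray(x, float)
            return 100.0 * x.std(ddof=1) / x.mean()

        # Illustrative QC results (ng/mL) for a 50 ng/mL control over three days.
        qc_by_day = {
            "day1": [49.1, 50.2, 48.8, 50.5],
            "day2": [51.0, 49.7, 50.3, 49.2],
            "day3": [48.9, 50.8, 49.5, 50.1],
        }
        within = [cv_percent(v) for v in qc_by_day.values()]
        between = cv_percent([np.mean(v) for v in qc_by_day.values()])
        print(f"within-day CV: {min(within):.1f}-{max(within):.1f}%")
        print(f"between-day CV of daily means: {between:.1f}%")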

  1. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    PubMed

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
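
    To make the pipeline concrete, the sketch below executes a toy quality indicator ("proportion of diabetes patients with an HbA1c test recorded") as SPARQL over a small RDF graph using rdflib. The namespace, class, and property names are invented for illustration and are not ArchMS's or CLIF's actual model.

        from rdflib import Graph, Namespace
        from rdflib.namespace import RDF

        EX = Namespace("http://example.org/ehr#")   # illustrative namespace
        g = Graph()
        for pid, diabetic, tested in [("p1", True, True), ("p2", True, False), ("p3", False, False)]:
            g.add((EX[pid], RDF.type, EX.Patient))
            if diabetic:
                g.add((EX[pid], EX.hasDiagnosis, EX.DiabetesMellitus))
            if tested:
                g.add((EX[pid], EX.hasLabTest, EX.HbA1c))

        # Indicator: numerator = diabetes patients with an HbA1c test recorded,
        # denominator = all diabetes patients.
        q = """
        PREFIX ex: <http://example.org/ehr#>
        SELECT (COUNT(DISTINCT ?t) AS ?num) (COUNT(DISTINCT ?p) AS ?den) WHERE {
          ?p a ex:Patient ; ex:hasDiagnosis ex:DiabetesMellitus .
          OPTIONAL { ?p ex:hasLabTest ex:HbA1c . BIND(?p AS ?t) }
        }
        """
        row = next(iter(g.query(q)))
        print(f"indicator: {int(row.num)}/{int(row.den)} diabetes patients tested")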

  2. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.

  3. Implementation of and experiences with new automation

    PubMed Central

    Mahmud, Ifte; Kim, David

    2000-01-01

    In an environment where cost, timeliness, and quality drive the business, it is essential to look for answers in technology, through which these challenges can be met. In the Novartis Pharmaceutical Quality Assurance Department, automation and robotics have become just the tools to meet these challenges. Although automation is a relatively new concept in our department, we have fully embraced it within just a few years. As our company went through a merger, there was a significant reduction in the workforce within the Quality Assurance Department through voluntary and involuntary separations. However, the workload remained constant or in some cases actually increased. So even with the reduction in laboratory personnel, we were challenged internally and from headquarters in Basle to improve productivity while maintaining integrity in quality testing. Benchmark studies indicated the Suffern site to be the manufacturing site of choice above other facilities. This is attributed to the Suffern facility employees' commitment to reduce cycle time, improve efficiency, and maintain a high level of regulatory compliance. One of the stronger contributing factors was automation technology in the laboratories, and this technology will continue to help the site's status in the future. The Automation Group was originally formed about 2 years ago to meet the demands of high quality assurance testing throughput and to bring our testing group up to standard with the industry. Automation began with only two people in the group and now we have three people who are the next generation of automation scientists. Even with such a small staff, we have made great strides in laboratory automation as we have worked extensively with each piece of equipment brought in. The implementation process of each project was often difficult because the second generation automation group came from the laboratory without much automation experience. However, with involvement from the users from the ‘get-go’, we were able to successfully bring in many automation technologies. Our first experience with automation was SFA/SDAS, then Zymark TPWII, followed by Zymark Multi-dose. The future of product testing lies in automation, and we shall continue to explore the possibilities of improving testing methodologies so that chemists will be less burdened with repetitive and mundane daily tasks and more focused on bringing quality into our products. PMID:18924695

  4. Implementation of and experiences with new automation.

    PubMed

    Mahmud, I; Kim, D

    2000-01-01

    In an environment where cost, timeliness, and quality drive the business, it is essential to look for answers in technology, through which these challenges can be met. In the Novartis Pharmaceutical Quality Assurance Department, automation and robotics have become just the tools to meet these challenges. Although automation is a relatively new concept in our department, we have fully embraced it within just a few years. As our company went through a merger, there was a significant reduction in the workforce within the Quality Assurance Department through voluntary and involuntary separations. However, the workload remained constant or in some cases actually increased. So even with the reduction in laboratory personnel, we were challenged internally and from headquarters in Basle to improve productivity while maintaining integrity in quality testing. Benchmark studies indicated the Suffern site to be the manufacturing site of choice above other facilities. This is attributed to the Suffern facility employees' commitment to reduce cycle time, improve efficiency, and maintain a high level of regulatory compliance. One of the stronger contributing factors was automation technology in the laboratories, and this technology will continue to help the site's status in the future. The Automation Group was originally formed about 2 years ago to meet the demands of high quality assurance testing throughput and to bring our testing group up to standard with the industry. Automation began with only two people in the group and now we have three people who are the next generation of automation scientists. Even with such a small staff, we have made great strides in laboratory automation as we have worked extensively with each piece of equipment brought in. The implementation process of each project was often difficult because the second generation automation group came from the laboratory without much automation experience. However, with involvement from the users from the 'get-go', we were able to successfully bring in many automation technologies. Our first experience with automation was SFA/SDAS, then Zymark TPWII, followed by Zymark Multi-dose. The future of product testing lies in automation, and we shall continue to explore the possibilities of improving testing methodologies so that chemists will be less burdened with repetitive and mundane daily tasks and more focused on bringing quality into our products.

  5. Evaluation of Various Radar Data Quality Control Algorithms Based on Accumulated Radar Rainfall Statistics

    NASA Technical Reports Server (NTRS)

    Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still cannot completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions and reductions in rainfall accumulation, from spurious echo contamination and true precipitation removal, respectively. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine whether a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.
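
    The evaluation criterion described, removing spurious echo without removing true precipitation, can be expressed as two accumulation error terms once a manually quality-controlled reference is available. The sketch below is a simplified illustration with synthetic grids, not the TRMM GV procedure.

        import numpy as np

        def qc_accumulation_stats(raw, qc, reference):
            """Compare a QC'd accumulation grid against a manual reference:
            'missed' = spurious rainfall left in; 'lost' = true rainfall removed."""
            spurious = raw - reference          # rainfall the reference deems spurious
            removed = raw - qc                  # rainfall the algorithm removed
            missed = np.clip(spurious - removed, 0, None).sum()
            lost = np.clip(removed - spurious, 0, None).sum()
            return {"missed_spurious_mm": float(missed), "lost_precip_mm": float(lost)}

        rng = np.random.default_rng(3)
        raw = rng.gamma(2.0, 5.0, (50, 50))
        reference = raw * (rng.random((50, 50)) > 0.10)   # 10% of pixels are spurious
        qc = raw * (rng.random((50, 50)) > 0.12)          # an imperfect automated QC
        print(qc_accumulation_stats(raw, qc, reference))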

  6. Effect of different alcohol levels on take-over performance in conditionally automated driving.

    PubMed

    Wiedemann, Katharina; Naujoks, Frederik; Wörle, Johanna; Kenntner-Mabiala, Ramona; Kaussner, Yvonne; Neukum, Alexandra

    2018-06-01

    Automated driving systems are being pushed into the consumer market with varying degrees of automation. Most often the driver's task will consist of being available as a fall-back level when the automation reaches its limits. These so-called take-over situations have attracted a great body of research focusing on various human factors aspects (e.g., sleepiness) that could undermine the safety of control transitions between automated and manual driving. However, a major source of accidents in manual driving, alcohol consumption, has been a non-issue so far, although a false understanding of the driver's responsibility (i.e., being available as a fall-back level) might promote driving under its influence. In this experiment, N = 36 drivers were exposed to different levels of blood alcohol concentration (BACs: placebo vs. 0.05% vs. 0.08%) in a high-fidelity driving simulator, and the effect on take-over time and quality was assessed. The results indicate that a 0.08% BAC increases the time needed to re-engage in the driving task and impairs several aspects of longitudinal and lateral vehicle control, whereas a 0.05% BAC was associated only with descriptive impairments in fewer parameters. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means that, with the BPM standard, a method for sharing process knowledge among laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  8. Assessment of the Current Level of Automation in the Manufacture of Fuel Cell Systems for Combined Heat and Power Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ulsh, M.; Wheeler, D.; Protopappas, P.

    The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, the potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.

  9. Best-Quality Vessel Identification Using Vessel Quality Measure in Multiple-Phase Coronary CT Angiography.

    PubMed

    Hadjiiski, Lubomir; Liu, Jordan; Chan, Heang-Ping; Zhou, Chuan; Wei, Jun; Chughtai, Aamer; Kuriakose, Jean; Agarwal, Prachi; Kazerooni, Ella

    2016-01-01

    The detection of stenotic plaques strongly depends on the quality of the coronary arterial tree imaged with coronary CT angiography (cCTA). However, it is time consuming for the radiologist to select the best-quality vessels from the multiple-phase cCTA for interpretation in clinical practice. We are developing an automated method for the selection of the best-quality vessels from coronary arterial trees in multiple-phase cCTA to facilitate radiologists' reading or computerized analysis. Our automated method consists of vessel segmentation, vessel registration, corresponding vessel branch matching, vessel quality measure (VQM) estimation, and automatic selection of the best branches based on the VQM. For every branch, the VQM was calculated as the average radial gradient. An observer preference study was conducted to visually compare the quality of the selected vessels. 167 corresponding branch pairs were evaluated by two radiologists. The agreement between the first radiologist and the automated selection was 76% with a kappa of 0.49. The agreement between the second radiologist and the automated selection was also 76% with a kappa of 0.45. The agreement between the two radiologists was 81% with a kappa of 0.57. The observer preference study demonstrated the feasibility of the proposed automated method for the selection of the best-quality vessels from multiple cCTA phases.
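
    The agreement figures quoted above follow the usual definition of Cohen's kappa, which can be reproduced from a contingency table of paired selections. A minimal Python sketch; the counts below are hypothetical, not the study's data:

        import numpy as np

        def cohens_kappa(table):
            """Cohen's kappa for an n x n contingency table of paired ratings."""
            t = np.asarray(table, dtype=float)
            n = t.sum()
            p_observed = np.trace(t) / n
            p_expected = (t.sum(axis=0) * t.sum(axis=1)).sum() / n ** 2
            return (p_observed - p_expected) / (1 - p_expected)

        # Hypothetical counts over 167 branch pairs:
        # rows = radiologist's choice of phase, cols = automated choice
        print(round(cohens_kappa([[70, 20], [20, 57]]), 2))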

  10. Automated data processing and radioassays.

    PubMed

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there are probably steric and cooperative influences on binding. An alternative, more flexible mathematical model, based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen, has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity is stressed. Published methods for automated data reduction of Scatchard plots for radioreceptor assays are limited by calculation of a single mean K value. The quality of the input data is generally the limiting factor in achieving good precision with automated as with manual data reduction. The major advantages of computerized curve fitting include: (1) handling large amounts of data rapidly and without computational error; (2) providing useful quality-control data; (3) indicating within-batch variance of the test results; (4) providing ongoing quality-control charts and between-assay variance.
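
    As an illustration of the recommended curvilinear approach, a third-order polynomial in the square root of concentration can be fitted to standard-curve points and inverted numerically for unknowns. A minimal Python sketch with hypothetical standards:

        import numpy as np

        # Hypothetical standards: concentration vs. normalized bound counts (B/B0)
        conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
        response = np.array([0.92, 0.85, 0.74, 0.55, 0.41, 0.28, 0.15])

        # Fit the response as a third-order polynomial in sqrt(concentration)
        coeffs = np.polyfit(np.sqrt(conc), response, deg=3)

        def unknown_conc(y):
            """Invert the fitted curve numerically for an unknown's response y."""
            grid = np.linspace(np.sqrt(conc.min()), np.sqrt(conc.max()), 10000)
            fitted = np.polyval(coeffs, grid)
            return grid[np.argmin(np.abs(fitted - y))] ** 2

        print(round(unknown_conc(0.60), 2))  # concentration giving a response of 0.60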

  11. Automated monitoring compared to standard care for the early detection of sepsis in critically ill patients.

    PubMed

    Warttig, Sheryl; Alderson, Phil; Evans, David Jw; Lewis, Sharon R; Kourbeti, Irene S; Smith, Andrew F

    2018-06-25

    Sepsis is a life-threatening condition that is usually diagnosed when a patient has a suspected or documented infection and meets two or more criteria for systemic inflammatory response syndrome (SIRS). The incidence of sepsis is higher among people admitted to critical care settings such as the intensive care unit (ICU) than among people in other settings. If left untreated, sepsis can quickly worsen; severe sepsis has a mortality rate of 40% or higher, depending on definition. Recognition of sepsis can be challenging as it usually requires patient data to be combined from multiple unconnected sources and interpreted correctly, which can be complex and time-consuming. Electronic systems that are designed to connect information sources together, and automatically collate, analyse, and continuously monitor the information, as well as alerting healthcare staff when pre-determined diagnostic thresholds are met, may offer benefits by facilitating earlier recognition of sepsis and faster initiation of treatment, such as antimicrobial therapy, fluid resuscitation, inotropes, and vasopressors if appropriate. However, there is the possibility that electronic, automated systems do not offer benefits, or even cause harm. This might happen if the systems are unable to correctly detect sepsis (meaning that treatment is not started when it should be, or is started when it should not be), or if healthcare staff do not respond to alerts quickly enough, or get 'alarm fatigue', especially if the alarms go off frequently or give too many false alarms. The objective was to evaluate whether automated systems for the early detection of sepsis can reduce the time to appropriate treatment (such as initiation of antibiotics, fluids, inotropes, and vasopressors) and improve clinical outcomes in critically ill patients in the ICU. We searched CENTRAL, MEDLINE, Embase, CINAHL, ISI Web of Science, and LILACS, clinicaltrials.gov, and the World Health Organization trials portal. We searched all databases from their date of inception to 18 September 2017, with no restriction on country or language of publication. We included randomized controlled trials (RCTs) that compared automated sepsis-monitoring systems to standard care (such as paper-based systems) in participants of any age admitted to intensive or critical care units for critical illness. We defined an automated system as any process capable of screening patient records or data (one or more systems) automatically at intervals for markers or characteristics that are indicative of sepsis. We defined critical illness as including, but not limited to, post-surgery, trauma, stroke, myocardial infarction, arrhythmia, burns, and hypovolaemic or haemorrhagic shock. We excluded non-randomized studies, quasi-randomized studies, and cross-over studies. We also excluded studies including people already diagnosed with sepsis. We used the standard methodological procedures expected by Cochrane. Our primary outcomes were: time to initiation of antimicrobial therapy; time to initiation of fluid resuscitation; and 30-day mortality. Secondary outcomes included: length of stay in ICU; failed detection of sepsis; and quality of life. We used GRADE to assess the quality of evidence for each outcome. We included three RCTs in this review. It was unclear whether the RCTs were three separate studies involving 1199 participants in total, or reports from the same study involving fewer participants. We decided to treat the studies separately, as we were unable to make contact with the study authors to clarify. All three RCTs are of very low study quality because of issues with unclear randomization methods, allocation concealment, and uncertainty of effect size. Some of the studies were reported as abstracts only and contained limited data, which prevented meaningful analysis and assessment of potential biases. The studies included participants who all received automated electronic monitoring during their hospital stay. Participants were randomized to an intervention group (automated alerts sent from the system) or to usual care (no automated alerts sent from the system). All three studies reported 'Time to initiation of antimicrobial therapy'. We were unable to pool the data, but the largest study, involving 680 participants, reported a median time to initiation of antimicrobial therapy of 5.6 hours (interquartile range (IQR) 2.3 to 19.7) in the intervention group (n = not stated) and 7.8 hours (IQR 2.5 to 33.1) in the control group (n = not stated). No studies reported 'Time to initiation of fluid resuscitation' or the adverse event 'Mortality at 30 days'. However, very low-quality evidence was available where mortality was reported at other time points. One study involving 77 participants reported 14-day mortality of 20% in the intervention group and 21% in the control group (numerator and denominator not stated). One study involving 442 participants reported that mortality at 28 days or discharge was 14% in the intervention group and 10% in the control group (numerator and denominator not reported). Sample sizes were not reported adequately for these outcomes, so we could not estimate confidence intervals. Very low-quality evidence from one study involving 442 participants reported 'Length of stay in ICU'. Median length of stay was 3.0 days in the intervention group (IQR 2.0 to 5.0) and 3.0 days (IQR 2.0 to 4.0) in the control group. Very low-quality evidence from one study involving at least 442 participants reported the adverse effect 'Failed detection of sepsis'. Data were only reported for failed detection of sepsis in two participants, and it was not clear which group(s) this outcome occurred in. No studies reported 'Quality of life'. It is unclear what effect automated systems for monitoring sepsis have on any of the outcomes included in this review. Very low-quality evidence is only available on automated alerts, which are only one component of automated monitoring systems. It is uncertain whether such systems can replace regular, careful review of the patient's condition by experienced healthcare staff.
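
    The screening rule such systems automate is, at its core, a threshold check over collated patient data. The Python sketch below illustrates the SIRS-based definition given above using the commonly published criteria; the field names and data layout are hypothetical, not taken from any of the included trials:

        def sirs_count(obs):
            """Count SIRS criteria met, given a dict of a patient's latest values."""
            return sum([
                obs["temp_c"] > 38.0 or obs["temp_c"] < 36.0,
                obs["heart_rate"] > 90,
                obs["resp_rate"] > 20 or obs.get("paco2_mmhg", 40.0) < 32.0,
                obs["wbc_per_mm3"] > 12000 or obs["wbc_per_mm3"] < 4000
                or obs.get("band_fraction", 0.0) > 0.10,
            ])

        def sepsis_alert(obs, infection_suspected):
            # Alert when two or more SIRS criteria coincide with suspected infection
            return infection_suspected and sirs_count(obs) >= 2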

  12. Automated X-ray quality control of catalytic converters

    NASA Astrophysics Data System (ADS)

    Shashishekhar, N.; Veselitza, D.

    2017-02-01

    Catalytic converters are devices attached to the exhaust system of automobile or other engines to eliminate or substantially reduce polluting emissions. They consist of coated substrates enclosed in a stainless steel housing. The substrate is typically made of ceramic honeycombs; however, stainless steel foil honeycombs are also used. The coating is usually a slurry of alumina, silica, rare earth oxides and platinum group metals. The slurry, also known as the washcoat, is applied to the substrate in two doses, one on each end of the substrate; in some cases multiple layers of coating are applied. X-ray imaging is used to inspect the applied coating depth on a substrate to confirm compliance with quality requirements. Automated image analysis techniques are employed to measure the coating depth from the X-ray image. Coating depth is assessed by analysis of attenuation line profiles in the image. Edge detection algorithms with noise reduction and outlier rejection are used to calculate the coating depth at a specified point along an attenuation line profile. Quality control of the product is accomplished using several attenuation line profile regions for coating depth measurements, with individual pass or fail criteria specified for each region.
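
    A minimal Python sketch of the profile-based measurement, assuming a 1-D attenuation line profile sampled at a known pixel pitch and taking the coating edge at the steepest smoothed gradient (the outlier-rejection step is omitted for brevity):

        import numpy as np

        def coating_depth_mm(profile, pixel_mm, smooth=5):
            """Locate the coating edge along a 1-D attenuation line profile.

            A moving average suppresses noise; the position of the steepest
            gradient is taken as the coating boundary.
            """
            kernel = np.ones(smooth) / smooth
            smoothed = np.convolve(profile, kernel, mode="same")
            gradient = np.gradient(smoothed)
            edge = int(np.argmax(np.abs(gradient)))
            return edge * pixel_mm  # depth measured from the start of the profile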

  13. Digital control and data acquisition for high-value GTA welding

    NASA Astrophysics Data System (ADS)

    George, T. G.; Franco-Ferreira, E. A.

    Electric power for the Cassini space probe will be provided by radioisotope thermoelectric generators (RTGs) thermally driven by General-Purpose Heat Source (GPHS) modules. Each GPHS module contains four 150-g pellets of Pu-238O2, and each of the four pellets is encapsulated within a thin-wall iridium-alloy shell. GTA girth welding of these capsules is performed at Los Alamos National Laboratory (LANL) on an automated, digitally controlled welding system. Baseline design considerations for system automation, and strategies employed to maximize process yield, improve process consistency, and generate required quality assurance information, are discussed. The design of the automated girth welding system was driven by a number of factors that called for precise parametric control and data acquisition. Foremost among these factors was the extraordinary value of the capsule components. In addition, DOE Order 5700.6B, which took effect on 23 Sep. 1986, required that all operations adhere to strict levels of process quality assurance. A detailed technical specification for the GPHS welding system was developed on the basis of a joint LANL/Westinghouse Savannah River Company (WSRC) design effort. After a competitive bidding process, Jetline Engineering, Inc., of Irvine, California, was selected as the system manufacturer. During the period over which four identical welding systems were fabricated, very close liaison was maintained between the LANL/WSRC technical representatives and the vendor. The level of rapport was outstanding, and the end result was the 1990 delivery of four systems that met or exceeded all specification requirements.

  14. Automation of irrigation systems to control irrigation applications and crop water use efficiency

    USDA-ARS?s Scientific Manuscript database

    Agricultural irrigation management to slow water withdrawals from non-replenishing quality water resources is a global endeavor and vital to sustaining irrigated agriculture and dependent rural economies. Research in site-specific irrigation management has shown that water use efficiency, and crop p...

  15. Implementing GermWatcher, an enterprise infection control application.

    PubMed

    Doherty, Joshua; Noirot, Laura A; Mayfield, Jennie; Ramiah, Sridhar; Huang, Christine; Dunagan, Wm Claiborne; Bailey, Thomas C

    2006-01-01

    Automated surveillance tools can provide significant advantages to infection control practitioners. When stored in a relational database, the data collected can also be used to support numerous research and quality improvement opportunities. A previously described electronic infection control surveillance system was remodeled to provide multi-hospital support, an XML based rule set, and interoperability with an enterprise terminology server. This paper describes the new architecture being used at hospitals across BJC HealthCare.

  16. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators, including overall equipment effectiveness (OEE), process robustness tools, and statistical process control. The second part details tools that help operators maintain process robustness and control by preventing deviations from target control charts. The MES was developed by Syngenta together with CIMO for automation.
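
    For reference, OEE is conventionally computed as the product of availability, performance and quality rates. A minimal Python sketch with hypothetical shift figures:

        def oee(run_h, planned_h, units_made, ideal_rate_per_h, good_units):
            availability = run_h / planned_h
            performance = units_made / (run_h * ideal_rate_per_h)
            quality = good_units / units_made
            return availability * performance * quality

        # Hypothetical shift: 7 h run of 8 h planned, 660 units against an ideal
        # rate of 100 units/h, 640 of them within specification -> about 80%
        print(f"{oee(7, 8, 660, 100, 640):.1%}")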

  17. Automated digital magnetofluidics

    NASA Astrophysics Data System (ADS)

    Schneider, J.; Garcia, A. A.; Marquez, M.

    2008-08-01

    Drops can be moved in complex patterns on superhydrophobic surfaces using a reconfigured computer-controlled x-y metrology stage with a high degree of accuracy, flexibility, and reconfigurability. The stage employs a DMC-4030 controller, which has a RISC-based, clock-multiplying processor with DSP functions, accepts encoder inputs up to 22 MHz, provides servo update rates as high as 32 kHz, and processes commands in as little as 40 milliseconds. A 6.35 mm diameter cylindrical NdFeB magnet is translated by the stage, causing water drops to move through the induced magnetization of coated iron microspheres that remain in the drop and are attracted to the rare earth magnet. Water drops are easily moved in complex patterns in automated digital magnetofluidics at an average speed of 2.8 cm/s over a superhydrophobic polyethylene surface created by solvent casting. With additional components, potential uses for this automated microfluidic system include characterization of superhydrophobic surfaces, water quality analysis, and medical diagnostics.

  18. Optimization of Robotic Spray Painting process Parameters using Taguchi Method

    NASA Astrophysics Data System (ADS)

    Chidhambara, K. V.; Latha Shankar, B.; Vijaykumar

    2018-02-01

    Automated spray painting is gaining interest in industry and research due to the extensive application of spray painting in the automobile industry. Automating the spray painting process has the advantages of improved quality, productivity, reduced labor, a clean environment and, particularly, cost effectiveness. This study investigates the performance characteristics of an industrial robot, the Fanuc 250ib, for an automated painting process using Taguchi's Design of Experiments technique, a statistical tool. The experiment is designed using Taguchi's L25 orthogonal array, considering three factors with five levels each. The objective of this work is to identify the major control parameters and to optimize them for improved quality of the paint coating, measured in terms of Dry Film Thickness (DFT), which also results in reduced rejection. Analysis of Variance (ANOVA) is then performed to determine the influence of individual factors on DFT. It is observed that shaping air and paint flow are the most influential parameters. A multiple regression model is formulated for estimating predicted values of DFT. A confirmation test is then conducted, and comparison results show that the error is within an acceptable level.
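
    A minimal Python sketch of the Taguchi analysis step, assuming replicated DFT measurements per run of the L25 array and the nominal-the-best signal-to-noise ratio (a plausible choice for a coating that targets a nominal thickness; the study does not state which S/N form was used):

        import numpy as np

        def sn_nominal_the_best(replicates):
            """Taguchi nominal-the-best signal-to-noise ratio in dB."""
            y = np.asarray(replicates, dtype=float)
            return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

        # Hypothetical DFT replicates (micrometres) for one run of the L25 array;
        # the setting with the highest S/N is the most robust parameter level
        print(round(sn_nominal_the_best([48.2, 50.1, 49.5]), 2))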

  19. Effects of Meteorological Data Quality on Snowpack Modeling

    NASA Astrophysics Data System (ADS)

    Havens, S.; Marks, D. G.; Robertson, M.; Hedrick, A. R.; Johnson, M.

    2017-12-01

    Detailed quality control of meteorological inputs is the most time-intensive component of running the distributed, physically-based iSnobal snow model, and the effect of input data quality on the model is unknown. The iSnobal model has been run operationally since WY2013, and is currently run in several basins in Idaho and California. The largest amount of user input during modeling is for the quality control of precipitation, temperature, relative humidity, solar radiation, wind speed and wind direction inputs. Precipitation inputs require detailed user input and are crucial to correctly modeling the snowpack mass. This research applies a range of quality control methods to the meteorological input, from raw input with minimal cleaning to complete user-applied quality control. The meteorological input cleaning generally falls into two categories. The first is global minimum/maximum violations and missing values, which can be corrected and/or interpolated with automated processing. The second category is quality control for inputs that are not globally erroneous, yet are still unreasonable and generally indicate malfunctioning measurement equipment, such as temperature or relative humidity that remains constant or does not correlate with daily trends observed at nearby stations. This research will determine how sensitive model outputs are to different levels of quality control and guide future operational applications.
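
    The two categories of cleaning described above lend themselves to simple automated checks. A minimal Python sketch, assuming an hourly series as a NumPy array with NaN for missing values; the thresholds are illustrative:

        import numpy as np

        def qc_met_series(values, lo, hi, max_flat=6):
            """Flag globally erroneous values and 'stuck sensor' runs in a series."""
            v = np.asarray(values, dtype=float)
            bad = (v < lo) | (v > hi) | np.isnan(v)  # global min/max and missing

            run = 0
            for i in range(1, len(v)):
                run = run + 1 if v[i] == v[i - 1] else 0
                if run >= max_flat:              # constant for too long: likely a
                    bad[i - run:i + 1] = True    # malfunctioning sensor
            return bad

        # Flagged values could then be interpolated or filled from nearby stations.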

  20. Implementation and Test of the Automatic Flight Dynamics Operations for Geostationary Satellite Mission

    NASA Astrophysics Data System (ADS)

    Park, Sangwook; Lee, Young-Ran; Hwang, Yoola; Javier Santiago Noguero Galilea

    2009-12-01

    This paper describes the Flight Dynamics Automation (FDA) system for the COMS Flight Dynamics System (FDS) and its test results in terms of the performance of the automation jobs. FDA controls flight dynamics functions such as orbit determination, orbit prediction, event prediction, and fuel accounting. The designed FDA is independent of the specific characteristics defined by the spacecraft manufacturer or a particular satellite mission. Therefore, FDA can easily link its autonomous job control functions to any satellite mission control system with some interface modification. Adding an autonomous system to the flight dynamics system reduces the operator's tedious and repetitive jobs and increases the usability and reliability of the system. FDA is therefore used to improve the overall quality of the whole mission control system. The FDA was applied to the real flight dynamics system of a geostationary satellite, COMS, and an experimental test was performed. The experimental results show the stability and reliability of the mission control operations through automatic job control.

  1. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm.

    PubMed

    Pizarro, Ricardo A; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A; Goldman, Aaron L; Xiao, Ena; Luo, Qian; Berman, Karen F; Callicott, Joseph H; Weinberger, Daniel R; Mattay, Venkata S

    2016-01-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI, yielding irreproducible results from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm to the quality assessment of structural brain images, using global and region-of-interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.
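
    A minimal Python sketch of the classification step, substituting scikit-learn's SVC for the authors' in-house pipeline and random placeholders for the quality features and labels:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Placeholder feature matrix and labels standing in for the in-house
        # global and ROI quality features and investigator-assigned labels
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1457, 12))
        y = rng.integers(0, 2, size=1457)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")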

  2. Standard operating procedure changed pre-hospital critical care anaesthesiologists’ behaviour: a quality control study

    PubMed Central

    2013-01-01

    Introduction The ability of standard operating procedures to improve pre-hospital critical care by changing pre-hospital physician behaviour is uncertain. We report data from a prospective quality control study of the effect on pre-hospital critical care anaesthesiologists’ behaviour of implementing a standard operating procedure for pre-hospital controlled ventilation. Materials and methods Anaesthesiologists from eight pre-hospital critical care teams in the Central Denmark Region prospectively registered pre-hospital advanced airway-management data according to the Utstein-style template. We collected pre-intervention data from February 1st 2011 to January 31st 2012, implemented the standard operating procedure on February 1st 2012 and collected post intervention data from February 1st 2012 until October 31st 2012. We included transported patients of all ages in need of controlled ventilation treated with pre-hospital endotracheal intubation or the insertion of a supraglottic airways device. The objective was to evaluate whether the development and implementation of a standard operating procedure for controlled ventilation during transport could change pre-hospital critical care anaesthesiologists’ behaviour and thereby increase the use of automated ventilators in these patients. Results The implementation of a standard operating procedure increased the overall prevalence of automated ventilator use in transported patients in need of controlled ventilation from 0.40 (0.34-0.47) to 0.74 (0.69-0.80) with a prevalence ratio of 1.85 (1.57-2.19) (p = 0.00). The prevalence of automated ventilator use in transported traumatic brain injury patients in need of controlled ventilation increased from 0.44 (0.26-0.62) to 0.85 (0.62-0.97) with a prevalence ratio of 1.94 (1.26-3.0) (p = 0.0039). The prevalence of automated ventilator use in patients transported after return of spontaneous circulation following pre-hospital cardiac arrest increased from 0.39 (0.26-0.48) to 0.69 (0.58-0.78) with a prevalence ratio of 1.79 (1.36-2.35) (p = 0.00). Conclusion We have shown that the implementation of a standard operating procedure for pre-hospital controlled ventilation can significantly change pre-hospital critical care anaesthesiologists’ behaviour. PMID:24308781

  3. An Automated HIV-1 Env-Pseudotyped Virus Production for Global HIV Vaccine Trials

    PubMed Central

    Fuss, Martina; Mazzotta, Angela S.; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A.; Montefiori, David C.; von Briesen, Hagen; Zimmermann, Heiko; Meyerhans, Andreas

    2012-01-01

    Background Infections with HIV still represent a major human health problem worldwide and a vaccine is the only long-term option to fight efficiently against this virus. Standardized assessments of HIV-specific immune responses in vaccine trials are essential for prioritizing vaccine candidates in preclinical and clinical stages of development. With respect to neutralizing antibodies, assays with HIV-1 Env-pseudotyped viruses are a high priority. To cover the increasing demands of HIV pseudoviruses, a complete cell culture and transfection automation system has been developed. Methodology/Principal Findings The automation system for HIV pseudovirus production comprises a modified Tecan-based Cellerity system. It covers an area of 5×3 meters and includes a robot platform, a cell counting machine, a CO2 incubator for cell cultivation and a media refrigerator. The processes for cell handling, transfection and pseudovirus production have been implemented according to manual standard operating procedures and are controlled and scheduled autonomously by the system. The system is housed in a biosafety level II cabinet that guarantees protection of personnel, environment and the product. HIV pseudovirus stocks in a scale from 140 ml to 1000 ml have been produced on the automated system. Parallel manual production of HIV pseudoviruses and comparisons (bridging assays) confirmed that the automated produced pseudoviruses were of equivalent quality as those produced manually. In addition, the automated method was fully validated according to Good Clinical Laboratory Practice (GCLP) guidelines, including the validation parameters accuracy, precision, robustness and specificity. Conclusions An automated HIV pseudovirus production system has been successfully established. It allows the high quality production of HIV pseudoviruses under GCLP conditions. In its present form, the installed module enables the production of 1000 ml of virus-containing cell culture supernatant per week. Thus, this novel automation facilitates standardized large-scale productions of HIV pseudoviruses for ongoing and upcoming HIV vaccine trials. PMID:23300558

  4. High‐throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh‐Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang‐Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A.E.M.; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; WM Martens, John; HM van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia‐Closas, Montserrat

    2016-01-01

    Automated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer-assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37-0.87) and study (kappa range = 0.39-0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p-value for comparison = 0.005), and among cores with higher total nuclei counted by the machine (4,000-4,500 cells: kappa = 0.78) than those with lower counts (50-500 cells: kappa = 0.41; p-value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre- and post-analytical quality control procedures are necessary in order to ensure satisfactory performance. PMID:27499923

  5. Informatics applied to cytology

    PubMed Central

    Hornish, Maryanne; Goulart, Robert A.

    2008-01-01

    Automation and emerging information technologies are being adopted by cytology laboratories to augment Pap test screening and improve diagnostic accuracy. As a result, informatics, the application of computers and information systems to information management, has become essential for the successful operation of the cytopathology laboratory. This review describes how laboratory information management systems can be used to achieve an automated and seamless workflow process. The utilization of software, electronic databases and spreadsheets to perform necessary quality control measures are discussed, as well as a Lean production system and Six Sigma approach, to reduce errors in the cytopathology laboratory. PMID:19495402

  6. Pilot vehicle interface on the advanced fighter technology integration F-16

    NASA Technical Reports Server (NTRS)

    Dana, W. H.; Smith, W. B.; Howard, J. D.

    1986-01-01

    This paper focuses on the work load aspects of the pilot vehicle interface in regard to the new technologies tested during AMAS Phase II. Subjects discussed in this paper include: a wide field-of-view head-up display; automated maneuvering attack system/sensor tracker system; master modes that configure flight controls and mission avionics; a modified helmet mounted sight; improved multifunction display capability; a voice interactive command system; ride qualities during automated weapon delivery; a color moving map; an advanced digital map display; and a g-induced loss-of-consciousness and spatial disorientation autorecovery system.

  7. Automated flight test management system

    NASA Technical Reports Server (NTRS)

    Hewett, M. D.; Tartt, D. M.; Agarwal, A.

    1991-01-01

    The Phase 1 development of an automated flight test management system (ATMS) as a component of a rapid prototyping flight research facility for artificial intelligence (AI) based flight concepts is discussed. The ATMS provides a flight engineer with a set of tools that assist in flight test planning, monitoring, and simulation. The system is also capable of controlling an aircraft during flight test by performing closed loop guidance functions, range management, and maneuver-quality monitoring. The ATMS is being used as a prototypical system to develop a flight research facility for AI based flight systems concepts at NASA Ames Dryden.

  8. Low-dose abdominal computed tomography for detection of urinary stone disease - Impact of additional spectral shaping of the X-ray beam on image quality and dose parameters.

    PubMed

    Dewes, Patricia; Frellesen, Claudia; Scholtz, Jan-Erik; Fischer, Sebastian; Vogl, Thomas J; Bauer, Ralf W; Schulz, Boris

    2016-06-01

    To evaluate a novel tin filter-based abdominal CT protocol for urolithiasis in terms of image quality and CT dose parameters. 130 consecutive patients with suspected urolithiasis underwent non-enhanced CT with three different protocols: 48 patients (group 1) were examined at tin-filtered 150 kV (150 kV Sn) on a third-generation dual-source CT, 33 patients were examined with automated kV selection (110-140 kV) based on the scout view on the same CT device (group 2), and 49 patients were examined on a second-generation dual-source CT (group 3) with automated kV selection (100-140 kV). Automated exposure control was active in all groups. Image quality was subjectively evaluated on a 5-point Likert scale by two radiologists, and interobserver agreement as well as signal-to-noise ratio (SNR) were calculated. Dose-length product (DLP) and volume CT dose index (CTDIvol) were compared. Image quality was rated in favour of the tin filter protocol with excellent interobserver agreement (ICC = 0.86-0.91), and the difference reached statistical significance (p < 0.001). SNR was significantly higher in groups 1 and 2 compared to the second-generation DSCT (p < 0.001). On third-generation dual-source CT, there was no significant difference in SNR between the 150 kV Sn and the automated kV selection protocol (p = 0.5). The DLP of group 1 was 23% and 21% lower (p < 0.002) in comparison to groups 2 and 3, respectively, as was the CTDIvol of group 1 compared to groups 2 (-36%) and 3 (-32%) (p < 0.001). Additional shaping of a 150 kV source spectrum by a tin filter substantially lowers patient exposure while improving image quality in non-enhanced abdominal computed tomography for urinary stone disease. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. [Automation and organization of technological process of urinalysis].

    PubMed

    Kolenkin, S M; Kishkun, A A; Kol'chenko, O L

    2000-12-01

    The results of introducing into practice a working model of industrial technology for laboratory studies, together with KONE Specific Supra and Miditron M devices, are shown using clinical urinalysis as an example. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages of the process, and creates a system for continuous improvement of the efficiency of investigations at the preanalytical, analytical, and postanalytical stages of the technological process of laboratory studies. As a result of introducing this technology into laboratory practice, violations of the quality criteria of clinical urinalysis decreased from 15 to 8% at the preanalytical stage and from 6 to 3% at the analytical stage. Automation of the analysis decreased reagent consumption 3-fold and improved productivity at the analytical stage 4-fold.

  10. Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Boudillet, O.; Mescam, J.-C.; Dalemagne, D.

    2008-08-01

    EADS Astrium Space Transportation, at its Les Mureaux premises, is responsible for the onboard SW of the French M51 nuclear deterrent missile. Over 1 million lines of code, mostly in Ada, were also developed there for the Automated Transfer Vehicle (ATV) onboard SW and the flight control SW of the ARIANE 5 launcher which put it into orbit. As part of the ATV SW, ASTRIUM ST has developed the first Category A SW ever qualified for a European space application. To ensure that all these embedded SW have been developed to the highest quality and reliability level, specific development tools have been designed to cover the steps of source code verification, automated validation testing, and complete target instruction coverage verification. Three such dedicated tools are presented here.

  11. Launch Control System Software Development System Automation Testing

    NASA Technical Reports Server (NTRS)

    Hwang, Andrew

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next-generation manned rocket currently in development. This system requires high-quality testing that will measure and test the capabilities of the system. For the past two years, the Exploration and Operations Division at Kennedy Space Center (KSC) has assigned a group including interns and full-time engineers to develop automated tests to save the project time and money. The team worked on automating the testing process for the SCCS GUI, which uses simulated data streamed from the testing servers to produce data, plots, statuses, etc. on the GUI. The software used to develop the automated tests included an automated testing framework and an automation library. The automated testing framework has a tabular-style syntax, which means each line of code must have the appropriate number of tabs for the line to function as intended. The header section contains either paths to custom resources or the names of libraries being used. The automation library contains functionality to automate anything that appears on a desired screen, using image recognition software to detect and control GUI components. The data section contains any data values created strictly for the current testing file. The body section holds the tests that are being run. The function section can include any number of functions that may be used by the current testing file or any other file that resources it. The resources and body sections are required for all test files; the data and function sections can be left empty if the data values and functions being used come from a resourced library or another file. To help equip the automation team with better tools, the Project Lead of the Automated Testing Team, Jason Kapusta, assigned the task of installing and training an optical character recognition (OCR) tool to Brandon Echols, a fellow intern, and me. The purpose of the OCR tool is to analyze an image and find the coordinates of any group of text. Issues that arose while installing the OCR tool included the absence of certain libraries needed to train the tool and an outdated software version. We eventually resolved the issues and successfully installed the OCR tool. Training the tool required many images in different fonts and sizes, but in the end the tool learned to accurately decipher the text in the images and their coordinates. The OCR tool produced a file that contained significant metadata for each section of text, but only the text and the coordinates of the text were required for our purpose. The team wrote a script to parse the information we wanted from the OCR file into a different file to be used by automation functions within the automated framework. Since a majority of the development and testing of the automated test cases for the GUI in question has been done using live simulated data on the workstations at the Launch Control Center (LCC), a large amount of progress has been made. As of this writing, about 60% of all automated testing has been implemented. Additionally, the OCR tool will make our automated tests more robust because its text recognition scales well to different text fonts and sizes. Soon the whole test system will be automated, freeing more full-time engineers to work on development projects.
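
    A minimal Python sketch of the parsing script described above; the line layout of the OCR output (x y width height text...) is hypothetical, since only the text and its coordinates were needed downstream:

        def parse_ocr_output(path):
            """Keep only the text and coordinates from the OCR tool's output.

            Assumed (hypothetical) line layout: x y width height text...
            """
            records = []
            with open(path) as fh:
                for line in fh:
                    fields = line.split()
                    if len(fields) < 5:
                        continue  # skip malformed lines
                    x, y = int(fields[0]), int(fields[1])
                    records.append((" ".join(fields[4:]), x, y))
            return records

        # The (text, x, y) records can then drive clicks and checks in the GUI tests.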

  12. Development of a novel automated cell isolation, expansion, and characterization platform.

    PubMed

    Franscini, Nicola; Wuertz, Karin; Patocchi-Tenzer, Isabel; Durner, Roland; Boos, Norbert; Graf-Hausner, Ursula

    2011-06-01

    Implementation of regenerative medicine in the clinical setting requires not only biological inventions, but also the development of reproducible and safe methods for cell isolation and expansion. As the currently used manual techniques do not fulfill these requirements, there is a clear need to develop an adequate robotic platform for automated, large-scale production of cells or cell-based products. Here, we demonstrate an automated liquid-handling cell-culture platform that can be used to isolate, expand, and characterize human primary cells (e.g., from intervertebral disc tissue) with results that are comparable to the manual procedure. Specifically, no differences could be observed for cell yield, viability, aggregation rate, growth rate, and phenotype. Importantly, all steps, from the enzymatic isolation of cells from the biopsy to the final quality control, can be performed completely by the automated system, thanks to novel tools that were incorporated into the platform. This automated cell-culture platform can therefore entirely replace manual processes in areas that require high throughput while maintaining stability and safety, such as clinical or industrial settings. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.

  13. Concept for estimating mitochondrial DNA haplogroups using a maximum likelihood approach (EMMA)

    PubMed Central

    Röck, Alexander W.; Dür, Arne; van Oven, Mannis; Parson, Walther

    2013-01-01

    The assignment of haplogroups to mitochondrial DNA haplotypes contributes substantial value for quality control, not only in forensic genetics but also in population and medical genetics. The availability of Phylotree, a widely accepted phylogenetic tree of human mitochondrial DNA lineages, led to the development of several (semi-)automated software solutions for haplogrouping. However, currently existing haplogrouping tools only make use of haplogroup-defining mutations, whereas private mutations (beyond the haplogroup level) can be additionally informative allowing for enhanced haplogroup assignment. This is especially relevant in the case of (partial) control region sequences, which are mainly used in forensics. The present study makes three major contributions toward a more reliable, semi-automated estimation of mitochondrial haplogroups. First, a quality-controlled database consisting of 14,990 full mtGenomes downloaded from GenBank was compiled. Together with Phylotree, these mtGenomes serve as a reference database for haplogroup estimates. Second, the concept of fluctuation rates, i.e. a maximum likelihood estimation of the stability of mutations based on 19,171 full control region haplotypes for which raw lane data is available, is presented. Finally, an algorithm for estimating the haplogroup of an mtDNA sequence based on the combined database of full mtGenomes and Phylotree, which also incorporates the empirically determined fluctuation rates, is brought forward. On the basis of examples from the literature and EMPOP, the algorithm is not only validated, but both the strength of this approach and its utility for quality control of mitochondrial haplotypes is also demonstrated. PMID:23948335

  14. Implementation of a state of the art automated system for the production of cloud/water vapor motion winds from geostationary satellites

    NASA Technical Reports Server (NTRS)

    Velden, Christopher

    1995-01-01

    The research objectives in this proposal were part of a continuing program at UW-CIMSS to develop and refine an automated geostationary satellite winds processing system which can be utilized in both research and operational environments. The majority of the originally proposed tasks were successfully accomplished, and in some cases the progress exceeded the original goals. Much of the research and development supported by this grant resulted in upgrades and modifications to the existing automated satellite winds tracking algorithm. These modifications were put to the test through case study demonstrations and numerical model impact studies. After being successfully demonstrated, the modifications and upgrades were implemented into the NESDIS algorithms in Washington, DC, and have become part of the operational support. A major focus of the research supported under this grant was the continued development of water vapor tracked winds from geostationary observations. The fully automated UW-CIMSS tracking algorithm has been tuned to provide complete upper-tropospheric coverage from this data source, with data set quality close to that of operational cloud motion winds. Multispectral water vapor observations were collected and processed from several different geostationary satellites. The tracking and quality control algorithms were tuned and refined based on ground-truth comparisons and case studies involving impact on numerical model analyses and forecasts. The results have shown that the water vapor motion winds are of good quality, complement the cloud motion wind data, and can have a positive impact on NWP at many meteorological scales.

  15. An online intervention for reducing depressive symptoms: secondary benefits for self-esteem, empowerment and quality of life.

    PubMed

    Crisp, Dimity; Griffiths, Kathleen; Mackinnon, Andrew; Bennett, Kylie; Christensen, Helen

    2014-04-30

    Internet-based interventions are increasingly recognized as effective for the treatment and prevention of depression; however, there is a paucity of research investigating potential secondary benefits. From a consumer perspective, improvements in indicators of wellbeing such as perceived quality of life may represent the most important outcomes for evaluating the effectiveness of an intervention. This study investigated the 'secondary' benefits for self-esteem, empowerment, quality of life and perceived social support of two 12-week online depression interventions when delivered alone and in combination. Participants comprised 298 adults displaying elevated psychological distress. Participants were randomised to receive: an Internet Support Group (ISG); an automated Internet psycho-educational training program for depression; a combination of these conditions; or a control website. Analyses were performed on an intent-to-treat basis. Following the automated training program, immediate improvements were shown in participants' self-esteem and empowerment relative to control participants. Improvements in perceived quality of life were reported 6 months after the completion of the intervention when combined with an ISG. These findings provide initial evidence for the effectiveness of this online intervention in improving individual wellbeing beyond the primary aim of the treatment. However, further research is required to investigate the mechanisms underlying improvement in these secondary outcomes. Crown Copyright © 2014. Published by Elsevier Ireland Ltd. All rights reserved.

  16. Comparability of automated human induced pluripotent stem cell culture: a pilot study.

    PubMed

    Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J

    2016-12-01

    Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in quality control, scalability, reproducibility and the economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend towards decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.

  17. A 10 cm Dual Frequency Doppler Weather Radar. Part I. The Radar System.

    DTIC Science & Technology

    1982-10-25

    ... Evaluation System (RAMCES)". The step attenuator required for this calibration can be programmed remotely, has low power and temperature coefficients, and ... Control and Evaluation System". The Quality Assurance/Fault Location Network makes use of fault location techniques at critical locations in the radar and ... quasi-continuous monitoring of radar performance. The Radar Monitor, Control and Evaluation System provides for automated system calibration and ...

  18. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets the protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
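
    A purge protocol of the kind Robowell follows typically pumps until successive readings of the field parameters agree within tolerances before a sample is logged. A minimal Python sketch of that stabilisation check; the tolerance values shown are illustrative, not the published protocol's:

        def readings_stable(history, tolerances, window=3):
            """True when the last `window` readings of every monitored parameter
            agree within that parameter's tolerance (absolute units)."""
            if len(history) < window:
                return False
            recent = history[-window:]
            return all(
                max(r[p] for r in recent) - min(r[p] for r in recent) <= tol
                for p, tol in tolerances.items()
            )

        # Illustrative tolerances only, not the published protocol's values
        tolerances = {"ph": 0.1, "spec_cond_uS": 5.0, "temp_c": 0.2, "do_mg_l": 0.3}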

  19. Fully automated, internally controlled quantification of hepatitis B Virus DNA by real-time PCR by use of the MagNA Pure LC and LightCycler instruments.

    PubMed

    Leb, Victoria; Stöcher, Markus; Valentine-Thon, Elizabeth; Hölzl, Gabriele; Kessler, Harald; Stekel, Herbert; Berg, Jörg

    2004-02-01

    We report on the development of a fully automated real-time PCR assay for the quantitative detection of hepatitis B virus (HBV) DNA in plasma with EDTA (EDTA plasma). The MagNA Pure LC instrument was used for automated DNA purification and automated preparation of PCR mixtures. Real-time PCR was performed on the LightCycler instrument. An internal amplification control was devised as a PCR competitor and was introduced into the assay at the stage of DNA purification to permit monitoring for sample adequacy. The detection limit of the assay was found to be 200 HBV DNA copies/ml, with a linear dynamic range of 8 orders of magnitude. When samples from the European Union Quality Control Concerted Action HBV Proficiency Panel 1999 were examined, the results were found to be in acceptable agreement with the HBV DNA concentrations of the panel members. In a clinical laboratory evaluation of 123 EDTA plasma samples, a significant correlation was found with the results obtained by the Roche HBV Monitor test on the Cobas Amplicor analyzer within the dynamic range of that system. In conclusion, the newly developed assay has a markedly reduced hands-on time, permits monitoring for sample adequacy, and is suitable for the quantitative detection of HBV DNA in plasma in a routine clinical laboratory.
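
    Quantification in real-time PCR conventionally rests on a linear relation between the threshold cycle and log10 of the input copy number. A minimal Python sketch with hypothetical calibrators:

        import numpy as np

        # Hypothetical calibrators: log10(HBV DNA copies/ml) vs. threshold cycle
        log_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
        ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])

        slope, intercept = np.polyfit(ct, log_copies, deg=1)

        def copies_per_ml(sample_ct):
            """Read a sample's concentration off the linear standard curve."""
            return 10 ** (slope * sample_ct + intercept)

        print(f"{copies_per_ml(25.0):.3g} copies/ml")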

  20. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for the medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step, multi-day bioassay. As described in the IAEA manual, DCA can be used to assess dose quite accurately up to 4-6 weeks post-exposure, but throughput remains a major issue and automation is essential. Throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards critically needed throughput. Published by Elsevier B.V.

  1. Can a combination of average of normals and "real time" External Quality Assurance replace Internal Quality Control?

    PubMed

    Badrick, Tony; Graham, Peter

    2018-03-28

    Internal Quality Control and External Quality Assurance are separate but related processes that have developed independently in laboratory medicine over many years. They differ in sample frequency, statistical interpretation and immediacy. Both processes have evolved, absorbing new understandings of the concept of laboratory error, sample material matrix and assay capability. However, we do not believe, at the coalface, that either process has led to much improvement in patient outcomes recently. It is the increasing reliability and automation of analytical platforms, along with the improved stability of reagents, that have reduced systematic and random error, which in turn has minimised the risk of running less frequent IQC. We suggest that it is time to rethink the role of both processes and unite them into a single approach using an Average of Normals model supported by more frequent External Quality Assurance samples. This new paradigm may lead to less confusion for laboratory staff and quicker identification of and response to out-of-control situations.
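
    To make the Average of Normals idea concrete, here is a minimal sketch (not the authors' algorithm): patient results inside the reference interval are averaged over a moving window, and the window mean is compared against limits derived from a period of known in-control operation. The reference interval, window size and limits are invented.

    ```python
    import numpy as np

    def average_of_normals(results, low, high, target, sd_aon, window=20, k=3.0):
        """Flag windows whose mean of 'normal' patient results drifts.

        Values outside [low, high] are excluded before averaging, since AoN
        assumes a stable population of normal results; target and sd_aon are
        the expected mean and SD of the window average, estimated from a
        period of known in-control operation.
        """
        normals = [x for x in results if low <= x <= high]
        flags = []
        for i in range(window, len(normals) + 1):
            m = np.mean(normals[i - window:i])
            if abs(m - target) > k * sd_aon:
                flags.append((i, round(float(m), 2)))
        return flags

    # Hypothetical sodium results (mmol/L) with a late calibration drift
    rng = np.random.default_rng(1)
    data = list(rng.normal(140, 2.5, 400)) + list(rng.normal(143, 2.5, 100))
    print(average_of_normals(data, 135, 145, target=140.0, sd_aon=0.6)[:3])
    ```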

  2. Community pharmacies automation: any impact on counselling duration and job satisfaction?

    PubMed

    Cavaco, Afonso Miguel; Krookas, Anette Aaland

    2014-04-01

    One key indicator of the quality of practitioner-patient interaction is the duration of encounters. Automation has been presented as beneficial to pharmacy staff's work with patients, and thus as having a potential impact on pharmacists' and technicians' job satisfaction. To compare the interaction length between pharmacy staff and patients, as well as staff job satisfaction, in community pharmacies with and without automation. Portuguese community pharmacies with and without automation. This cross-sectional study followed a quasi-experimental design, divided in two phases. In the first, paired community pharmacies with and without automation were purposively selected for a non-participant overt observation. The second phase comprised a job satisfaction questionnaire administered to both pharmacists and technical staff. Practitioner and patient demographic and interactional data, as well as job satisfaction, were statistically compared across automation status. Interaction length and job satisfaction. Sixty-eight practitioners from 10 automated and non-automated pharmacies produced 721 registered interaction episodes. Automation had no significant influence on interaction duration, controlling for gender and professional category, although interactions were significantly longer with older patients (p = 0.017). On average, staff working at the pharmacy counter spent 45% of their time free from direct patient contact. The mean overall satisfaction in this sample was 5.52 (SD = 0.98) out of a maximum score of seven, with no significant differences across automation or between professional categories, apart from significantly lower job satisfaction among younger pharmacists. As in previous studies in other settings, interaction duration was not influenced by pharmacy automation, nor was practitioners' job satisfaction, while practitioners' time constraints appear to be a subjective perception.

  3. Development of quality control and instrumentation performance metrics for diffuse optical spectroscopic imaging instruments in the multi-center clinical environment

    NASA Astrophysics Data System (ADS)

    Keene, Samuel T.; Cerussi, Albert E.; Warren, Robert V.; Hill, Brian; Roblyer, Darren; Leproux, Anaïs; Durkin, Amanda F.; O'Sullivan, Thomas D.; Haghany, Hosain; Mantulin, William W.; Tromberg, Bruce J.

    2013-03-01

    Instrument equivalence and quality control are critical elements of multi-center clinical trials. We currently have five identical Diffuse Optical Spectroscopic Imaging (DOSI) instruments enrolled in the American College of Radiology Imaging Network (ACRIN, #6691) trial located at five academic clinical research sites in the US. The goal of the study is to predict the response of breast tumors to neoadjuvant chemotherapy in 60 patients. In order to reliably compare DOSI measurements across different instruments, operators and sites, we must be confident that the data quality is comparable. We require objective and reliable methods for identifying, correcting, and rejecting low quality data. To achieve this goal, we developed and tested an automated quality control algorithm that rejects data points below the instrument noise floor, improves tissue optical property recovery, and outputs a detailed data quality report. Using a new protocol for obtaining dark-noise data, we applied the algorithm to ACRIN patient data and successfully improved the quality of recovered physiological data in some cases.
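
    A simplified sketch of the noise-floor rejection step described above, under the assumption that the floor is estimated from repeated dark (source-off) acquisitions; the array shapes and threshold factor are invented, and the actual ACRIN algorithm is more involved.

    ```python
    import numpy as np

    def reject_below_noise_floor(amplitudes, dark, k=2.0):
        """Mask measured amplitude points that fall below the instrument
        noise floor estimated from dark (source-off) acquisitions.

        amplitudes: measured amplitude per modulation frequency
        dark: repeated dark-noise acquisitions on the same frequency grid
        Returns the data with sub-floor points set to NaN plus a QC summary.
        """
        floor = dark.mean(axis=0) + k * dark.std(axis=0)
        bad = amplitudes < floor
        cleaned = np.where(bad, np.nan, amplitudes)
        report = {"n_rejected": int(bad.sum()),
                  "fraction_rejected": float(bad.mean())}
        return cleaned, report

    # Hypothetical data: 256 modulation frequencies, 10 dark acquisitions
    rng = np.random.default_rng(0)
    dark = rng.normal(1e-4, 2e-5, (10, 256))
    signal = rng.normal(5e-3, 1e-3, 256)
    signal[200:] = 5e-5          # roll-off below the noise floor at high freq
    cleaned, report = reject_below_noise_floor(signal, dark)
    print(report)
    ```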

  4. Digital control and data acquisition for high-value GTA welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, T.G.; Franco-Ferreira, E.A.

    1993-10-01

    Electric power for the Cassini space probe will be provided by radioisotope thermoelectric generators (RTGs) thermally driven by General-Purpose Heat Source (GPHS) modules. Each GPHS module contains four 150-g pellets of ²³⁸PuO₂, and each of the four pellets is encapsulated within a thin-wall iridium-alloy shell. GTA girth welding of these capsules is performed at Los Alamos National Laboratory (LANL) on an automated, digitally-controlled welding system. This paper discusses baseline design considerations for system automation and strategies employed to maximize process yield, improve process consistency, and generate required quality assurance information. Design of the automated girth welding system was driven by a number of factors which militated for precise parametric control and data acquisition. Foremost among these factors was the extraordinary value of the capsule components. In addition, DOE order 5700.6B, which took effect on 23 September 1986, required that all operations adhere to strict levels of process quality assurance. A detailed technical specification for the GPHS welding system was developed on the basis of a joint LANL/Westinghouse Savannah River Company (WSRC) design effort. After a competitive bidding process, Jetline Engineering, Inc., of Irvine, California, was selected as the system manufacturer. During the period over which four identical welding systems were fabricated, very close liaison was maintained between the LANL/WSRC technical representatives and the vendor. The level of rapport was outstanding, and the end result was the 1990 delivery of four systems that met or exceeded all specification requirements.

  5. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    PubMed

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been shown to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, extending it into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool generates a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. The dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both of which are necessary characteristics for clinical use.
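
    The pass/fail side of such a QC report rests on dosimetric endpoints read off the DVH. The sketch below shows a generic endpoint check, not the TPS-QC tool itself; the endpoint names, dose values and pass criteria are invented.

    ```python
    import numpy as np

    def dvh_endpoint(dose, mask, top_percent):
        """D_x: minimum dose received by the hottest x% of a structure's voxels."""
        voxels = np.sort(dose[mask])[::-1]
        n = max(1, int(round(voxels.size * top_percent / 100.0)))
        return voxels[n - 1]

    def check_plan(dose, ptv_mask, oar_mask):
        """Hypothetical pass/fail rules in the spirit of an automated QC report:
        a PTV coverage metric and an OAR near-maximum dose limit."""
        report = {
            "PTV_D95": dvh_endpoint(dose, ptv_mask, 95),
            "OAR_D2": dvh_endpoint(dose, oar_mask, 2),
        }
        report["pass"] = report["PTV_D95"] >= 47.0 and report["OAR_D2"] <= 45.0
        return report

    # Toy dose grid (Gy) with invented structure masks
    rng = np.random.default_rng(2)
    dose = rng.normal(50.0, 1.5, 10000)
    ptv = np.zeros(10000, bool); ptv[:4000] = True
    oar = np.zeros(10000, bool); oar[6000:] = True
    dose[oar] = rng.normal(30.0, 5.0, int(oar.sum()))
    print(check_plan(dose, ptv, oar))
    ```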

  6. AN EVALUATION OF SAMPLE DISPERSION MEDIAS USED WITH ACCELERATED SOLVENT EXTRACTION FOR THE EXTRACTION AND RECOVERY OF ARSENICALS FROM LFB AND DORM-2

    EPA Science Inventory

    An accelerated solvent extraction (ASE) device was evaluated as a semi-automated means for extracting arsenicals from quality control (QC) samples and DORM-2 [standard reference material (SRM)]. Unlike conventional extraction procedures, the ASE requires that the sample be dispe...

  7. Automated, Miniaturized and Integrated Quality Control-on-Chip (QC-on-a-Chip) for Advanced Cell Therapy Applications

    NASA Astrophysics Data System (ADS)

    Wartmann, David; Rothbauer, Mario; Kuten, Olga; Barresi, Caterina; Visus, Carmen; Felzmann, Thomas; Ertl, Peter

    2015-09-01

    The combination of microfabrication-based technologies with cell biology has laid the foundation for the development of advanced in vitro diagnostic systems capable of evaluating cell cultures under defined, reproducible and standardizable measurement conditions. In the present review we describe recent lab-on-a-chip developments for cell analysis and how these methodologies could improve standard quality control in the field of manufacturing cell-based vaccines for clinical purposes. We highlight in particular the regulatory requirements for advanced cell therapy applications, using dendritic cell-based cancer vaccines as an example, to describe the tangible advantages of microfluidic devices that overcome most of the challenges associated with automation, miniaturization and integration of cell-based assays. As its main advantage, lab-on-a-chip technology allows for precise regulation of culturing conditions while simultaneously monitoring cell-relevant parameters using embedded sensory systems. State-of-the-art lab-on-a-chip platforms for in vitro assessment of cell cultures and their potential future applications for cell therapies and cancer immunotherapy are discussed.

  8. Microengineering of Metals and Ceramics: Part II: Special Replication Techniques, Automation and Properties; Volume 4: Advanced Micro & Nanosystems

    NASA Astrophysics Data System (ADS)

    Baltes, Henry; Brand, Oliver; Fedder, Gary K.; Hierold, Christofer; Korvink, Jan G.; Tabata, Osamu; Löhe, Detlef; Haußelt, Jürgen

    2005-10-01

    Microstructures, electronics, nanotechnology - these vast fields of research are growing together as the size gap narrows and many different materials are combined. Current research, engineering successes and newly commercialized products hint at the immense innovative potential and future applications that open up once mankind controls shape and function from the atomic level right up to the visible world without any gaps. Continuing from the previous volume, authors from three major competence centres for microengineering here cover all aspects of specialized replication techniques and how to employ state-of-the-art technologies for testing and characterizing micro-scale components, and illustrate quality control aspects and strategies for automation of production procedures in view of future industrial production and commercialisation.

  9. The effect of JPEG compression on automated detection of microaneurysms in retinal images

    NASA Astrophysics Data System (ADS)

    Cree, M. J.; Jelinek, H. F.

    2008-02-01

    As JPEG compression at source is ubiquitous in retinal imaging, and the block artefacts it introduces are known to be of similar size to microaneurysms (an important indicator of diabetic retinopathy), it is prudent to evaluate the effect of JPEG compression on automated detection of retinal pathology. Retinal images were acquired at high quality and then compressed to various lower qualities. An automated microaneurysm detector was run on the retinal images at the various levels of JPEG compression, and the ability to predict the presence of diabetic retinopathy from the detected presence of microaneurysms was evaluated with receiver operating characteristic (ROC) methodology. A negative effect of JPEG compression on automated detection was observed even at levels of compression sometimes used in retinal eye-screening programmes; this may have important clinical implications for deciding on acceptable levels of compression for a fully automated eye-screening programme.
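
    The ROC evaluation used here can be reproduced generically: treat the per-image detector output (e.g. a microaneurysm count) as a score and compute the area under the ROC curve at each compression level. The sketch below is illustrative only; the score distributions are simulated, not the study's data.

    ```python
    import numpy as np

    def roc_auc(scores_pos, scores_neg):
        """AUC = P(score of a retinopathy image > score of a normal image),
        computed by pairwise comparison (equivalent to the Mann-Whitney U)."""
        pos = np.asarray(scores_pos, float)[:, None]
        neg = np.asarray(scores_neg, float)[None, :]
        wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
        return wins / (pos.shape[0] * neg.shape[1])

    # Hypothetical microaneurysm counts per image from the same detector run
    # on high-quality and heavily compressed versions of an image set;
    # block artefacts mimic microaneurysms, inflating counts in normals.
    rng = np.random.default_rng(3)
    dr_hq, normal_hq = rng.poisson(6, 50), rng.poisson(1, 50)
    dr_jpeg, normal_jpeg = rng.poisson(5, 50), rng.poisson(2, 50)

    print(f"AUC high quality : {roc_auc(dr_hq, normal_hq):.3f}")
    print(f"AUC compressed   : {roc_auc(dr_jpeg, normal_jpeg):.3f}")
    ```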

  10. Information support for decision making on dispatching control of water distribution in irrigation

    NASA Astrophysics Data System (ADS)

    Yurchenko, I. F.

    2018-05-01

    Research was carried out on a technique for supporting decision making in the on-line control and operational management of water allocation for interfarm irrigation projects, based on analytical patterns of dispatcher control. The technique increases labour productivity and management quality through a higher level of automation and through decision-making optimization that accounts for problem diagnostics, solution classification, and the information required by decision makers.

  11. Automated work packages architecture: An initial set of human factors and instrumentation and controls requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Vivek; Oxstrand, Johanna H.; Le Blanc, Katya L.

    The work management process in today's nuclear power plant fleets is highly dependent on large technical staffs and on the quality of paper-based work instructions, which puts nuclear energy at a long-term economic disadvantage and increases the possibility of human error. Technologies like mobile portable devices and computer-based procedures can play a key role in improving the plant work management process, thereby increasing productivity and decreasing cost. Automated work packages are fundamentally an enabling technology for improving worker productivity and human performance in nuclear power plant work activities, because virtually every plant work activity is accomplished using some form of a work package. As part of this year's research effort, an automated work package architecture is identified along with an initial set of requirements that are essential and necessary for implementation of automated work packages in nuclear power plants.

  12. Amorphous silicon photovoltaic manufacturing technology, phase 2A

    NASA Astrophysics Data System (ADS)

    Duran, G.; Mackamul, K.; Metcalf, D.

    1995-01-01

    Utility Power Group (UPG) and its lower-tier subcontractor, Advanced Photovoltaic Systems, Inc. (APS), have conducted efforts in developing their manufacturing lines. UPG has focused on the automation of encapsulation and termination processes developed in Phase 1. APS has focused on completion of the encapsulation and module design tasks, while continuing the process and quality control and automation projects. The goal is to produce 55-watt (stabilized) EP50 modules in a new facility. In the APS Trenton EUREKA manufacturing facility, APS has: (1) Developed high-throughput lamination procedures; (2) Optimized existing module designs; (3) Developed new module designs for architectural applications; (4) Developed enhanced deposition parameter control; (5) Designed equipment required to manufacture new EUREKA modules developed during Phase 2; (6) Improved uniformity of thin-film materials deposition; and (7) Improved the stabilized power output of the APS EP50 EUREKA module to 55 watts. In the APS Fairfield EUREKA manufacturing facility, APS has: (1) Introduced the new products developed under Phase 1 into the APS Fairfield EUREKA module production line; (2) Increased the extent of automation in the production line; (3) Introduced Statistical Process Control to the module production line; and (4) Transferred progress made in the APS Trenton facility into the APS Fairfield facility.

  13. Operational quality control of daily precipitation using spatio-climatological consistency testing

    NASA Astrophysics Data System (ADS)

    Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.

    2010-09-01

    Quality control (QC) of meteorological data is of utmost importance for climate-related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation. However, manpower limitations force many weather services to move towards less labour-intensive and more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines the information from the event-based spatial distribution of each day's precipitation field with historical information on the interpolation error for different precipitation intensity intervals. Expert judgement shows that the system detects potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection. 50-80% of all flagged values have been classified as real errors by the data editor, much better than the roughly 15-20% achieved using standard spatial regression tests. The automatic redistribution of accumulated multi-day sums is very helpful in the QC process. Manual inspection in operations can be reduced and the QC of precipitation substantially objectified.
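
    A minimal sketch of a spatial consistency test in this spirit: each station is predicted from its neighbours and the residual is compared against a multiple of the historical interpolation error. MeteoSwiss uses Optimal Interpolation; the inverse-distance weighting below is a simplification, and all numbers are invented.

    ```python
    import numpy as np

    def spatial_consistency_flags(xy, values, sigma_interp, k=4.0, n_neigh=5):
        """Leave-one-out spatial check for daily precipitation.

        Each station is predicted by an inverse-distance-weighted mean of its
        neighbours; the residual is compared with k times the historical
        interpolation error (sigma_interp) for that station/intensity class.
        """
        flags = []
        for i in range(len(values)):
            d = np.hypot(*(xy - xy[i]).T)
            d[i] = np.inf                          # exclude the station itself
            idx = np.argsort(d)[:n_neigh]
            w = 1.0 / d[idx] ** 2
            pred = np.sum(w * values[idx]) / w.sum()
            if abs(values[i] - pred) > k * sigma_interp[i]:
                flags.append((i, float(values[i]), float(pred)))
        return flags

    # Hypothetical network: 30 stations, one corrupted observation
    rng = np.random.default_rng(4)
    xy = rng.uniform(0, 100, (30, 2))
    obs = rng.gamma(2.0, 3.0, 30)     # plausible daily precipitation field (mm)
    obs[7] = 120.0                    # transcription error to be caught
    print(spatial_consistency_flags(xy, obs, sigma_interp=np.full(30, 4.0)))
    ```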

  14. Determining RNA quality for NextGen sequencing: some exceptions to the gold standard rule of 23S to 16S rRNA ratio

    USDA-ARS?s Scientific Manuscript database

    Using next-generation-sequencing technology to assess entire transcriptomes requires high quality starting RNA. Currently, RNA quality is routinely judged using automated microfluidic gel electrophoresis platforms and associated algorithms. Here we report that such automated methods generate false-n...

  15. Data Quality Screening Service

    NASA Technical Reports Server (NTRS)

    Strub, Richard; Lynnes, Christopher; Hearty, Thomas; Won, Young-In; Fox, Peter; Zednik, Stephan

    2013-01-01

    A report describes the Data Quality Screening Service (DQSS), which is designed to help automate the filtering of remote sensing data on behalf of science users. Whereas this process often involves much research through quality documents followed by laborious coding, the DQSS is a Web Service that provides data users with data pre-filtered to their particular criteria, while at the same time guiding the user with filtering recommendations of the cognizant data experts. The DQSS design is based on a formal semantic Web ontology that describes data fields and the quality fields for applying quality control within a data product. The accompanying code base handles several remote sensing datasets and quality control schemes for data products stored in Hierarchical Data Format (HDF), a common format for NASA remote sensing data. Together, the ontology and code support a variety of quality control schemes through the implementation of a Boolean expression with simple, reusable conditional expressions as operands. Additional datasets are added to the DQSS simply by registering instances in the ontology if they follow a quality scheme that is already modeled in the ontology. New quality schemes are added by extending the ontology and adding code for each new scheme.
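
    The core filtering operation of such a screening service can be pictured as applying a user-chosen Boolean criterion over quality flags and replacing failing values with a fill value. The sketch below is a schematic illustration, not the DQSS code; the flag convention shown is an assumption.

    ```python
    import numpy as np

    def screen(data, quality, criterion):
        """Replace values failing the quality criterion with a fill value,
        mirroring the pre-filtering a screening service would apply.

        criterion: a function of the quality array returning a boolean mask
        of 'good' elements, so arbitrary Boolean expressions over one or
        more flag fields compose naturally.
        """
        good = criterion(quality)
        return np.where(good, data, np.nan)

    # Hypothetical granule: retrievals with an integer quality flag where
    # 0 = best, 1 = good, 2 = do-not-use (conventions vary by product).
    data = np.array([290.1, 288.7, 301.4, 295.0, 287.2])
    qual = np.array([0, 1, 2, 0, 2])

    # "Best or good" criterion recommended, say, by the data experts
    filtered = screen(data, qual, lambda q: q <= 1)
    print(filtered)   # [290.1 288.7 nan 295.0 nan]
    ```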

  16. The use of an automated flight test management system in the development of a rapid-prototyping flight research facility

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Hewett, Marle D.; Brumbaugh, Randal W.; Tartt, David M.; Antoniewicz, Robert F.; Agarwal, Arvind K.

    1988-01-01

    An automated flight test management system (ATMS) and its use to develop a rapid-prototyping flight research facility for artificial intelligence (AI) based flight systems concepts are described. The ATMS provides a flight test engineer with a set of tools that assist in flight planning and simulation. This system will be capable of controlling an aircraft during the flight test by performing closed-loop guidance functions, range management, and maneuver-quality monitoring. The rapid-prototyping flight research facility is being developed at the Dryden Flight Research Facility of the NASA Ames Research Center (Ames-Dryden) to provide early flight assessment of emerging AI technology. The facility is being developed as one element of the aircraft automation program which focuses on the qualification and validation of embedded real-time AI-based systems.

  17. Development of Moire machine vision

    NASA Technical Reports Server (NTRS)

    Harding, Kevin G.

    1987-01-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation capability is being developed to provide full-field range measurement and three dimensional scene analysis.

  18. An automated technique for manufacturing thermoplastic stringers in continuous length

    NASA Astrophysics Data System (ADS)

    Pantelakis, Sp.; Baxevani, E.; Spelz, U.

    In the present work, an automated Continuous Compression Moulding technique for the manufacture of stringers in continuous length is presented. The method combines pultrusion and hot-pressing. The technique is used for the production of L-shape stringers, which are widely applied in aerospace structures. The investigation was carried out on carbon-reinforced PEEK (C/PEEK) as well as, for comparison, on the thermoplastic composites carbon-reinforced polyethersulfone (C/PES), glass- and carbon-reinforced polyphenylene sulfide (G/PPS, C/PPS) and Kevlar-reinforced polyamide 6 (K/PA 6). For the materials investigated, the optimized process parameters for manufacturing the L-shape stringers were derived experimentally. To achieve this goal, the quality of the produced parts was controlled using non-destructive testing techniques. Parts of satisfactory quality were also tested destructively to measure their mechanical properties. The results demonstrate the suitability of the technique for producing continuous-length stringers.

  19. Development of Moire machine vision

    NASA Astrophysics Data System (ADS)

    Harding, Kevin G.

    1987-10-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation capability is being developed to provide full-field range measurement and three dimensional scene analysis.

  20. Phased Array Probe Optimization for the Inspection of Titanium Billets

    NASA Astrophysics Data System (ADS)

    Rasselkorde, E.; Cooper, I.; Wallace, P.; Lupien, V.

    2010-02-01

    The manufacturing process of titanium billets can produce multiple sub-surface defects that are particularly difficult to detect during the early stages of production. Failure to detect these defects can lead to subsequent in-service failure. A novel automated quality control system is being developed for the inspection of titanium billets destined for use in aerospace applications. The sensors will be deployed by an automated system to minimise the use of manual inspection, which should improve the quality and reliability of these critical inspections early in the manufacturing process. This paper presents the first part of the work: the design and simulation of the phased array ultrasonic inspection of the billets. A series of phased array transducers was designed to optimise the ultrasonic inspection of a ten-inch-diameter billet made from Titanium 6Al-4V. A comparison was performed between different probes, including a 2D annular sectorial array.

  1. Multidate Landsat lake quality monitoring program

    NASA Technical Reports Server (NTRS)

    Fisher, L. T.; Scarpace, F. L.; Thomsen, R. G.

    1979-01-01

    A unified package of files and programs has been developed to automate the multidate Landsat-derived analyses of water quality for about 3000 inland lakes throughout Wisconsin. A master lakes file which stores geographic information on the lakes, a file giving the latitudes and longitudes of control points for scene navigation, and a program to estimate control point locations and produce microfiche character maps for scene navigation are among the files and programs of the system. The use of ground coordinate systems to isolate irregularly shaped areas which can be accessed at will appears to provide an economical means of restricting the size of the data set.

  2. Prototype space station automation system delivered and demonstrated at NASA

    NASA Technical Reports Server (NTRS)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support System (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of Space Station subsystems. The hierarchical and distributed real-time control system places the required control authority at every level of the automation system architecture. As a demonstration of the automation technique, the ASCLSS system automated the Air Revitalization Group (ARG) of the Space Station regenerative Environmental Control and Life Support System (ECLSS) using real-time, high-fidelity simulators of the ARG processes. This automation system represents an early flight prototype and an important test bed for evaluating Space Station controls technology, including future application of Ada software in real-time control and the development and demonstration of embedded artificial intelligence and expert systems (AI/ES) in distributed automation and control systems.

  3. Towards a Visual Quality Metric for Digital Video

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1998-01-01

    The advent of widespread distribution of digital video creates a need for automated methods for evaluating visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. The challenge of video quality metrics is to extend these simplified models to temporal signals as well. In this presentation I will discuss a number of the issues that must be resolved in the design of effective video quality metrics. Among these are spatial, temporal, and chromatic sensitivity and their interactions, visual masking, and implementation complexity. I will also touch on the question of how to evaluate the performance of these metrics.

  4. Paver automation for road surfacing

    NASA Astrophysics Data System (ADS)

    Tihonov, A.; Velichkin, V.

    2017-10-01

    The paper discusses factors that bear on the quality of motor road pavement as access roads and highways are built and used. A block diagram is proposed for organizing the elements of the automatic control system for the asphalt paver's mechanisms; the system is based on a microprocessor onboard controller that maintains a preset elevation of the finishing plate, and its operating principle is described. The paper identifies the primary transducers used to control finishing-plate elevation. A new method is also described for controlling the machine's straight-line movement during operation using the GLONASS Satellite Positioning System (SPS).

  5. A quantum leap into the IED age

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patterson, R.C.

    1996-11-01

    The integration of pattern recognition, artificial intelligence and advanced communication technologies in utility substation IEDs (Intelligent Electronic Devices) has opened the door to practical and cost effective automation of power distribution systems. A major driver for the application of these new technologies has been the research directed toward the detection of high-impedance faults. The commercial products which embody these complex detection functions have already expanded to include most of the protection, control, and monitoring required at a utility substation. These new Super-IEDs enable major utility initiatives, such as power quality management, improved public safety, operation and maintenance productivity, and power system automation.

  6. Langley Research Center's Unitary Plan Wind Tunnel: Testing Capabilities and Recent Modernization Activities

    NASA Technical Reports Server (NTRS)

    Micol, John R.

    2001-01-01

    Description, capabilities, initiatives, and utilization of the NASA Langley Research Center's Unitary Plan Wind Tunnel are presented. A brief overview of the facility's operational capabilities and testing techniques is provided. A recent Construction of Facilities (CoF) project to improve facility productivity and efficiency through facility automation has been completed and is discussed. Several new and maturing thrusts are underway that include systematic efforts to provide credible assessment for data quality, modifications to the new automation control system for increased compatibility with the Modern Design Of Experiments (MDOE) testing methodology, and process improvements for better test coordination, planning, and execution.

  7. Langley Research Center's Unitary Plan Wind Tunnel: Testing Capabilities and Recent Modernization Activities

    NASA Technical Reports Server (NTRS)

    Micol, John R.

    2001-01-01

    Description, capabilities, initiatives, and utilization of the NASA Langley Research Center's Unitary Plan Wind Tunnel are presented. A brief overview of the facility's operational capabilities and testing techniques is provided. A recent Construction of Facilities (CoF) project to improve facility productivity and efficiency through facility automation has been completed and is discussed. Several new and maturing thrusts are underway that include systematic efforts to provide credible assessment for data quality, modifications to the new automation control system for increased compatibility with the Modern Design of Experiments (MDOE) testing methodology, and process improvements for better test coordination, planning, and execution.

  8. An automated model-based aim point distribution system for solar towers

    NASA Astrophysics Data System (ADS)

    Schwarzbözl, Peter; Rong, Amadeus; Macke, Ansgar; Säck, Jan-Peter; Ulmer, Steffen

    2016-05-01

    Distribution of heliostat aim points is a major task during central receiver operation, as the flux distribution produced by the heliostats varies continuously with time. Known methods for aim point distribution are mostly based on simple aim point patterns and focus on control strategies to meet local temperature and flux limits of the receiver. Lowering the peak flux on the receiver to avoid hot spots and maximizing thermal output are obviously competing targets that call for a comprehensive optimization process. This paper presents a model-based method for online aim point optimization that includes the current heliostat field mirror quality derived through an automated deflectometric measurement process.

  9. A clinical data repository enhances hospital infection control.

    PubMed Central

    Samore, M.; Lichtenberg, D.; Saubermann, L.; Kawachi, C.; Carmeli, Y.

    1997-01-01

    We describe the benefits of a relational database of hospital clinical data (Clinical Data Repository; CDR) for an infection control program. The CDR consists of > 40 Sybase tables, and is directly accessible for ad hoc queries by members of the infection control unit who have been granted privileges for access by the Information Systems Department. The data elements and functional requirements most useful for surveillance of nosocomial infections, antibiotic use, and resistant organisms are characterized. Specific applications of the CDR are presented, including the use of automated definitions of nosocomial infection, graphical monitoring of resistant organisms with quality control limits, and prospective detection of inappropriate antibiotic use. Hospital surveillance and quality improvement activities benefit significantly from the availability of a queryable set of tables containing diverse clinical data. PMID:9357588

  10. The Changing Hardwood Export Market and Research to Keep the U.S. Competitive

    Treesearch

    Philip A. Araman

    1988-01-01

    Primary hardwood processors face many interrelated market, product, processing, and resource problems generated by the increasing export market. In processing, yields and quality must be increased and costs must be reduced to stay competitive. Computer-aided and computer-controlled automated processing is also needed. The industry needs to keep its products competitive...

  11. Weld analysis and control system

    NASA Technical Reports Server (NTRS)

    Kennedy, Larry Z. (Inventor); Rodgers, Michael H. (Inventor); Powell, Bradley W. (Inventor); Burroughs, Ivan A. (Inventor); Goode, K. Wayne (Inventor)

    1994-01-01

    The invention is a Weld Analysis and Control System developed for active weld system control through real time weld data acquisition. Closed-loop control is based on analysis of weld system parameters and weld geometry. The system is adapted for use with automated welding apparatus having a weld controller which is capable of active electronic control of all aspects of a welding operation. Enhanced graphics and data displays are provided for post-weld analysis. The system provides parameter acquisition, including seam location which is acquired for active torch cross-seam positioning. Torch stand-off is also monitored for control. Weld bead and parent surface geometrical parameters are acquired as an indication of weld quality. These parameters include mismatch, peaking, undercut, underfill, crown height, weld width, puddle diameter, and other measurable information about the weld puddle regions, such as puddle symmetry. These parameters provide a basis for active control as well as post-weld quality analysis and verification. Weld system parameters, such as voltage, current and wire feed rate, are also monitored and archived for correlation with quality parameters.
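
    Post-weld screening over such acquired geometry parameters amounts to tolerance checks. The sketch below illustrates the idea with parameter names taken from the description above; the acceptance bands are invented, not the patent's values.

    ```python
    # Hypothetical acceptance bands per geometry parameter (min, max), in mm
    TOLERANCES = {
        "mismatch":     (0.0, 0.5),
        "peaking":      (0.0, 1.0),
        "undercut":     (0.0, 0.1),
        "crown_height": (0.2, 1.5),
        "weld_width":   (3.0, 6.0),
    }

    def screen_weld(measured):
        """Return the parameters that fall outside their acceptance band."""
        return {name: value
                for name, value in measured.items()
                if name in TOLERANCES
                and not (TOLERANCES[name][0] <= value <= TOLERANCES[name][1])}

    weld = {"mismatch": 0.2, "peaking": 0.4, "undercut": 0.15,
            "crown_height": 0.9, "weld_width": 4.8}
    print(screen_weld(weld))   # -> {'undercut': 0.15}
    ```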

  12. Development and Evaluation of a Measure of Library Automation.

    ERIC Educational Resources Information Center

    Pungitore, Verna L.

    1986-01-01

    Construct validity and reliability estimates indicate that study designed to measure utilization of automation in public and academic libraries was successful in tentatively identifying and measuring three subdimensions of level of automation: quality of hardware, method of software development, and number of automation specialists. Questionnaire…

  13. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  14. A feasibility study of cerebral oximetry during in-hospital mechanical and manual cardiopulmonary resuscitation*.

    PubMed

    Parnia, Sam; Nasir, Asad; Ahn, Anna; Malik, Hanan; Yang, Jie; Zhu, Jiawen; Dorazi, Francis; Richman, Paul

    2014-04-01

    A major hurdle limiting the ability to improve the quality of resuscitation has been the lack of a noninvasive real-time detection system capable of monitoring the quality of cerebral and other organ perfusion, as well as oxygen delivery during cardiopulmonary resuscitation. Here, we report on a novel system of cerebral perfusion targeted resuscitation. An observational study evaluating the role of cerebral oximetry (Equanox; Nonin, Plymouth, MI, and Invos; Covidien, Mansfield, MA) as a real-time marker of cerebral perfusion and oxygen delivery together with the impact of an automated mechanical chest compression system (Life Stat; Michigan Instruments, Grand Rapids, MI) on oxygen delivery and return of spontaneous circulation following in-hospital cardiac arrest. Tertiary medical center. In-hospital cardiac arrest patients (n = 34). Cerebral oximetry provided real-time information regarding the quality of perfusion and oxygen delivery. The use of automated mechanical chest compression device (n = 12) was associated with higher regional cerebral oxygen saturation compared with manual chest compression device (n = 22) (53.1% ± 23.4% vs 24% ± 25%, p = 0.002). There was a significant difference in mean regional cerebral oxygen saturation (median % ± interquartile range) in patients who achieved return of spontaneous circulation (n = 15) compared with those without return of spontaneous circulation (n = 19) (47.4% ± 21.4% vs 23% ± 18.42%, p < 0.001). After controlling for patients achieving return of spontaneous circulation or not, significantly higher mean regional cerebral oxygen saturation levels during cardiopulmonary resuscitation were observed in patients who were resuscitated using automated mechanical chest compression device (p < 0.001). The integration of cerebral oximetry into cardiac arrest resuscitation provides a novel noninvasive method to determine the quality of cerebral perfusion and oxygen delivery to the brain. The use of automated mechanical chest compression device during in-hospital cardiac arrest may lead to improved oxygen delivery and organ perfusion.

  15. Long-term quality assurance of [(18)F]-fluorodeoxyglucose (FDG) manufacturing.

    PubMed

    Gaspar, Ludovit; Reich, Michal; Kassai, Zoltan; Macasek, Fedor; Rodrigo, Luis; Kruzliak, Peter; Kovac, Peter

    2016-01-01

    Nine years of experience with 2286 commercial syntheses allowed us to deliver comprehensive information on the quality of (18)F-FDG production. A semi-automated FDG production line using a Cyclone 18/9 machine (IBA, Belgium), a TRACERLab MXFDG synthesiser (GE Health, USA) using alkaline hydrolysis, a grade "A" isolator with dispensing robotic unit (Tema Sinergie, Italy), and an automatic control system under GAMP5 (minus2, Slovakia) was assessed by TQM tools as a highly reliable aseptic production line, fully compliant with Good Manufacturing Practice and just-in-time delivery of the FDG radiopharmaceutical. Fluoride-18 is received in steady yield and of very high radioactive purity. Synthesis yields exhibited high variance, probably connected with the quality of the disposable cassettes and chemical sets. Most performance non-conformities within the manufacturing cycle occur at mechanical nodes of the dispensing unit. The long-term monitoring of 2286 commercial syntheses indicated high reliability of the automatic synthesizers. Shewhart chart and ANOVA analysis showed that the minor non-compliances that occurred were mostly caused by deviations of less experienced staff from standard operating procedures, and also by the quality of the automatic cassettes. Only 15 syntheses were found unfinished, and in 4 cases the product was out of specification against the European Pharmacopoeia. The most vulnerable step of manufacturing was dispensing and filling in the grade "A" isolator. Its cleanliness and sterility were fully controlled over the investigated period by applying hydrogen peroxide vapours (VHP). Our experience with quality assurance in the production of [(18)F]-fluorodeoxyglucose (FDG) at the BIONT production facility based on the TRACERlab MXFDG production module can be used for benchmarking of emerging manufacturing and automated manufacturing systems.

  16. Long-term quality assurance of [18F]-fluorodeoxyglucose (FDG) manufacturing

    PubMed Central

    Gaspar, Ludovit; Reich, Michal; Kassai, Zoltan; Macasek, Fedor; Rodrigo, Luis; Kruzliak, Peter; Kovac, Peter

    2016-01-01

    Nine years of experience with 2286 commercial syntheses allowed us to deliver comprehensive information on the quality of 18F-FDG production. A semi-automated FDG production line using a Cyclone 18/9 machine (IBA, Belgium), a TRACERLab MXFDG synthesiser (GE Health, USA) using alkaline hydrolysis, a grade “A” isolator with dispensing robotic unit (Tema Sinergie, Italy), and an automatic control system under GAMP5 (minus2, Slovakia) was assessed by TQM tools as a highly reliable aseptic production line, fully compliant with Good Manufacturing Practice and just-in-time delivery of the FDG radiopharmaceutical. Fluoride-18 is received in steady yield and of very high radioactive purity. Synthesis yields exhibited high variance, probably connected with the quality of the disposable cassettes and chemical sets. Most performance non-conformities within the manufacturing cycle occur at mechanical nodes of the dispensing unit. The long-term monitoring of 2286 commercial syntheses indicated high reliability of the automatic synthesizers. Shewhart chart and ANOVA analysis showed that the minor non-compliances that occurred were mostly caused by deviations of less experienced staff from standard operating procedures, and also by the quality of the automatic cassettes. Only 15 syntheses were found unfinished, and in 4 cases the product was out of specification against the European Pharmacopoeia. The most vulnerable step of manufacturing was dispensing and filling in the grade “A” isolator. Its cleanliness and sterility were fully controlled over the investigated period by applying hydrogen peroxide vapours (VHP). Our experience with quality assurance in the production of [18F]-fluorodeoxyglucose (FDG) at the BIONT production facility based on the TRACERlab MXFDG production module can be used for benchmarking of emerging manufacturing and automated manufacturing systems. PMID:27508102
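
    For readers wanting to reproduce the Shewhart-chart monitoring mentioned in both records above, an individuals (X) chart with moving-range limits is the usual choice when each synthesis yields a single value. The sketch and the yield numbers below are illustrative, not BIONT data.

    ```python
    import numpy as np

    def shewhart_individuals(yields):
        """Individuals (X) chart limits from the average moving range."""
        x = np.asarray(yields, float)
        mr = np.abs(np.diff(x)).mean()
        sigma_hat = mr / 1.128            # d2 constant for subgroups of size 2
        center = x.mean()
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
        out = np.flatnonzero((x > ucl) | (x < lcl))
        return center, (lcl, ucl), out

    # Hypothetical FDG synthesis yields (%), one low run from a bad cassette
    yields = [62, 65, 61, 63, 66, 64, 35, 62, 63, 65]
    center, (lcl, ucl), out = shewhart_individuals(yields)
    print(f"CL={center:.1f}  LCL={lcl:.1f}  UCL={ucl:.1f}  out-of-control={out}")
    ```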

  17. Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler

    NASA Technical Reports Server (NTRS)

    Vacek, Austin

    2016-01-01

    Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused by atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By combining the new quality-controlled profiles with older profiles from 1997-2009, a robust database will be constructed of upper-level wind characteristics. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over the recent POR and compare them against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.

  18. Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler

    NASA Technical Reports Server (NTRS)

    Vacek, Austin

    2015-01-01

    Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused by atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By combining the new quality-controlled profiles with older profiles from 1997-2009, a robust database will be constructed of upper-level wind characteristics. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over the recent POR and compare them against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.
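
    The statistical analysis described in these two records reduces to per-altitude envelope statistics over the quality-controlled archive. A minimal sketch follows, with array shapes and values invented.

    ```python
    import numpy as np

    def wind_statistics(profiles):
        """Per-altitude envelope statistics across a set of QC'ed profiles.

        profiles: array (n_profiles, n_levels, 2) of u/v components (m/s);
        NaNs mark data removed by the QC process and are ignored.
        """
        stats = {}
        for k, name in enumerate(("u", "v")):
            comp = profiles[:, :, k]
            stats[name] = {
                "max": np.nanmax(comp, axis=0),
                "min": np.nanmin(comp, axis=0),
                "p95": np.nanpercentile(comp, 95, axis=0),
            }
        return stats

    # Hypothetical DRWP archive: 500 profiles, 40 altitude gates
    rng = np.random.default_rng(5)
    arc = rng.normal(0, 15, (500, 40, 2))
    arc[rng.random((500, 40, 2)) < 0.02] = np.nan   # QC-removed samples
    s = wind_statistics(arc)
    print(s["u"]["p95"][:5])
    ```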

  19. Multi-method automated diagnostics of rotating machines

    NASA Astrophysics Data System (ADS)

    Kostyukov, A. V.; Boychenko, S. N.; Shchelkanov, A. V.; Burda, E. A.

    2017-08-01

    The automated machinery diagnostics and monitoring systems utilized within petrochemical plants are an integral part of the measures taken to ensure safety and, as a consequence, the efficiency of these industrial facilities. Such systems are often limited in their functionality due to the specifics of the diagnostic techniques adopted. As the diagnostic techniques applied in each system are limited, and machinery defects can have a different physical nature, it becomes necessary to combine several diagnostics and monitoring systems to control various machinery components. Such an approach is inconvenient, since it requires additional measures to bring the diagnostic results into a single view of the technical condition of production assets. Here, by a production facility we mean an interconnected complex of a process unit, a drive, a power source and lines; a failure of any of these components will cause an outage of the production asset, which is unacceptable. The purpose of the study is to test the combined use of vibration diagnostics and partial discharge techniques within enterprise diagnostic systems for automated control of the technical condition of rotating machinery, both during maintenance and at production facilities. The described solutions make it possible to monitor the condition of the mechanical and electrical components of rotating machines. It is shown that the functionality of the diagnostic systems can be expanded with minimal changes to the technological chains of repair and operation of rotating machinery. Automation of such systems reduces the influence of the human factor on the quality of repair and diagnostics of the machinery.

  20. Automated Subsystem Control for Life Support System (ASCLSS)

    NASA Technical Reports Server (NTRS)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum control authority at the lowest, or process control, level, which enhances system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates real-time process control and places accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), moving the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  1. Automated startup of the MIT research reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwok, K.S.

    1992-01-01

    This summary describes the development, implementation, and testing of a generic method for performing automated startups of nuclear reactors described by space-independent kinetics under conditions of closed-loop digital control. The technique entails first obtaining a reliable estimate of the reactor's initial degree of subcriticality and then substituting that estimate into a model-based control law so as to permit a power increase from subcritical on a demanded trajectory. The estimation of subcriticality is accomplished by application of the perturbed reactivity method. The shutdown reactor is perturbed by the insertion of reactivity at a known rate. Observation of the resulting period permits determination of the initial degree of subcriticality. A major advantage of this method is that repeated estimates are obtained of the same quantity. Hence, statistical methods can be applied to improve the quality of the calculation.
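
    A deliberately simplified toy of the estimation step in the perturbed reactivity method, assuming a single effective delayed-neutron group and that each observed flux period maps to net reactivity through the one-group inhour relation (the real subcritical, source-driven mapping is more involved); all constants and observations below are invented.

    ```python
    import numpy as np

    # One effective delayed-neutron group; all values illustrative only.
    BETA, LAM, GEN_TIME = 0.0065, 0.08, 1.0e-4   # beta, lambda (1/s), Lambda (s)

    def reactivity_from_period(T):
        """One-group inhour relation: rho = Lambda/T + beta/(1 + lambda*T)."""
        return GEN_TIME / T + BETA / (1.0 + LAM * T)

    def estimate_initial_subcriticality(times, periods, ramp_rate):
        """Each (time, period) observation under a known reactivity ramp gives
        an independent estimate rho0 = rho(T) - r*t; averaging the repeated
        estimates lets simple statistics gauge the quality of the result."""
        est = np.array([reactivity_from_period(T) - ramp_rate * t
                        for t, T in zip(times, periods)])
        return est.mean(), est.std(ddof=1)

    # Invented observations during a 1 pcm/s insertion (1e-5 dk/k per second)
    times = np.array([120.0, 150.0, 180.0, 210.0])     # s since ramp start
    periods = np.array([300.0, 150.0, 90.0, 60.0])     # observed periods, s
    rho0, sd = estimate_initial_subcriticality(times, periods, 1.0e-5)
    print(f"initial reactivity ~ {rho0 * 1e5:.0f} +/- {sd * 1e5:.0f} pcm")
    ```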

  2. Material quality development during the automated tow placement process

    NASA Astrophysics Data System (ADS)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material, with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10-100 mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal, subjecting the material to multiple heating and cooling rates approaching 1000 °C/s. The requirement for the ATP process is to achieve in seconds the same quality (low void content, full translation of mechanical properties and degree of bonding, and minimal warpage) that the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include steady-state heat transfer, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties), and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials in these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.

  3. Automation and control of off-planet oxygen production processes

    NASA Technical Reports Server (NTRS)

    Marner, W. J.; Suitor, J. W.; Schooley, L. S.; Cellier, F. E.

    1990-01-01

    This paper addresses several aspects of the automation and control of off-planet production processes. First, a general approach to process automation and control is discussed from the viewpoint of translating human process control procedures into automated procedures. Second, the control issues for the automation and control of off-planet oxygen processes are discussed. Sensors, instruments, and components are defined and discussed in the context of off-planet applications, and the need for 'smart' components is clearly established.

  4. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are becoming more and more a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and on the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods such as minimum resolvable temperature difference (MRTD), which gives only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), for broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.
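
    The central measurement mentioned, the MTF, reduces to the Fourier transform of the system's line spread function. As a rough illustration of the principle rather than the vendor's algorithm, the following sketch computes an MTF from a sampled LSF; the Gaussian LSF and the 17 µm pixel pitch are made-up inputs.

        import numpy as np

        def mtf_from_lsf(lsf, pixel_pitch_mm):
            """Normalize a line spread function and return (frequency [cy/mm], MTF)."""
            lsf = np.asarray(lsf, dtype=float)
            lsf /= lsf.sum()                        # unit area so that MTF(0) = 1
            mtf = np.abs(np.fft.rfft(lsf))
            freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
            return freqs, mtf / mtf[0]

        # Hypothetical Gaussian LSF sampled at a 17 µm detector pitch:
        x = np.arange(-32, 32)
        freqs, mtf = mtf_from_lsf(np.exp(-(x / 3.0) ** 2), pixel_pitch_mm=0.017)
        print(freqs[:5], mtf[:5])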

  5. A Quality-Control-Oriented Database for a Mesoscale Meteorological Observation Network

    NASA Astrophysics Data System (ADS)

    Lussana, C.; Ranci, M.; Uboldi, F.

    2012-04-01

    In the operational context of a local weather service, data accessibility and quality-related issues must be managed by taking into account a wide set of user needs. This work describes the structure and the operational choices made for the operational implementation of a database system storing data from highly automated observing stations, metadata and information on data quality. Lombardy's environmental protection agency, ARPA Lombardia, manages a highly automated mesoscale meteorological network. A Quality Assurance System (QAS) ensures that reliable observational information is collected and disseminated to the users. The weather unit in ARPA Lombardia, at the same time an important QAS component and an intensive data user, has developed a database specifically aimed at: 1) providing quick access to data for operational activities and 2) ensuring data quality for real-time applications, by means of an Automatic Data Quality Control (ADQC) procedure. Quantities stored in the archive include hourly aggregated observations of precipitation amount, temperature, wind, relative humidity, pressure, and global and net solar radiation. The ADQC performs several independent tests on raw data and compares their results in a decision-making procedure. An important ADQC component is the Spatial Consistency Test based on Optimal Interpolation. Interpolated and cross-validation analysis values are also stored in the database, providing further information to human operators and useful estimates in case of missing data. The technical solution adopted is based on a LAMP (Linux, Apache, MySQL and PHP) system, constituting an open-source environment suitable for both development and operational practice. The ADQC procedure itself is performed by R scripts directly interacting with the MySQL database. Users and network managers can access the database by using a set of web-based PHP applications.
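
    The Spatial Consistency Test compares each observation against an analysis built from its neighbors and flags large residuals. The paper's procedure runs as R scripts against MySQL; purely as a schematic illustration of the leave-one-out idea, here is a small numpy sketch with an assumed Gaussian correlation function (the length scale, error ratio and station data are invented, not ARPA Lombardia's settings).

        import numpy as np

        def loo_analysis(coords_km, anomalies, length_scale=20.0, obs_err_ratio=0.5):
            """Leave-one-out optimal-interpolation analysis at each station."""
            d = np.linalg.norm(coords_km[:, None, :] - coords_km[None, :, :], axis=-1)
            corr = np.exp(-0.5 * (d / length_scale) ** 2)   # assumed background error correlation
            analyses = np.empty_like(anomalies)
            for i in range(len(anomalies)):
                mask = np.arange(len(anomalies)) != i
                S = corr[np.ix_(mask, mask)] + obs_err_ratio ** 2 * np.eye(mask.sum())
                w = np.linalg.solve(S, corr[mask, i])        # OI weights for station i
                analyses[i] = w @ anomalies[mask]
            return analyses

        # Flag observations deviating too far from their cross-validation analysis:
        coords = np.array([[0.0, 0.0], [5.0, 2.0], [10.0, 1.0], [12.0, 8.0], [3.0, 9.0]])
        temps = np.array([20.1, 19.8, 20.3, 25.0, 20.0])     # 25.0 is the suspect value
        anom = temps - temps.mean()                          # crude background removal
        print(np.abs(anom - loo_analysis(coords, anom)) > 2.0)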

  6. Machine Learning Approach to Automated Quality Identification of Human Induced Pluripotent Stem Cell Colony Images.

    PubMed

    Joutsijoki, Henry; Haponen, Markus; Rasku, Jyrki; Aalto-Setälä, Katriina; Juhola, Martti

    2016-01-01

    The focus of this research is on the automated identification of the quality of human induced pluripotent stem cell (iPSC) colony images. iPS cell technology is a contemporary method by which the patient's cells are reprogrammed back to stem cells and can be differentiated into any cell type wanted. iPS cell technology will in future be used for patient-specific drug screening, disease modeling, and tissue repair, for instance. However, there are technical challenges before iPS cell technology can be used in practice, and one of them is the quality control of growing iPSC colonies, which is currently done manually but is an unfeasible solution in large-scale cultures. The monitoring problem reduces to an image analysis and classification problem. In this paper, we tackle this problem using machine learning methods such as multiclass Support Vector Machines and several baseline methods together with Scale-Invariant Feature Transform (SIFT) based features. We perform over 80 test arrangements and do a thorough parameter value search. The best accuracy (62.4%) for classification was obtained by using a k-NN classifier, showing improved accuracy compared to earlier studies.
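
    The classification setup described, a multiclass SVM against simpler baselines over image-derived feature vectors, can be sketched in a few lines of scikit-learn. The feature matrix below is random stand-in data rather than SIFT features from real iPSC colony images, so the printed accuracies are meaningless; only the pipeline shape is the point.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 128))      # stand-in for SIFT-based feature vectors
        y = rng.integers(0, 3, size=300)     # hypothetical quality classes of colonies

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        for name, clf in [("multiclass SVM", SVC(kernel="rbf", C=1.0)),
                          ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
            acc = clf.fit(X_tr, y_tr).score(X_te, y_te)
            print(f"{name}: accuracy = {acc:.3f}")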

  7. Home telemonitoring of vital signs--technical challenges and future directions.

    PubMed

    Celler, Branko G; Sparks, Ross S

    2015-01-01

    The telemonitoring of vital signs from the home is an essential element of telehealth services for the management of patients with chronic conditions, such as congestive heart failure (CHF), chronic obstructive pulmonary disease (COPD), diabetes, or poorly controlled hypertension. Telehealth is now being deployed widely in both rural and urban settings, and in this paper, we discuss the contribution made by biomedical instrumentation, user interfaces, and automated risk stratification algorithms in developing a clinical diagnostic quality longitudinal health record at home. We identify technical challenges in the acquisition of high-quality biometric signals from unsupervised patients at home, identify new technical solutions and user interfaces, and propose new measurement modalities and signal processing techniques for increasing the quality and value of vital signs monitoring at home. We also discuss use of vital signs data for the automated risk stratification of patients, so that clinical resources can be targeted to those most at risk of unscheduled admission to hospital. New research is also proposed to integrate primary care, hospital, personal genomic, and telehealth electronic health records, and apply predictive analytics and data mining for enhancing clinical decision support.

  8. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    NASA Astrophysics Data System (ADS)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying

    2013-07-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are used for calculating dosimetry parameters; and XML is applied to improve the clarity of the index definitions. The accuracy and efficiency of the program were evaluated by comparing its results with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. Judged against the index criteria, the differences between the two methods are minimal. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.
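
    Keeping the index definitions in XML, as the note describes, makes the evaluation data-driven: the dosimetry code stays fixed while the criteria file changes per trial. A toy sketch of that pattern follows; the element names, index names and limits are invented for illustration, not RTOG criteria.

        import xml.etree.ElementTree as ET

        CRITERIA_XML = """
        <indices>
          <index name="PTV_V95" operator="ge" limit="0.95"/>
          <index name="Cord_Dmax_Gy" operator="le" limit="45.0"/>
        </indices>
        """

        def evaluate(dosimetry, criteria_xml=CRITERIA_XML):
            """Compare computed dosimetry parameters against XML-defined limits."""
            ops = {"ge": lambda v, lim: v >= lim, "le": lambda v, lim: v <= lim}
            for index in ET.fromstring(criteria_xml):
                name, op = index.get("name"), index.get("operator")
                limit = float(index.get("limit"))
                status = "pass" if ops[op](dosimetry[name], limit) else "FAIL"
                print(f"{name}: {dosimetry[name]} ({op} {limit}) -> {status}")

        evaluate({"PTV_V95": 0.97, "Cord_Dmax_Gy": 46.2})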

  9. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    PubMed

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. DMAIC was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) prospective automated scheduling of a clinic visit at 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and the estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  10. Test/score/report: Simulation techniques for automating the test process

    NASA Technical Reports Server (NTRS)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeatable execution of specific test procedures of a ground data system become simpler tasks. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, pressures of schedules, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational it should greatly reduce the time necessary to test a POCC/MOC software delivery, as well as improve the quality of the test process.

  11. Web Service for Positional Quality Assessment: the WPS Tier

    NASA Astrophysics Data System (ADS)

    Xavier, E. M. A.; Ariza-López, F. J.; Ureña-Cámara, M. A.

    2015-08-01

    In the field of spatial data, more and more information becomes available every day, yet we still have very little information about its quality. We consider the automation of spatial data quality assessment a true need for the geomatics sector, and that automation is possible by means of web processing services (WPS) and the application of specific assessment procedures. In this paper we propose and develop a WPS tier centered on the automation of positional quality assessment. An experiment using the NSSDA positional accuracy method is presented. The experiment involves the uploading by the client of two datasets (reference and evaluation data). The processing determines homologous pairs of points (by distance) and calculates the value of positional accuracy under the NSSDA standard. The process generates a small report that is sent to the client. From our experiment, we reached some conclusions on the advantages and disadvantages of WPSs when applied to the automation of spatial data accuracy assessments.
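
    For reference, the NSSDA horizontal statistic is a scaled root-mean-square error over the matched point pairs: accuracy at 95% confidence = 1.7308 × RMSEr when the x and y error components are comparable. The sketch below shows the computation such a service performs, with nearest-distance matching; the matching tolerance and the coordinates are made up.

        import numpy as np

        def nssda_horizontal(eval_pts, ref_pts, match_tol=5.0):
            """Match homologous points by nearest distance and return NSSDA accuracy."""
            d = np.linalg.norm(eval_pts[:, None, :] - ref_pts[None, :, :], axis=-1)
            nearest = d.argmin(axis=1)
            matched = d[np.arange(len(eval_pts)), nearest] <= match_tol
            errs = eval_pts[matched] - ref_pts[nearest[matched]]
            rmse_r = np.sqrt(np.mean(np.sum(errs ** 2, axis=1)))
            return 1.7308 * rmse_r           # horizontal accuracy at 95% confidence

        evaluation = np.array([[100.8, 201.1], [150.2, 249.6], [300.5, 399.9]])
        reference = np.array([[100.0, 200.0], [150.0, 250.0], [300.0, 400.0]])
        print(f"NSSDA horizontal accuracy: {nssda_horizontal(evaluation, reference):.2f} m")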

  12. Methods for Evaluating the Performance and Human Stress-Factors of Percussive Riveting

    NASA Astrophysics Data System (ADS)

    Ahn, Jonathan Y.

    The aerospace industry automates portions of its manufacturing and assembly processes. However, mechanics remain vital to production, especially in areas where automated machines cannot fit or have yet to match the quality of human craftsmanship. One such task is percussive riveting. Because percussive riveting is associated with a high risk of injury, these tools must be certified prior to release. The major contribution of this thesis is to develop a test bench capable of percussive riveting for ergonomic evaluation purposes. The major issues investigated are: (i) automating the tool evaluation method so that it is repeatable; (ii) demonstrating the use of displacement and force sensors; and (iii) correlating the performance and risk exposure of percussive tools. A test bench equipped with servomotors and pneumatic cylinders to control the xyz-position of a rivet gun and bucking bar simultaneously is used to explore this evaluation approach.

  13. Attenuation-based automatic kilovolt (kV)-selection in computed tomography of the chest: effects on radiation exposure and image quality.

    PubMed

    Eller, Achim; Wuest, Wolfgang; Scharf, Michael; Brand, Michael; Achenbach, Stephan; Uder, Michael; Lell, Michael M

    2013-12-01

    To evaluate automated attenuation-based kV-selection in computed tomography of the chest with respect to radiation dose and image quality, compared to a standard 120 kV protocol. 104 patients were examined using a 128-slice scanner. Fifty examinations (58 ± 15 years, study group) were performed using automated adaptation of tube potential (100-140 kV) based on the attenuation profile of the scout scan, and 54 examinations (62 ± 14 years, control group) with fixed 120 kV. The estimated CT dose index (CTDI) of the software-proposed setting was compared with the 120 kV protocol. After the scan, CTDI volume (CTDIvol) and dose length product (DLP) were recorded. Image quality was assessed by region of interest (ROI) measurements, and subjective image quality by two observers with a 4-point scale (3--excellent, 0--not diagnostic). The algorithm selected 100 kV in 78% and 120 kV in 22% of cases. Overall CTDIvol reduction was 26.6% (34% in the 100 kV scans); overall DLP reduction was 22.8% (32.1% in the 100 kV scans) (all p<0.001). Subjective image quality was excellent in both groups. The attenuation-based kV-selection algorithm enables a relevant dose reduction (~27%) in chest CT while keeping image quality parameters at high levels. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. 76 FR 81986 - Honeywell International, Inc., Automation and Control Solutions Division, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ..., Inc., Automation and Control Solutions Division, Including On-Site Leased Workers From Manpower... International, Inc., Automation and Control Solutions Division, Rock Island, Illinois. The notice was published...., Automation and Control Solutions Division. The Department has determined that these workers were sufficiently...

  15. 75 FR 77664 - Honeywell International, Inc., Automation and Control Solutions Division, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-13

    ..., Inc., Automation and Control Solutions Division, Including On-Site Leased Workers From Manpower...., Automation and Control Solutions Division, Rock Island, Illinois. The notice was published in the Federal...-site at the Rock Island, Illinois location of Honeywell International, Inc., Automation and Control...

  16. Is take-over time all that matters? The impact of visual-cognitive load on driver take-over quality after conditionally automated driving.

    PubMed

    Zeeb, Kathrin; Buchner, Axel; Schrauf, Michael

    2016-07-01

    Currently, development of conditionally automated driving systems which control both lateral and longitudinal vehicle guidance is attracting a great deal of attention. The driver no longer needs to constantly monitor the roadway, but must still be able to resume vehicle control if necessary. The relaxed attention requirement might encourage engagement in non-driving related secondary tasks, and the resulting effect on driver take-over is unclear. The aim of this study was to examine how engagement in three different naturalistic secondary tasks (writing an email, reading a news text, watching a video clip) impacted take-over performance. A driving simulator study was conducted and data from a total of 79 participants (mean age 40 years, 35 females) were used to examine response times and take-over quality. Drivers had to resume vehicle control in four different non-critical scenarios while engaging in secondary tasks. A control group did not perform any secondary tasks. There was no influence of the drivers' engagement in secondary tasks on the time required to return their hands to the steering wheel, and there seemed to be little if any influence on the time the drivers needed to intervene in vehicle control. Take-over quality, however, deteriorated for distracted drivers, with drivers reading a news text and drivers watching a video deviating on average approximately 8-9 cm more from the lane center. These findings seem to indicate that establishing motor readiness may be carried out almost reflexively, but cognitive processing of the situation is impaired by driver distraction. This, in turn, appears to determine take-over quality. The present findings emphasize the importance of considering both response times and take-over quality for a comprehensive understanding of the factors that influence driver take-over. Furthermore, a training effect in response times was found to be moderated by the drivers' prior experience with driver assistance systems. This shows that, besides driver distraction, driver-related factors influencing take-over performance exist. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. ProSens: integrated production control by automated inspection planning and efficient multisensor metrology

    NASA Astrophysics Data System (ADS)

    Glaser, Ulf; Li, Zhichao; Bichmann, Stephan, II; Pfeifer, Tilo

    2003-05-01

    With China's entry into the WTO, Chinese as well as German companies face the question of how to minimize the risk of unfamiliar cooperation partners when developing products. Rising customer demands concerning quality, product diversity and cost reduction require flexibility and efficiency together with reliable component suppliers. In order to build and strengthen Sino-German cooperation, manufacturing control using homogenized and efficient measures to assure high quality is of vital importance. A lack of unification may cause identical measurements conducted at subcontractors or customers to be carried out with different measurement processes, which leads to incomparable results. Rapidly growing company cooperation and a simultaneously decreasing manufacturing scope cause substantial difficulties when coordinating joint quality control activities. "ProSens," a Sino-German project consortium consisting of industrial users, technology producers and research institutes, aims at improving selected production processes by: creating a homogeneous quality awareness in Sino-German cooperations; sensitizing for process-accompanying metrology at an early stage of product development; increasing process performance by the use of integrated metrology; reducing production time and cost; and unifying quality control of complex products by means of efficient measurement strategies and CAD-based inspection planning.

  18. Medical ADP Systems: Automated Medical Records Hold Promise to Improve Patient Care

    DTIC Science & Technology

    1991-01-01

    automated medical records. The report discusses the potential benefits that automation could make to the quality of patient care and the factors that impede...information systems, but no organization has fully automated one of the most critical types of information, patient medical records. The patient medical record...its review of automated medical records. GAO’s objectives in this study were to identify the (1) benefits of automating patient records and (2) factors

  19. Elements of EAF automation processes

    NASA Astrophysics Data System (ADS)

    Ioana, A.; Constantin, N.; Dragna, E. C.

    2017-01-01

    Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the scheme of the electrical EAF automation system and the scheme of the thermal EAF automation system. Applying these automation schemes results in: a significant reduction in the specific consumption of electrical energy of the Electric Arc Furnace, increased productivity of the Electric Arc Furnace, improved quality of the steel produced, and increased durability of the structural elements of the Electric Arc Furnace.

  20. Tools for quality control of fingerprint databases

    NASA Astrophysics Data System (ADS)

    Swann, B. Scott; Libert, John M.; Lepley, Margaret A.

    2010-04-01

    Integrity of fingerprint data is essential to biometric and forensic applications. Accordingly, the FBI's Criminal Justice Information Services (CJIS) Division has sponsored development of software tools to facilitate quality control functions relative to maintaining its fingerprint data assets inherent to the Integrated Automated Fingerprint Identification System (IAFIS) and Next Generation Identification (NGI). This paper provides an introduction to two such tools. The first FBI-sponsored tool was developed by the National Institute of Standards and Technology (NIST) and examines and detects the spectral signature of the ridge-flow structure characteristic of friction ridge skin. The Spectral Image Validation/Verification (SIVV) utility differentiates fingerprints from non-fingerprints, including blank frames or segmentation failures erroneously included in data; provides a "first look" at image quality; and can identify anomalies in the sample rates of scanned images. The SIVV utility might detect errors in individual 10-print fingerprints inaccurately segmented from the flat, multi-finger image acquired by one of the automated collection systems increasing in availability and usage. In such cases, the lost fingerprint can be recovered by re-segmentation from the now compressed multi-finger image record. The second FBI-sponsored tool, CropCoeff, was developed by MITRE and thoroughly tested by NIST. CropCoeff enables cropping of the replacement single print directly from the compressed data file, thus avoiding decompression and recompression of images that might degrade fingerprint features necessary for matching.
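
    The spectral check underlying SIVV rests on the fact that friction ridge skin produces a distinctive annular peak in an image's power spectrum around the typical ridge frequency, which blank frames and most non-fingerprint content lack. The following is a bare-bones illustration of that signature, not the SIVV algorithm itself; the band limits are illustrative.

        import numpy as np

        def ridge_band_fraction(img, lo=0.05, hi=0.20):
            """Fraction of (DC-excluded) spectral power in a ridge-frequency band.

            lo/hi are normalized spatial frequencies in cycles/pixel; at 500 ppi
            the ridge structure falls roughly in this band (illustrative limits).
            """
            f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
            power = np.abs(f) ** 2
            ny, nx = img.shape
            fy = np.fft.fftshift(np.fft.fftfreq(ny))
            fx = np.fft.fftshift(np.fft.fftfreq(nx))
            yy, xx = np.meshgrid(fy, fx, indexing="ij")
            r = np.hypot(yy, xx)
            return power[(r >= lo) & (r <= hi)].sum() / power[r > 0].sum()

        # A ridge-like pattern scores near 1; white noise stays near the annulus area:
        x = np.arange(256)
        ridges = np.sin(2 * np.pi * 0.1 * x)[None, :] * np.ones((256, 1))
        print(ridge_band_fraction(ridges), ridge_band_fraction(np.random.rand(256, 256)))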

  1. Modified SPC for short run test and measurement process in multi-stations

    NASA Astrophysics Data System (ADS)

    Koh, C. K.; Chin, J. F.; Kamaruddin, S.

    2018-03-01

    Due to short production runs and the measurement error inherent in electronic test and measurement (T&M) processes, continuous quality monitoring through real-time statistical process control (SPC) is challenging. Industry practice allows the installation of a guard band using measurement uncertainty to reduce the width of the acceptance limit, as an indirect way to compensate for measurement errors. This paper presents a new SPC model combining a modified guard band and control charts (Z-bar chart and W chart) for short runs in T&M processes in multiple stations. The proposed model standardizes the observed value with the measurement target (T) and the rationed measurement uncertainty (U). An S-factor (Sf) is introduced into the control limits to improve sensitivity in detecting small shifts. The model was embedded in an automated quality control system and verified with a case study in real industry.
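
    A minimal sketch of the charting idea follows: each observation is standardized by the measurement target and uncertainty, and the control limits are tightened by an S-factor. The standardization Z = (x - T)/U and the multiplicative role of the S-factor are assumptions for illustration; the paper's exact formulas may differ.

        def zbar_points(measurements, target, uncertainty):
            """Standardize observations: Z = (x - T) / U."""
            return [(x - target) / uncertainty for x in measurements]

        def out_of_control(z_points, s_factor=0.8, base_limit=3.0):
            """Flag points beyond S-factor-tightened control limits."""
            limit = s_factor * base_limit    # tighter limits -> more sensitive to small shifts
            return [abs(z) > limit for z in z_points]

        # Hypothetical T&M readings against a target of 10.00 with U = 0.05:
        z = zbar_points([10.02, 9.97, 10.01, 10.21, 10.03], target=10.00, uncertainty=0.05)
        print(out_of_control(z))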

  2. First CLIPS Conference Proceedings, volume 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The proceedings of the First Conference on the C Language Integrated Production System (CLIPS), hosted by the NASA Lyndon B. Johnson Space Center in August 1990, are presented. Articles included engineering applications, intelligent tutors and training, intelligent software engineering, automated knowledge acquisition, network applications, verification and validation, enhancements to CLIPS, space shuttle quality control/diagnosis applications, space shuttle and real-time applications, and medical, biological, and agricultural applications.

  3. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
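
    The interpretation task such a system automates amounts to scanning the chart for out-of-control signals. As a stand-in for the kinds of pattern checks the expert system encodes, here is a small sketch of two classic Western Electric-style rules (a single point beyond 3 sigma; a run of eight points on one side of the center line); the data are invented.

        def chart_signals(values, center, sigma):
            """Return indices flagged by two classic control-chart rules."""
            beyond_3sigma = [i for i, v in enumerate(values) if abs(v - center) > 3 * sigma]
            side = [0 if v == center else (1 if v > center else -1) for v in values]
            runs = []
            for i in range(len(values) - 7):
                window = side[i:i + 8]
                if all(s == 1 for s in window) or all(s == -1 for s in window):
                    runs.append(i + 7)       # flag the point completing the run
            return beyond_3sigma, runs

        data = [0.1, -0.2, 3.4, 0.3, 0.5, 0.2, 0.4, 0.6, 0.1, 0.2, 0.3, 0.4]
        print(chart_signals(data, center=0.0, sigma=1.0))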

  4. Choice of mathematical models for technological process of glass rod drawing

    NASA Astrophysics Data System (ADS)

    Alekseeva, L. B.

    2017-10-01

    The technological process of drawing glass rods (light guides) is considered. Automated control of the drawing process reduces to a process of decision-making to ensure a given quality. The drawing process is treated as a control system comprising the drawing device (the controlling device) and the optical fiber forming zone (the controlled object). To study the processes occurring in the forming zone, mathematical models based on the fundamentals of continuum mechanics are proposed. To assess the influence of disturbances, a transfer function is derived from the wave equation. The regression equation obtained also adequately describes the drawing process.

  5. Measuring, managing and maximizing refinery performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bascur, O.A.; Kennedy, J.P.

    1996-01-01

    Implementing continuous quality improvement is a confluence of total quality management, people empowerment, performance indicators and information engineering. Supporting information technologies allow a refiner to narrow the gap between management objectives and the process control level. Dynamic performance monitoring benefits come from production cost savings, improved communications and enhanced decision making. A refinery workgroup information flow model helps automate continuous improvement of processes, performance and the organization. The paper discusses the rethinking of refinery operations, dynamic performance monitoring, continuous process improvement, the knowledge coordinator and repository manager, an integrated plant operations workflow, and successful implementation.

  6. Novel, simple and fast automated synthesis of 18F-choline in a single Synthera module

    NASA Astrophysics Data System (ADS)

    Litman, Y.; Pace, P.; Silva, L.; Hormigo, C.; Caro, R.; Gutierrez, H.; Bastianello, M.; Casale, G.

    2012-12-01

    The aim of this work is to develop a method to produce 18F-Fluorocholine in a single Synthera module with high yield, quality and reproducibility. We give special importance to the details of the drying and distillation procedures. After 5 syntheses we report a decay corrected yield of (27 ± 2) % (mean ± S.D.). The radiochemical purity was > 95%, and the other quality control parameters were within the specifications. Product 18F-fluorocholine was administrated to 17 humans with no observed side-effects.

  7. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently, software engineers are the essential integrating factor tying several components together. The components consist of processes, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools rather than the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to address these is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.

  8. Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures.

    PubMed

    Kerr, Eve A; Smith, Dylan M; Hogan, Mary M; Krein, Sarah L; Pogach, Leonard; Hofer, Timothy P; Hayward, Rodney A

    2002-10-01

    Little is known about the relative reliability of medical record and clinical automated data, sources commonly used to assess diabetes quality of care. The agreement between diabetes quality measures constructed from clinical automated versus medical record data sources was compared, and the performance of hybrid measures derived from a combination of the two data sources was examined. Medical records were abstracted for 1,032 patients with diabetes who received care from 21 facilities in 4 Veterans Integrated Service Networks. Automated data were obtained from a central Veterans Health Administration diabetes registry containing information on laboratory tests and medication use. Success rates were higher for process measures derived from medical record data than from automated data, but no substantial differences among data sources were found for the intermediate outcome measures. Agreement for measures derived from the medical record compared with automated data was moderate for process measures but high for intermediate outcome measures. Hybrid measures yielded success rates similar to those of medical record-based measures but would have required about 50% fewer chart reviews. Agreement between medical record and automated data was generally high. Yet even in an integrated health care system with sophisticated information technology, automated data tended to underestimate the success rate in technical process measures for diabetes care and yielded different quartile performance rankings for facilities. Applying hybrid methodology yielded results consistent with the medical record but required less data to come from medical record reviews.
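
    The hybrid approach the authors describe can be read as a simple fall-back rule: accept the automated determination when it indicates success, and send only the remainder to chart review, which is why roughly half the chart abstractions are avoided. A schematic sketch, with invented field names:

        def hybrid_measure(patients):
            """Score a process measure from automated data, falling back to charts.

            Each record has 'automated_pass' (bool) and, when a chart was
            abstracted, 'chart_pass' (bool). Returns the success rate and the
            number of chart reviews the hybrid approach required.
            """
            passes, reviews = 0, 0
            for p in patients:
                if p["automated_pass"]:      # automated data already shows success
                    passes += 1
                else:                        # otherwise consult the medical record
                    reviews += 1
                    passes += p["chart_pass"]
            return passes / len(patients), reviews

        cohort = [{"automated_pass": True},
                  {"automated_pass": False, "chart_pass": True},
                  {"automated_pass": False, "chart_pass": False},
                  {"automated_pass": True}]
        print(hybrid_measure(cohort))        # (0.75, 2): only half the charts reviewed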

  9. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm

    PubMed Central

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, which was enhanced into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. Moreover, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both necessary characteristics for clinical use. PMID:26930204

  10. Colometer: a real-time quality feedback system for screening colonoscopy.

    PubMed

    Filip, Dobromir; Gao, Xuexin; Angulo-Rodríguez, Leticia; Mintchev, Martin P; Devlin, Shane M; Rostom, Alaa; Rosen, Wayne; Andrews, Christopher N

    2012-08-28

    To investigate the performance of a new software-based colonoscopy quality assessment system. The software-based system employs a novel image processing algorithm which detects the levels of image clarity, withdrawal velocity, and bowel preparation in real time from the live video signal. Threshold levels of image blurriness and withdrawal velocity below which the visualization could be considered adequate were initially determined, somewhat arbitrarily, by review of sample colonoscopy videos by two experienced endoscopists. Subsequently, an overall colonoscopy quality rating was computed based on the percentage of the withdrawal time with adequate visualization (scored 1-5; 1, when the percentage was 1%-20%; 2, when the percentage was 21%-40%, etc.). In order to test the proposed velocity and blurriness thresholds, screening colonoscopy withdrawal videos from a specialized ambulatory colon cancer screening center were collected, automatically processed and rated. Quality ratings on the withdrawal were compared to the insertion in the same patients. Then, 3 experienced endoscopists reviewed the collected videos in a blinded fashion and rated the overall quality of each withdrawal (scored 1-5; 1, poor; 3, average; 5, excellent) based on 3 major aspects: image quality, colon preparation, and withdrawal velocity. The automated quality ratings were compared to the averaged endoscopist quality ratings using the Spearman correlation coefficient. Fourteen screening colonoscopies were assessed. Adenomatous polyps were detected in 4/14 (29%) of the collected colonoscopy video samples. As a proof of concept, the Colometer software rated colonoscope withdrawal as having better visualization than the insertion in the 10 videos which did not have any polyps (average percent time with adequate visualization: 79% ± 5% for withdrawal and 50% ± 14% for insertion, P < 0.01). Withdrawal times during which no polyps were removed ranged from 4-12 min. The median quality rating from the automated system and the reviewers was 3.45 [interquartile range (IQR), 3.1-3.68] and 3.00 (IQR, 2.33-3.67), respectively, for all colonoscopy video samples. The automated rating revealed a strong correlation with the reviewers' rating (ρ = 0.65, P = 0.01). There was good correlation between the automated overall quality rating and the mean endoscopist withdrawal speed rating (Spearman r = 0.59, P = 0.03). There was no correlation of the automated overall quality rating with the mean endoscopist image quality rating (Spearman r = 0.41, P = 0.15). The results from a novel automated real-time colonoscopy quality feedback system strongly agreed with the endoscopists' quality assessments. Further study is required to validate this approach.
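
    The overall rating is a straightforward binning of the fraction of withdrawal time with adequate visualization. A sketch of the per-frame aggregation follows; the blur and velocity thresholds are placeholders, not the calibrated values the endoscopists chose.

        import math

        def withdrawal_quality_score(frames, blur_max=0.6, velocity_max=3.0):
            """Score a withdrawal 1-5 from per-frame (blur, velocity) pairs.

            A frame is 'adequate' when sharp enough and slow enough; the score
            is the 20%-wide bin of the adequate fraction (1: <=20%, ..., 5: 81-100%).
            """
            adequate = sum(1 for blur, vel in frames
                           if blur <= blur_max and vel <= velocity_max)
            return max(1, math.ceil(adequate / len(frames) * 5))

        # Hypothetical (blur, velocity) samples for a short withdrawal segment:
        frames = [(0.2, 1.0), (0.4, 2.5), (0.8, 1.0), (0.3, 4.2), (0.1, 0.5)]
        print(withdrawal_quality_score(frames))   # 3 of 5 frames adequate -> score 3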

  11. PRECOG: a tool for automated extraction and visualization of fitness components in microbial growth phenomics.

    PubMed

    Fernandez-Ricaud, Luciano; Kourtchenko, Olga; Zackrisson, Martin; Warringer, Jonas; Blomberg, Anders

    2016-06-23

    Phenomics is a field in functional genomics that records variation in organismal phenotypes in the genetic, epigenetic or environmental context at a massive scale. For microbes, the key phenotype is the growth in population size because it contains information that is directly linked to fitness. Due to technical innovations and extensive automation our capacity to record complex and dynamic microbial growth data is rapidly outpacing our capacity to dissect and visualize this data and extract the fitness components it contains, hampering progress in all fields of microbiology. To automate visualization, analysis and exploration of complex and highly resolved microbial growth data as well as standardized extraction of the fitness components it contains, we developed the software PRECOG (PREsentation and Characterization Of Growth-data). PRECOG allows the user to quality control, interact with and evaluate microbial growth data with ease, speed and accuracy, also in cases of non-standard growth dynamics. Quality indices filter high- from low-quality growth experiments, reducing false positives. The pre-processing filters in PRECOG are computationally inexpensive and yet functionally comparable to more complex neural network procedures. We provide examples where data calibration, project design and feature extraction methodologies have a clear impact on the estimated growth traits, emphasising the need for proper standardization in data analysis. PRECOG is a tool that streamlines growth data pre-processing, phenotypic trait extraction, visualization, distribution and the creation of vast and informative phenomics databases.

  12. Modeling human response errors in synthetic flight simulator domain

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.

  13. Automated single-slide staining device. [in clinical bacteriology

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.; Mills, S. M.

    1975-01-01

    An automatic single-slide Gram staining device is described. A timer-actuated solenoid controls the dispensing of gentian violet, Gram iodine solution, decolorizer, and 1% aqueous safranin in proper sequence and for the time required for optimum staining. The amount of stain or reagent delivered is controlled by means of stopcocks below each solenoid. Used stains and reagents can be flushed automatically or manually. Smears Gram stained automatically are equal in quality to those prepared manually. The time to complete one Gram cycle is 4.80 min.

  14. Specificity of Good Manufacturing Practice (GMP) for Biomedical Cell Products.

    PubMed

    Tulina, M A; Pyatigorskaya, N V

    2018-03-01

    The article describes special aspects of Good Manufacturing Practice (GMP) for biomedical cell products (BMCP), which imply high standards of asepsis throughout the entire production process; strict requirements on donors and on the procedure of biomaterial isolation; guaranteed traceability of BMCP; definition of processing procedures which allow BMCP to be identified as minimally manipulated; and continuous quality control with automation of the control process at all stages of manufacturing, which will ensure product release simultaneously with the completion of technological operations.

  15. In vivo automated quantification of quality of apples during storage using optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Srivastava, Vishal; Dalal, Devjyoti; Kumar, Anuj; Prakash, Surya; Dalal, Krishna

    2018-06-01

    Moisture content is an important feature of fruits and vegetables. As 80% of an apple's content is water, decreasing moisture content degrades the quality of apples (Golden Delicious). Computational and texture features of the apples were extracted from optical coherence tomography (OCT) images. A support vector machine with a Gaussian kernel was used to perform automated classification. Our proposed method opens up the possibility of fully automated, quantitative, in vivo evaluation of the quality of wax-coated apples during storage, based on the morphological features of the apples. Our results demonstrate that the analysis of computational and texture features of OCT images may be a good non-destructive method for assessing the quality of apples.

  16. Automation in future air traffic management: effects of decision aid reliability on controller performance and mental workload.

    PubMed

    Metzger, Ulla; Parasuraman, Raja

    2005-01-01

    Future air traffic management concepts envisage shared decision-making responsibilities between controllers and pilots, necessitating that controllers be supported by automated decision aids. Even as automation tools are being introduced, however, their impact on the air traffic controller is not well understood. The present experiments examined the effects of an aircraft-to-aircraft conflict decision aid on performance and mental workload of experienced, full-performance level controllers in a simulated Free Flight environment. Performance was examined with both reliable (Experiment 1) and inaccurate automation (Experiment 2). The aid improved controller performance and reduced mental workload when it functioned reliably. However, detection of a particular conflict was better under manual conditions than under automated conditions when the automation was imperfect. Potential or actual applications of the results include the design of automation and procedures for future air traffic control systems.

  17. Design And Implementation Of Integrated Vision-Based Robotic Workcells

    NASA Astrophysics Data System (ADS)

    Chen, Michael J.

    1985-01-01

    Reports have been sparse on large-scale, intelligent integration of complete robotic systems for automating the microelectronics industry. This paper describes the application of state-of-the-art computer-vision technology for manufacturing of miniaturized electronic components. The concepts of FMS - Flexible Manufacturing Systems, work cells, and work stations and their control hierarchy are illustrated in this paper. Several computer-controlled work cells used in the production of thin-film magnetic heads are described. These cells use vision for in-process control of head-fixture alignment and real-time inspection of production parameters. The vision sensor and other optoelectronic sensors, coupled with transport mechanisms such as steppers, x-y-z tables, and robots, have created complete sensorimotor systems. These systems greatly increase the manufacturing throughput as well as the quality of the final product. This paper uses these automated work cells as examples to exemplify the underlying design philosophy and principles in the fabrication of vision-based robotic systems.

  18. AVR Microcontroller-based automated technique for analysis of DC motors

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' using the AVR-series ATMega32 microcontroller. This card can be interfaced to a PC and calculates parameters such as motor losses and efficiency, and plots characteristics for dc motors. At present, different tests and methods are available to evaluate motor parameters, but a single, universal, user-friendly automated set-up is discussed in this paper. It has been accomplished by designing data acquisition and SCR bridge-firing hardware based on the AVR ATMega32 microcontroller. This hardware has the capability to drive phase-controlled rectifiers and acquire real-time values of current, voltage, temperature and speed of the motor. The analyses feasible with the designed hardware are of immense importance to dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis and also for motor control applications.
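
    On the PC side, the parameter calculations such a card enables are simple once voltage, current, speed and torque samples arrive over the interface. A sketch of the loss and efficiency arithmetic follows; the operating-point values are invented, and torque measurement is an assumed capability here.

        import math

        def motor_analysis(voltage_V, current_A, speed_rpm, torque_Nm):
            """Compute input power, output power, losses and efficiency of a dc motor."""
            p_in = voltage_V * current_A              # electrical input power [W]
            omega = speed_rpm * 2 * math.pi / 60      # shaft speed [rad/s]
            p_out = torque_Nm * omega                 # mechanical output power [W]
            return {"P_in_W": p_in,
                    "P_out_W": p_out,
                    "losses_W": p_in - p_out,         # copper, iron and friction, lumped
                    "efficiency_%": 100 * p_out / p_in}

        # Hypothetical operating point sampled by the acquisition hardware:
        print(motor_analysis(voltage_V=220.0, current_A=8.5, speed_rpm=1450, torque_Nm=10.2))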

  19. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimized production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. When this technology is applied to the gamma radiation process, control will be based on monitoring key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  20. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost effective software. Therefore, government and industry

  1. The standard calibration instrument automation system for the atomic absorption spectrophotometer. Part 3: Program documentation

    NASA Astrophysics Data System (ADS)

    Ryan, D. P.; Roth, G. S.

    1982-04-01

    Complete documentation of the 15 programs and 11 data files of the EPA Atomic Absorption Instrument Automation System is presented. The system incorporates the following major features: (1) multipoint calibration using first, second, or third degree regression or linear interpolation, (2) timely quality control assessments for spiked samples, duplicates, laboratory control standards, reagent blanks, and instrument check standards, (3) reagent blank subtraction, and (4) plotting of calibration curves and raw data peaks. The programs of this system are written in Data General Extended BASIC, Revision 4.3, as enhanced for multi-user, real-time data acquisition. They run in a Data General Nova 840 minicomputer under the operating system RDOS, Revision 6.2. There is a functional description, a symbol definitions table, a functional flowchart, a program listing, and a symbol cross reference table for each program. The structure of every data file is also detailed.
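
    The calibration options described, reagent-blank subtraction followed by a polynomial fit of absorbance against concentration, map directly onto a few lines of numpy. The sketch below mirrors that logic in Python rather than the system's BASIC; the standard series is invented.

        import numpy as np

        def calibrate(conc_std, abs_std, abs_blank, degree=2):
            """Fit a calibration curve (degree 1-3) after reagent-blank subtraction."""
            coeffs = np.polyfit(conc_std, np.asarray(abs_std) - abs_blank, degree)
            return np.poly1d(coeffs)

        def concentration(curve, absorbance, abs_blank):
            """Invert the curve for one sample via its smallest non-negative real root."""
            roots = (curve - (absorbance - abs_blank)).roots
            real = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0]
            return min(real)

        # Hypothetical 4-point standard series fitted with a second-degree curve:
        curve = calibrate([0.5, 1.0, 2.0, 4.0], [0.052, 0.101, 0.198, 0.380],
                          abs_blank=0.002, degree=2)
        print(concentration(curve, absorbance=0.150, abs_blank=0.002))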

  2. Effects of Automation Types on Air Traffic Controller Situation Awareness and Performance

    NASA Technical Reports Server (NTRS)

    Sethumadhavan, A.

    2009-01-01

    The Joint Planning and Development Office has proposed the introduction of automated systems to help air traffic controllers handle the increasing volume of air traffic in the next two decades (JPDO, 2007). Because fully automated systems leave operators out of the decision-making loop (e.g., Billings, 1991), it is important to determine the right level and type of automation that will keep air traffic controllers in the loop. This study examined the differences in the situation awareness (SA) and collision detection performance of individuals when they worked with information acquisition, information analysis, decision and action selection and action implementation automation to control air traffic (Parasuraman, Sheridan, & Wickens, 2000). When the automation was unreliable, the time taken to detect an upcoming collision was significantly longer for all the automation types compared with the information acquisition automation. This poor performance following automation failure was mediated by SA, with lower SA yielding poor performance. Thus, the costs associated with automation failure are greater when automation is applied to higher order stages of information processing. Results have practical implications for automation design and development of SA training programs.

  3. CALL FOR PAPERS: 13th International Conference on Force and Mass Measurement

    NASA Astrophysics Data System (ADS)

    1992-01-01

    10-14 May 1993, Helsinki Fair Centre, Finland. Scope of the Conference: the Conference reports and reviews the state of the art and future trends in force and mass measurement in science and industry. Emphasis is on the applications of new methods, current problems in calibration and quality control, as well as on advances in new sensor technologies and industrial applications of force and mass measurement. Main themes and topics: 1. The state of the art and development trends in force and mass measurement: development and stability of high-level mass standards; mass comparators and force standard machines; new research topics in mass and force. 2. Calibration and quality control: calibration methods; estimation of uncertainties and classification of accuracies; relations between calibration, testing and quality control; requirements for quality control; verification of weighing instruments and their main devices. 3. Applications of force and mass measurements: automatic weighing; mass flow measurements; quality control in the process industry; sensor technologies; practical applications; special applications in industry, trade, etc. Deadline for submission of abstracts: 30 June 1992. For further information please contact: Finnish Society of Automation, Asemapäällikönkatu 12C, SF-00520 HELSINKI, Finland. Phone: Int. +3580 1461 644, Fax: Int. +3580 1461 650

  4. Intelligent robot trends and predictions for the .net future

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    2001-10-01

    An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent and future technical and economic trends. During the past twenty years the use of industrial robots that are equipped not only with precise motion control systems but also with sensors such as cameras, laser scanners, or tactile sensors that permit adaptation to a changing environment has increased dramatically. Intelligent robot products have been developed in many cases for factory automation and for some hospital and home applications. To reach an even wider range of applications, the addition of learning may be required. Recently, learning theories such as the adaptive critic have been proposed. In this type of learning, a critic provides a grade to the controller of an action module such as a robot. The adaptive critic is a good model for human learning. In general, the critic may be considered to be the human with the teach pendant, the plant manager, the line supervisor, the quality inspector or the consumer. If the ultimate critic is the consumer, then the quality inspector must model the consumer's decision-making process and use this model in the design and manufacturing operations. Can the adaptive critic be used to advance intelligent robots? Intelligent robots have historically taken decades to be developed and reduced to practice. Methods for speeding this development include technologies such as rapid prototyping and product development, and government, industry and university cooperation.

  5. Evaluation of the use of automatic exposure control and automatic tube potential selection in low-dose cerebrospinal fluid shunt head CT.

    PubMed

    Wallace, Adam N; Vyhmeister, Ross; Bagade, Swapnil; Chatterjee, Arindam; Hicks, Brandon; Ramirez-Giraldo, Juan Carlos; McKinstry, Robert C

    2015-06-01

    Cerebrospinal fluid shunts are primarily used for the treatment of hydrocephalus. Shunt complications may necessitate multiple non-contrast head CT scans resulting in potentially high levels of radiation dose starting at an early age. A new head CT protocol using automatic exposure control and automated tube potential selection has been implemented at our institution to reduce radiation exposure. The purpose of this study was to evaluate the reduction in radiation dose achieved by this protocol compared with a protocol with fixed parameters. A retrospective sample of 60 non-contrast head CT scans assessing for cerebrospinal fluid shunt malfunction was identified, 30 of which were performed with each protocol. The radiation doses of the two protocols were compared using the volume CT dose index and dose length product. The diagnostic acceptability and quality of each scan were evaluated by three independent readers. The new protocol lowered the average volume CT dose index from 15.2 to 9.2 mGy, representing a 39 % reduction (P < 0.01; 95 % CI 35-44 %), and lowered the dose length product from 259.5 to 151.2 mGy·cm, representing a 42 % reduction (P < 0.01; 95 % CI 34-50 %). The new protocol produced diagnostically acceptable scans with image quality comparable to the fixed parameter protocol. A pediatric shunt non-contrast head CT protocol using automatic exposure control and automated tube potential selection reduced patient radiation dose compared with a fixed parameter protocol while producing diagnostic images of comparable quality.

  6. Validation of a method for the determination of zolpidem in human plasma using LC with fluorescence detection.

    PubMed

    Ring, P R; Bostick, J M

    2000-04-01

    A sensitive and selective high-performance liquid chromatography (HPLC) method was developed for the determination of zolpidem in human plasma. Zolpidem and the internal standard (trazodone) were extracted from human plasma that had been made basic. The basic sample was loaded onto a conditioned Bond Elut C18 cartridge, rinsed with water and eluted with methanol. Forty microliters were then injected onto the LC system. Separation was achieved on a C18 column (150 x 4.6 mm, 5 microm) with a mobile phase composed of acetonitrile:50 mM potassium phosphate monobasic at pH 6.0 (4:6, v/v). Detection was by fluorescence, with excitation at 254 nm and emission at 400 nm. The retention times of zolpidem and the internal standard were approximately 4.7 and 5.3 min, respectively. The LC run time was 8 min. The assay was linear over the concentration range of 1-400 ng/ml for zolpidem in human plasma. The analysis of quality control samples for zolpidem (3, 30, and 300 ng/ml) demonstrated excellent precision, with relative standard deviations (RSD) of 3.7, 4.6, and 3.0%, respectively (n = 18). The method was accurate, with all intraday (n = 6) and overall (n = 18) mean concentrations within 5.8% of nominal at all quality control sample concentrations. This method was also performed using a Gilson Aspec XL automated sample processor and autoinjector. The samples were manually fortified with internal standard and made basic. The Aspec then performed the solid-phase extraction and made injections of the samples onto the LC system. Using the automated procedure for analysis, quality control samples for zolpidem (3, 30, and 300 ng/ml) demonstrated acceptable precision, with RSD values of 9.0, 4.9, and 5.1%, respectively (n = 12). The method was accurate, with all intracurve (n = 4) and overall (n = 12) mean values being less than 10.8% from nominal at all quality control sample concentrations.
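    The precision and accuracy figures quoted above (RSD and mean deviation from nominal) follow from standard formulas; a minimal Python sketch is given below, with invented replicate values standing in for the study's data.

        import statistics

        # RSD and mean bias from nominal for quality control pools.
        # Replicate values below are illustrative, not the study's data.
        qc_replicates = {3.0: [2.9, 3.1, 3.0, 2.8, 3.2, 3.0],
                         30.0: [31.2, 29.5, 28.7, 30.4, 29.9, 30.6],
                         300.0: [295.0, 305.0, 298.0, 302.0, 291.0, 307.0]}

        for nominal, values in qc_replicates.items():
            mean = statistics.mean(values)
            rsd = 100.0 * statistics.stdev(values) / mean
            bias = 100.0 * (mean - nominal) / nominal
            print(f"QC {nominal:6.1f} ng/ml: RSD = {rsd:4.1f}%, bias = {bias:+5.1f}%")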

  7. A comparison of adaptive and adaptable automation under different levels of environmental stress.

    PubMed

    Sauer, Juergen; Kao, Chung-Shan; Wastell, David

    2012-01-01

    The effectiveness of different forms of adaptive and adaptable automation was examined under low- and high-stress conditions, operationalized as different levels of noise. Thirty-six participants were assigned to one of three types of variable automation (adaptive event-based, adaptive performance-based, and adaptable, the last serving as a control condition). Participants received 3 h of training on a simulation of a highly automated process control task and were subsequently tested during a 4-h session under noise exposure and quiet conditions. The results for performance suggested no clear benefits of one automation control mode over the other two. However, it emerged that participants under adaptable automation adopted a more active system management strategy and reported higher levels of self-confidence than in the two adaptive control modes. Furthermore, the results showed higher levels of perceived workload, fatigue and anxiety for performance-based adaptive automation control than for the other two modes. This study compared two forms of adaptive automation (where the automated system flexibly allocates tasks between human and machine) with adaptable automation (where the human allocates the tasks). The adaptable mode showed marginal advantages. This is of relevance, given that this automation mode may also be easier to design.

  8. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at the ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques; in particular, the paper discusses the design features that foster the use of automated testing.

  9. User's manual for the National Water Information System of the U.S. Geological Survey: Automated Data Processing System (ADAPS)

    USGS Publications Warehouse

    ,

    2003-01-01

    The Automated Data Processing System (ADAPS) was developed for the processing, storage, and retrieval of water data, and is part of the National Water Information System (NWIS) developed by the U.S. Geological Survey. NWIS is a distributed water database in which data can be processed over a network of computers at U.S. Geological Survey offices throughout the United States. NWIS comprises four subsystems: ADAPS, the Ground-Water Site Inventory System (GWSI), the Water-Quality System (QWDATA), and the Site-Specific Water-Use Data System (SWUDS). This section of the NWIS User's Manual describes the automated data processing of continuously recorded water data, which primarily are surface-water data; however, the system also allows for the processing of water-quality and ground-water data. This manual describes various components and features of the ADAPS, and provides an overview of the data processing system and a description of the system framework. The components and features included are: (1) data collection and processing, (2) ADAPS menus and programs, (3) command line functions, (4) steps for processing station records, (5) postprocessor programs control files, (6) the standard format for transferring and entering unit and daily values, and (7) relational database (RDB) formats.
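    As an illustration of items (6)-(7), the RDB transfer format mentioned above is plain tab-delimited text; a minimal Python sketch of reading such a file is shown below, assuming the typical layout of '#' comment lines followed by a column-name row and a column-definition row. The sample content is invented.

        import csv
        import io

        # Illustrative RDB fragment: '#' comments, header row, definition row, data.
        sample = io.StringIO(
            "# illustrative RDB fragment\n"
            "datetime\tvalue\n"
            "19d\t12n\n"
            "2003-01-01\t4.20\n"
            "2003-01-02\t4.35\n"
        )

        rows = [r for r in csv.reader(sample, delimiter="\t")
                if r and not r[0].startswith("#")]
        header, _definitions, data = rows[0], rows[1], rows[2:]
        records = [dict(zip(header, row)) for row in data]
        print(records)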

  10. Robotic voltammetry with carbon nanotube-based sensors: a superb blend for convenient high-quality antimicrobial trace analysis.

    PubMed

    Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert

    2015-01-01

    A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification with electrocatalytic activity for analyte detection with the merits of a robotic electrochemical device that is capable of sequential nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV), achieving linear ranges of 1-10 μM and 2-100 μM for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples fortified with NFX. The simple instrumentation, convenient execution, and high effectiveness in analyte quantitation suggest the combination of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection as a promising methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories.
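    A minimal Python sketch of the underlying calibration step is shown below: a straight-line fit to DPV peak currents and a 3s/slope detection-limit estimate. The concentration-current pairs are synthetic, and the paper's exact limit-of-detection procedure is not specified here.

        import numpy as np

        conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])          # uM
        current = np.array([0.21, 0.40, 0.83, 1.19, 1.62, 2.01])  # uA, synthetic

        slope, intercept = np.polyfit(conc, current, 1)
        residuals = current - (slope * conc + intercept)
        s_res = residuals.std(ddof=2)       # ddof=2: two fitted parameters
        lod = 3.0 * s_res / slope           # 3s/slope detection-limit estimate

        print(f"slope={slope:.3f} uA/uM, intercept={intercept:.3f} uA, LOD~{lod:.2f} uM")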

  11. Fuzzy Control/Space Station automation

    NASA Technical Reports Server (NTRS)

    Gersh, Mark

    1990-01-01

    Viewgraphs on fuzzy control/space station automation are presented. Topics covered include: Space Station Freedom (SSF); SSF evolution; factors pointing to automation & robotics (A&R); astronaut office inputs concerning A&R; flight system automation and ground operations applications; transition definition program; and advanced automation software tools.

  12. Contour scanning of textile preforms using a light-section sensor for the automated manufacturing of fibre-reinforced plastics

    NASA Astrophysics Data System (ADS)

    Schmitt, R.; Niggemann, C.; Mersmann, C.

    2008-04-01

    Fibre-reinforced plastics (FRP) are particularly suitable for components where light-weight structures with advanced mechanical properties are required, e.g. for aerospace parts. Nevertheless, many manufacturing processes for FRP include manual production steps without an integrated quality control. A vital step in the process chain is the lay-up of the textile preform, as it greatly affects the geometry and the mechanical performance of the final part. In order to automate the FRP production, an inline machine vision system is needed for a closed-loop control of the preform lay-up. This work describes the development of a novel laser light-section sensor for optical inspection of textile preforms and its integration and validation in a machine vision prototype. The proposed method aims at the determination of the contour position of each textile layer through edge scanning. The scanning route is automatically derived by using texture analysis algorithms in a preliminary step. As sensor output a distinct stage profile is computed from the acquired greyscale image. The contour position is determined with sub-pixel accuracy using a novel algorithm based on a non-linear least-square fitting to a sigmoid function. The whole contour position is generated through data fusion of the measured edge points. The proposed method provides robust process automation for the FRP production improving the process quality and reducing the scrap quota. Hence, the range of economically feasible FRP products can be increased and new market segments with cost sensitive products can be addressed.
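    In the spirit of the sub-pixel algorithm described above, the following Python sketch fits a sigmoid to a grey-level profile across an edge and reads the edge position from the fitted centre parameter. The profile, model parametrisation and noise level are illustrative assumptions; the authors' exact model is not reproduced.

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(x, low, high, x0, k):
            """Step-like intensity model: x0 is the sub-pixel edge position."""
            return low + (high - low) / (1.0 + np.exp(-k * (x - x0)))

        pixels = np.arange(20, dtype=float)
        # Synthetic profile: dark background -> bright textile, edge near pixel 9.4
        profile = sigmoid(pixels, 30.0, 200.0, 9.4, 2.0) + np.random.normal(0, 2, 20)

        p0 = [profile.min(), profile.max(), pixels.mean(), 1.0]  # rough initial guess
        params, _cov = curve_fit(sigmoid, pixels, profile, p0=p0)
        print(f"edge position ~ {params[2]:.2f} px (sub-pixel)")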

  13. The Implementation of an Automated Assessment Feedback and Quality Assurance System for ICT Courses

    ERIC Educational Resources Information Center

    Debuse, J.; Lawley, M.; Shibl, R.

    2007-01-01

    Providing detailed, constructive and helpful feedback is an important contribution to effective student learning. Quality assurance is also required to ensure consistency across all students and reduce error rates. However, with increasing workloads and student numbers these goals are becoming more difficult to achieve. An automated feedback…

  14. Automated Microfluidic Platform for Serial Polymerase Chain Reaction and High-Resolution Melting Analysis.

    PubMed

    Cao, Weidong; Bean, Brian; Corey, Scott; Coursey, Johnathan S; Hasson, Kenton C; Inoue, Hiroshi; Isano, Taisuke; Kanderian, Sami; Lane, Ben; Liang, Hongye; Murphy, Brian; Owen, Greg; Shinoda, Nobuhiko; Zeng, Shulin; Knight, Ivor T

    2016-06-01

    We report the development of an automated genetic analyzer for human sample testing based on microfluidic rapid polymerase chain reaction (PCR) with high-resolution melting analysis (HRMA). The integrated DNA microfluidic cartridge was used on a platform designed with a robotic pipettor system that works by sequentially picking up different test solutions from a 384-well plate, mixing them in the tips, and delivering mixed fluids to the DNA cartridge. A novel image feedback flow control system based on a Canon 5D Mark II digital camera was developed for controlling fluid movement through a complex microfluidic branching network without the use of valves. The same camera was used for measuring the high-resolution melt curve of DNA amplicons that were generated in the microfluidic chip. Owing to fast heating and cooling as well as sensitive temperature measurement in the microfluidic channels, the time frame for PCR and HRMA was dramatically reduced from hours to minutes. Preliminary testing results demonstrated that rapid serial PCR and HRMA are possible while still achieving high data quality that is suitable for human sample testing. © 2015 Society for Laboratory Automation and Screening.
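    A standard step in high-resolution melting analysis is to locate the melt temperature Tm as the peak of -dF/dT; a minimal Python sketch with a synthetic fluorescence trace is given below (instrument and chemistry specifics from the paper are not modelled).

        import numpy as np

        temps = np.linspace(75.0, 95.0, 201)                    # deg C
        fluor = 1.0 / (1.0 + np.exp((temps - 84.3) / 0.4))      # synthetic melt curve

        neg_dfdt = -np.gradient(fluor, temps)   # negative derivative of fluorescence
        tm = temps[np.argmax(neg_dfdt)]         # Tm at the derivative peak
        print(f"estimated Tm ~ {tm:.1f} C")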

  15. Control by quality: proposition of a typology.

    PubMed

    Pujo, P; Pillet, M

    The application of quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and the quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with statistical process control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization, through procedures, of the decision rules governing process control enhances the validity of these rules, improves their reliability, and consolidates them; all this counterbalances the intrinsically fluctuating behavior of the human control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation of control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for control by quality.

  16. Development and Testing of an Automated 4-Day Text Messaging Guidance as an Aid for Improving Colonoscopy Preparation.

    PubMed

    Walter, Benjamin Michael; Klare, Peter; Neu, Bruno; Schmid, Roland M; von Delius, Stefan

    2016-06-21

    In gastroenterology, sufficient colon cleansing improves the adenoma detection rate and prevents the need for preterm repeat colonoscopies due to invalid preparation. It has been shown that patient education is of major importance for improving colon cleansing. The objective of this study was to assess the function of an automated text messaging (short message service, SMS)-supported colonoscopy preparation starting 4 days before the colonoscopy appointment. After a preevaluation to assess mobile phone usage in the patient population, and so the relevance of this approach, a Web-based, automated SMS text messaging system was developed, following which a single-center feasibility study at a tertiary care center was performed. Patients scheduled for outpatient colonoscopy were invited to participate. Patients enrolled in the study group received automated information about dietary recommendations and bowel cleansing during colonoscopy preparation. Data from outpatient colonoscopies with the regular preparation procedure were used for pair matching and served as controls. The primary end point was the feasibility of SMS text messaging support in colonoscopy preparation, assessed as stable and satisfactory function of the system. Secondary end points were the quality of bowel preparation according to the Boston Bowel Preparation Scale (BBPS) and patient satisfaction with the SMS text messaging-provided information, assessed by a questionnaire. Web-based SMS text messaging-supported colonoscopy preparation was successful and feasible in 19 of 20 patients. The mean (standard error of the mean, SEM) total BBPS score was slightly higher in the SMS group than in the control group (7.3, SEM 0.3 vs 6.4, SEM 0.2) and for each colonic region (left, transverse, and right colon). Patient satisfaction regarding the SMS text messaging-based information was high. Using SMS for colonoscopy preparation with 4 days' guidance including dietary recommendations is a new approach to improving colonoscopy preparation. The quality of colonoscopy preparation was sufficient and patients were highly satisfied with the system during colonoscopy preparation.

  17. Development and Testing of an Automated 4-Day Text Messaging Guidance as an Aid for Improving Colonoscopy Preparation

    PubMed Central

    Klare, Peter; Neu, Bruno; Schmid, Roland M; von Delius, Stefan

    2016-01-01

    Background In gastroenterology a sufficient colon cleansing improves adenoma detection rate and prevents the need for preterm repeat colonoscopies due to invalid preparation. It has been shown that patient education is of major importance for improvement of colon cleansing. Objective The objective of this study was to assess the function of an automated text messaging (short message service, SMS)–supported colonoscopy preparation starting 4 days before colonoscopy appointment. Methods After preevaluation to assess mobile phone usage in the patient population for relevance of this approach, a Web-based, automated SMS text messaging system was developed, following which a single-center feasibility study at a tertiary care center was performed. Patients scheduled for outpatient colonoscopy were invited to participate. Patients enrolled in the study group received automated information about dietary recommendations and bowel cleansing during colonoscopy preparation. Data of outpatient colonoscopies with regular preparation procedure were used for pair matching and served as controls. The primary end point was feasibility of SMS text messaging support in colonoscopy preparation, assessed as stable and satisfactory function of the system. Secondary end points were quality of bowel preparation according to the Boston Bowel Preparation Scale (BBPS) and patient satisfaction with SMS text messaging–provided information, assessed by a questionnaire. Results Web-based SMS text messaging–supported colonoscopy preparation was successful and feasible in 19 of 20 patients. Mean (standard error of the mean, SEM) total BBPS score was slightly higher in the SMS group than in the control group (7.3, SEM 0.3 vs 6.4, SEM 0.2) and for each colonic region (left, transverse, and right colon). Patient satisfaction regarding SMS text messaging–based information was high. Conclusions Using SMS for colonoscopy preparation with 4 days’ guidance including dietary recommendations is a new approach to improve colonoscopy preparation. Quality of colonoscopy preparation was sufficient and patients were highly satisfied with the system during colonoscopy preparation. PMID:27329204

  18. SU-C-9A-02: Structured Noise Index as An Automated Quality Control for Nuclear Medicine: A Two Year Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, J; Christianson, O; Samei, E

    Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection which is subjective and time demanding. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues, reports issues in a timely manner for efficient correction prior to clinical involvement, is incorporated into an automated effortless workflow, and to characterize the program over a two-year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed by expert observer visual analysis. The metric, termed the Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in non-uniformity identification in about 12% of daily flood images. In addition, due to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of gamma camera uniformity performance. It operates seamlessly across a fleet of multiple camera models. The automated process provides an effective workflow within the NM spectrum between physicist, technologist, and clinical engineer. The reliability of this process has made it the preferred platform for NM uniformity analysis.
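    A minimal Python sketch of the 2D noise power spectrum idea underlying an SNI-style metric is given below: detrend a flood image, compute the squared FFT magnitude, and compare low-frequency (structured) noise power to the total. The flood image, the frequency split, and the omission of the human visual response weighting used in the published SNI are all simplifying assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        flood = 1000 + rng.normal(0, 30, (256, 256))
        flood += 15 * np.sin(np.linspace(0, 6 * np.pi, 256))[None, :]  # subtle stripes

        # 2D noise power spectrum of the mean-subtracted flood image
        nps = np.abs(np.fft.fftshift(np.fft.fft2(flood - flood.mean()))) ** 2
        fy, fx = np.indices(nps.shape) - 128
        radius = np.hypot(fx, fy)

        low_freq_power = nps[(radius > 0) & (radius < 16)].sum()
        total_power = nps[radius > 0].sum()
        print(f"structured (low-frequency) share of noise power: "
              f"{low_freq_power / total_power:.2%}")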

  19. Automated identification of best-quality coronary artery segments from multiple-phase coronary CT angiography (cCTA) for vessel analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A.

    2016-03-01

    We are developing an automated method to identify the best-quality segment among the corresponding segments in multiple-phase cCTA. The coronary artery trees are automatically extracted from different cCTA phases using our multi-scale vessel segmentation and tracking method. An automated registration method is then used to align the multiple-phase artery trees. The corresponding coronary artery segments are identified in the registered vessel trees and are straightened by curved planar reformation (CPR). Four features are extracted from each segment in each phase as quality indicators in the original CT volume and the straightened CPR volume. Each quality indicator is used as a voting classifier to vote on the corresponding segments. A newly designed weighted voting ensemble (WVE) classifier is finally used to determine the best-quality coronary segment. An observer preference study was conducted with three readers who visually rated the quality of the vessels on a 1-to-6 ranking scale. Six and 10 cCTA cases were used as the training and test sets in this preliminary study. For the 10 test cases, the agreement between the automatically identified best-quality (AI-BQ) segments and the radiologist's top 2 rankings is 79.7%, and the agreements between AI-BQ and the other two readers are 74.8% and 83.7%, respectively. The results demonstrated that the performance of our automated method was comparable to that of experienced readers for identification of the best-quality coronary segments.
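    The weighted voting scheme described above can be sketched in a few lines of Python: each quality indicator votes for the phase whose segment it scores highest, and votes are combined with per-feature weights. The feature names, scores and weights below are illustrative assumptions.

        phases = ["40%", "50%", "60%", "70%"]
        # quality_scores[feature][phase_index]; higher = better-looking segment
        quality_scores = {"contrast":   [0.61, 0.78, 0.74, 0.55],
                          "sharpness":  [0.52, 0.70, 0.81, 0.60],
                          "continuity": [0.58, 0.77, 0.75, 0.49],
                          "uniformity": [0.60, 0.72, 0.79, 0.57]}
        weights = {"contrast": 1.0, "sharpness": 1.5,
                   "continuity": 1.2, "uniformity": 0.8}

        tally = [0.0] * len(phases)
        for feature, scores in quality_scores.items():
            best = max(range(len(phases)), key=lambda i: scores[i])  # feature's vote
            tally[best] += weights[feature]

        winner = max(range(len(phases)), key=lambda i: tally[i])
        print("best-quality phase:", phases[winner])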

  20. Information flow analysis and Petri-net-based modeling for welding flexible manufacturing cell

    NASA Astrophysics Data System (ADS)

    Qiu, T.; Chen, Shanben; Wang, Y. T.; Wu, Lin

    2000-10-01

    Due to the development of advanced manufacturing technology and the introduction of the smart-manufacturing notion in modern industrial production, welding flexible manufacturing systems (WFMS) using robot technology have become the inevitable direction of development in welding automation. In the WFMS process, flexibility across different welding products and the corresponding control of welding parameters are the guarantees of welding quality. Based on a new intelligent arc-welding flexible manufacturing cell (WFMC), the system structure and control policies are studied in this paper. Aiming at the different information flows between each subsystem and the central monitoring computer in this WFMC, Petri net theory is introduced into the welding manufacturing process. With its help, a discrete control model of the WFMC has been constructed, in which system states are regarded as places and control processes as transitions. Moreover, grounded in automation Petri net principles, the judgment and use of information obtained from welding sensors are incorporated into the net structure, which extends traditional Petri net concepts. The control model and policies researched in this paper establish a foundation for further intelligent real-time control of the WFMC and WFMS.
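    The place/transition idea can be illustrated with a minimal Python sketch: places hold tokens representing system status, and a transition fires when all of its input places are marked. The simplified weld-cell net below is an invented example, not the authors' model.

        # Marking: tokens per place (system states)
        marking = {"part_loaded": 1, "torch_ready": 1, "welding": 0, "done": 0}

        # Each transition: (name, input places, output places)
        transitions = [("start_weld", ["part_loaded", "torch_ready"], ["welding"]),
                       ("finish_weld", ["welding"], ["done"])]

        def fire_enabled(marking, transitions):
            """Fire any transition whose input places all hold a token."""
            for name, inputs, outputs in transitions:
                if all(marking[p] > 0 for p in inputs):
                    for p in inputs:
                        marking[p] -= 1
                    for p in outputs:
                        marking[p] += 1
                    print(f"fired {name}: {marking}")
                    return True
            return False

        while fire_enabled(marking, transitions):
            pass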

  1. Estimation of 3D reconstruction errors in a stereo-vision system

    NASA Astrophysics Data System (ADS)

    Belhaoua, A.; Kohler, S.; Hirsch, E.

    2009-06-01

    The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure of manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., a CAD object model) in order to evaluate the object quantitatively. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze particularly the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.
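    As a minimal illustration of using fitting parameters as a quality measure, the Python sketch below fits a line to noisy edge points and derives confidence intervals for the fit parameters from the residual-based covariance; the edge points are synthetic and the paper's full error propagation is not shown.

        import numpy as np

        x = np.linspace(0, 50, 30)
        y = 0.8 * x + 5.0 + np.random.normal(0, 0.5, x.size)   # noisy edge points

        # Least-squares line fit with parameter covariance from the residuals
        coeffs, cov = np.polyfit(x, y, 1, cov=True)
        slope_sigma, intercept_sigma = np.sqrt(np.diag(cov))
        print(f"slope = {coeffs[0]:.3f} +/- {1.96 * slope_sigma:.3f} (95% CI)")
        print(f"intercept = {coeffs[1]:.3f} +/- {1.96 * intercept_sigma:.3f} (95% CI)")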

  2. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
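    A minimal Python sketch of such a dilution-series analysis is given below: replicate counts at several dilution fractions yield a per-level coefficient of variation (precision), and a zero-intercept fit of counts against dilution fraction indicates proportionality. The counts are synthetic and the authors' exact statistical model is not reproduced.

        import numpy as np

        dilution = np.repeat([1.0, 0.75, 0.5, 0.25], 4)        # 4 replicates per level
        counts = dilution * 1.0e6 + np.random.normal(0, 3.0e4, dilution.size)

        for frac in np.unique(dilution):
            level = counts[dilution == frac]
            cv = 100 * level.std(ddof=1) / level.mean()        # precision per level
            print(f"fraction {frac:4.2f}: CV = {cv:.1f}%")

        # Least-squares fit through the origin; R^2 as a proportionality indicator
        slope = (dilution @ counts) / (dilution @ dilution)
        ss_res = ((counts - slope * dilution) ** 2).sum()
        ss_tot = ((counts - counts.mean()) ** 2).sum()
        print(f"proportional fit: slope = {slope:.3e} cells, "
              f"R^2 = {1 - ss_res / ss_tot:.4f}")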

  3. Graywater Discharges from Vessels

    DTIC Science & Technology

    2011-11-01

    soaps and detergents used in any capacity that will be discharged as part of graywater must be nontoxic and phosphate-free, and should be...mounted on one skid. The Evac MBR process is fully automated and controlled through a PLC. Evac uses high quality Kubota membranes. Kubota membranes...the best solution. • Non-corrosive. By choosing to use non-corrosive materials (including a special space age polymer yielding strength and

  4. Development and integration of block operations for data invariant automation of digital preprocessing and analysis of biological and biomedical Raman spectra.

    PubMed

    Schulze, H Georg; Turner, Robin F B

    2015-06-01

    High-throughput information extraction from large numbers of Raman spectra is becoming an increasingly taxing problem due to the proliferation of new applications enabled using advances in instrumentation. Fortunately, in many of these applications, the entire process can be automated, yielding reproducibly good results with significant time and cost savings. Information extraction consists of two stages, preprocessing and analysis. We focus here on the preprocessing stage, which typically involves several steps, such as calibration, background subtraction, baseline flattening, artifact removal, smoothing, and so on, before the resulting spectra can be further analyzed. Because the results of some of these steps can affect the performance of subsequent ones, attention must be given to the sequencing of steps, the compatibility of these sequences, and the propensity of each step to generate spectral distortions. We outline here important considerations to effect full automation of Raman spectral preprocessing: what is considered full automation; putative general principles to effect full automation; the proper sequencing of processing and analysis steps; conflicts and circularities arising from sequencing; and the need for, and approaches to, preprocessing quality control. These considerations are discussed and illustrated with biological and biomedical examples reflecting both successful and faulty preprocessing.
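    The sequencing concern discussed above can be made explicit by representing each preprocessing step as a spectrum-to-spectrum function and fixing their order in one place, so every spectrum is processed identically. The Python sketch below uses generic stand-ins (spike removal, linear baseline flattening, moving-average smoothing, area normalisation) rather than the authors' specific operators.

        import numpy as np

        def despike(y):                         # clip isolated cosmic-ray spikes
            med = np.median(y)
            mad = np.median(np.abs(y - med)) + 1e-12
            return np.where(np.abs(y - med) > 8 * mad, med, y)

        def flatten_baseline(y):                # crude linear baseline removal
            x = np.arange(y.size)
            return y - np.polyval(np.polyfit(x, y, 1), x)

        def smooth(y):                          # simple moving-average smoothing
            return np.convolve(y, np.ones(5) / 5, mode="same")

        def normalise(y):                       # area normalisation
            return y / (np.abs(y).sum() + 1e-12)

        PIPELINE = (despike, flatten_baseline, smooth, normalise)  # order matters

        def preprocess(spectrum):
            for step in PIPELINE:
                spectrum = step(spectrum)
            return spectrum

        spectra = np.random.rand(10, 500)       # stand-in for a batch of Raman spectra
        processed = np.array([preprocess(s) for s in spectra])
        print(processed.shape)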

  5. Comparison of manual versus automated data collection method for an evidence-based nursing practice study.

    PubMed

    Byrne, M D; Jordan, T R; Welle, T

    2013-01-01

    The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 "false negative" patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Automated data collection for analysis of nursing-specific phenomenon is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare.

  6. Modernization and Activation of the NASA Ames 11- by 11-Foot Transonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Kmak, Frank J.

    2000-01-01

    The Unitary Plan Wind Tunnel (UPWT) was modernized to improve performance, capability, productivity, and reliability. Automation systems were installed in all three UPWT tunnel legs and the Auxiliaries facility. Major improvements were made to the four control rooms, model support systems, main drive motors, and main drive speed control. Pressure vessel repairs and refurbishment to the electrical distribution system were also completed. Significant changes were made to improve test section flow quality in the 11-by 11-Foot Transonic leg. After the completion of the construction phase of the project, acceptance and checkout testing was performed to demonstrate the capabilities of the modernized facility. A pneumatic test of the tunnel circuit was performed to verify the structural integrity of the pressure vessel before wind-on operations. Test section turbulence, flow angularity, and acoustic parameters were measured throughout the tunnel envelope to determine the effects of the tunnel flow quality improvements. The new control system processes were thoroughly checked during wind-off and wind-on operations. Manual subsystem modes and automated supervisory modes of tunnel operation were validated. The aerodynamic and structural performance of both the new composite compressor rotor blades and the old aluminum rotor blades was measured. The entire subsonic and supersonic envelope of the 11-by 11-Foot Transonic leg was defined up to the maximum total pressure.

  7. Automated subsystems control development. [for life support systems of space station

    NASA Technical Reports Server (NTRS)

    Block, R. F.; Heppner, D. B.; Samonski, F. H., Jr.; Lance, N., Jr.

    1985-01-01

    NASA has the objective to launch a Space Station in the 1990s. It has been found that the success of the Space Station engineering development, the achievement of initial operational capability (IOC), and the operation of a productive Space Station will depend heavily on the implementation of an effective automation and control approach. For the development of technology needed to implement the required automation and control function, a contract entitled 'Automated Subsystems Control for Life Support Systems' (ASCLSS) was awarded to two American companies. The present paper provides a description of the ASCLSS program. Attention is given to an automation and control architecture study, a generic automation and control approach for hardware demonstration, a standard software approach, application of Air Revitalization Group (ARG) process simulators, and a generic man-machine interface.

  8. Stages and levels of automation in support of space teleoperations.

    PubMed

    Li, Huiyang; Wickens, Christopher D; Sarter, Nadine; Sebok, Angelia

    2014-09-01

    This study examined the impact of stage of automation on the performance and perceived workload during simulated robotic arm control tasks in routine and off-nominal scenarios. Automation varies with respect to the stage of information processing it supports and its assigned level of automation. Making appropriate choices in terms of stages and levels of automation is critical to ensure robust joint system performance. To date, this issue has been empirically studied in domains such as aviation and medicine but not extensively in the context of space operations. A total of 36 participants played the role of a payload specialist and controlled a simulated robotic arm. Participants performed fly-to tasks with two types of automation (camera recommendation and trajectory control automation) of varying stage. Tasks were performed during routine scenarios and in scenarios in which either the trajectory control automation or a hazard avoidance automation failed. Increasing the stage of automation progressively improved performance and lowered workload when the automation was reliable, but incurred severe performance costs when the system failed. The results from this study support concerns about automation-induced complacency and automation bias when later stages of automation are introduced. The benefits of such automation are offset by the risk of catastrophic outcomes when system failures go unnoticed or become difficult to recover from. A medium stage of automation seems preferable as it provides sufficient support during routine operations and helps avoid potentially catastrophic outcomes in circumstances when the automation fails.

  9. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e., a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but security issues are also important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart-cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high software quality and lead to truly safe and secure systems. In this paper, we will look at how far automated theorem proving can contribute to a more widespread application of formal methods and their tools, and at what automated theorem provers (ATPs) must provide in order to be useful.

  10. Laser light-section sensor automating the production of textile-reinforced composites

    NASA Astrophysics Data System (ADS)

    Schmitt, R.; Niggemann, C.; Mersmann, C.

    2009-05-01

    Due to their advanced weight-specific mechanical properties, the application of fibre-reinforced plastics (FRP) has been established as a key technology in several engineering areas. Textile-based reinforcement structures (preforms) in particular achieve a high structural integrity due to the multi-dimensional build-up of dry-fibre layers combined with 3D sewing and further textile processes. The final composite parts provide enhanced damage tolerance through excellent crash-energy absorbing characteristics. For these reasons, structural parts (e.g., frames) will be integrated into next-generation airplanes. However, many manufacturing processes for FRP still involve manual production steps without integrated quality control. The non-automated production implies considerable process dispersion and a high rework rate. Before the final inspection there is no reliable information about the production status. This work sets metrology as the key to automation and thus to an economically feasible production, applying a laser light-section sensor system (LLSS) to measure process quality and feed the results back to close the control loops of the production system. The developed method derives 3D measurements from height profiles acquired by the LLSS. To assure the textile's quality, a full surface scan is conducted, detecting defects or misalignment by comparing the measurement results with a CAD model of the lay-up. The method focuses on signal processing of the height profiles to ensure sub-pixel accuracy using a novel algorithm based on a non-linear least-square fitting to a set of sigmoid functions. To compare the measured surface points to the CAD model, material characteristics are incorporated into the method. This ensures that only the fibre layer of the textile's surface is included and gaps between the fibres or overlaying seams are neglected. Finally, determining the uncertainty in measurement according to the GUM standard proved the sensor system's accuracy. First tests under industrial conditions showed that applying this sensor after the drapery of each textile layer reduces the scrap quota by approximately 30%.

  11. Investigation of Control System and Display Variations on Spacecraft Handling Qualities for Docking with Stationary and Rotating Targets

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Goodrich, Kenneth H.; Bailey, Randall E.; Barnes, James R.; Ragsdale, William A.; Neuhaus, Jason R.

    2010-01-01

    This paper documents the investigation into the manual docking of a preliminary version of the Crew Exploration Vehicle with stationary and rotating targets in Low Earth Orbit. The investigation was conducted at NASA Langley Research Center in the summer of 2008 in a repurposed fixed-base transport aircraft cockpit and involved nine evaluation astronauts and research pilots. The investigation quantified the benefits of a feed-forward reaction control system thruster mixing scheme to reduce translation-into-rotation coupling, despite unmodeled variations in individual thruster force levels and off-axis center of mass locations up to 12 inches. A reduced rate dead-band in the phase-plane attitude controller also showed some promise. Candidate predictive symbology overlaid on a docking ring centerline camera image did not improve handling qualities, but an innovative attitude status indicator symbol was beneficial. The investigation also showed high workload and handling quality problems when manual dockings were performed with a rotating target. These concerns indicate achieving satisfactory handling quality ratings with a vehicle configuration similar to the nominal Crew Exploration Vehicle may require additional automation.

  12. Aozan: an automated post-sequencing data-processing pipeline.

    PubMed

    Perrin, Sandrine; Firmo, Cyril; Lemoine, Sophie; Le Crom, Stéphane; Jourdren, Laurent

    2017-07-15

    Data management and quality control of output from Illumina sequencers is a disk space- and time-consuming task. Thus, we developed Aozan to automatically handle data transfer, demultiplexing, conversion and quality control once a run has finished. This software greatly improves run data management and the monitoring of run statistics via automatic emails and HTML web reports. Aozan is implemented in Java and Python, supported on Linux systems, and distributed under the GPLv3 License at: http://www.outils.genomique.biologie.ens.fr/aozan/ . Aozan source code is available on GitHub: https://github.com/GenomicParisCentre/aozan . aozan@biologie.ens.fr. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  13. Theory research of seam recognition and welding torch pose control based on machine vision

    NASA Astrophysics Data System (ADS)

    Long, Qiang; Zhai, Peng; Liu, Miao; He, Kai; Wang, Chunyang

    2017-03-01

    At present, the automation requirements of welding have become higher, so a method of extracting welding information with a vision sensor is proposed in this paper, and simulations have been conducted with MATLAB. In addition, in order to improve the quality of robotic automatic welding, an information retrieval method for welding torch pose control by visual sensor is attempted. Considering the demands of welding technology and engineering practice, the relative coordinate systems and variables are strictly defined, and a mathematical model of the welding pose is established; its feasibility is verified by MATLAB simulation. These works lay a foundation for the development of a welding off-line programming system with high precision and quality.

  14. INDUSTRIE 4.0 - Automation in weft knitting technology

    NASA Astrophysics Data System (ADS)

    Simonis, K.; Gloy, Y.-S.; Gries, T.

    2016-07-01

    Industry 4.0 applies to the knitting industry as well. In the knitting process, retrofitting activities are executed mostly manually by an operator on the basis of the operator's experience. In doing so, the knitted fabric is not necessarily produced in the most efficient way regarding process speed and fabric quality. The knitting division at ITA concentrates on project activities regarding automation and Industry 4.0. ITA is analysing the correspondences between the knitting process parameters and their influence on fabric quality. By using, for example, augmented reality technology, the operator is supported when setting up the knitting machine in case of a product or pattern change, or in case of an intervention when production errors occur. Furthermore, RFID technology offers great possibilities to ensure information flow between sub-processes of the fragmented textile process chain. ITA uses RFID chips to save yarn production information and to connect that information to the control of the fabric-producing machine. In addition, ITA is currently working on integrating image-processing systems into large circular knitting machines in order to ensure online quality measurement of the knitted fabrics. This will lead to a self-optimizing and self-learning knitting machine.

  15. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    NASA Astrophysics Data System (ADS)

    James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe

    2016-04-01

    Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data or the distribution of control points can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower density deployments of ground control points. The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies - an erosion gully survey (Taurodont, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France) acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements at the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.
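    The Monte Carlo idea can be sketched as repeated random splits of the ground control points into control and check sets, with the check-point error summarised per deployment size. In the Python sketch below the survey-error function is a crude placeholder for a real SfM processing run (e.g. a PhotoScan project rebuild), and all coordinates are synthetic.

        import random
        import statistics

        random.seed(1)
        gcps = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(30)]

        def survey_error(control, checks):
            """Placeholder: error grows when checks lie far from control points."""
            def nearest(c):
                return min(((c[0] - g[0]) ** 2 + (c[1] - g[1]) ** 2) ** 0.5
                           for g in control)
            return statistics.mean(0.01 * nearest(c) for c in checks)

        for n_control in (4, 8, 16):
            errors = []
            for _ in range(200):                       # Monte Carlo trials
                random.shuffle(gcps)
                control, checks = gcps[:n_control], gcps[n_control:]
                errors.append(survey_error(control, checks))
            print(f"{n_control:2d} GCPs: mean check error ~ "
                  f"{statistics.mean(errors):.3f} m")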

  16. Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments

    ERIC Educational Resources Information Center

    Blayney, Paul; Freeman, Mark

    2004-01-01

    This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…

  17. Flip the tip: an automated, high quality, cost-effective patch clamp screen.

    PubMed

    Lepple-Wienhues, Albrecht; Ferlinz, Klaus; Seeger, Achim; Schäfer, Arvid

    2003-01-01

    The race for creating an automated patch clamp has begun. Here, we present a novel technology to produce true gigaseals and whole cell preparations at a high rate. Suspended cells are flushed toward the tip of glass micropipettes. Seal, whole-cell break-in, and pipette/liquid handling are fully automated. Extremely stable seals and access resistance guarantee high recording quality. Data obtained from different cell types sealed inside pipettes show long-term stability, voltage clamp and seal quality, as well as block by compounds in the pM range. A flexible array of independent electrode positions minimizes consumables consumption at maximal throughput. Pulled micropipettes guarantee a proven gigaseal substrate with ultra clean and smooth surface at low cost.

  18. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2010-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in industry, research organizations and government agencies. In part these are fuelled by the demands for higher efficiency in material testing, product design and development and engineering analysis. But equally important, organizations are being driven to employ sophisticated methods and software tools for managing their mission-critical materials information by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Furthermore the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analysis approaches, particularly for composite materials, requires both processing of much larger volumes of test data for development of constitutive models and much more complex materials data input requirements for Computer-Aided Engineering (CAE) software. And finally, the globalization of engineering processes and outsourcing of design and development activities generates much greater needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately material property management systems have kept pace with the growing user demands. They have evolved from hard copy archives, through simple electronic databases, to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access control, version control, and quality control; (ii) a wide range of data import, export and analysis capabilities; (iii) mechanisms for ensuring that all data is traceable to its pedigree sources: details of testing programs, published sources, etc; (iv) tools for searching, reporting and viewing the data; and (v) access to the information via a wide range of interfaces, including web browsers, rich clients, programmatic access and clients embedded in third-party applications, such as CAE systems. This paper discusses the important requirements for advanced material data management systems as well as the future challenges and opportunities such as automated error checking, automated data quality assessment and characterization, identification of gaps in data, as well as functionalities and business models to keep users returning to the source: to generate user demand to fuel database growth and maintenance.

  19. Quantitative Analysis and Stability of the Rodenticide TETS ...

    EPA Pesticide Factsheets

    The determination of the rodenticide tetramethylenedisulfotetramine (TETS) in drinking water is reported, using automated sample preparation via solid-phase extraction and detection by isotope dilution gas chromatography-mass spectrometry. The method was characterized over twenty-two analytical batches with quality control samples. Accuracies for the low and high concentration quality control pools were 100 and 101%, respectively. The minimum reporting level (MRL) for TETS in this method is 0.50 μg/L. Five drinking waters representing a range of water quality parameters and disinfection practices were fortified with TETS at ten times the MRL and analyzed over a 28-day period to determine the stability of TETS in these waters. The amount of TETS measured in these samples averaged 100 ± 6% of the amount fortified, suggesting that tap water samples may be held for up to 28 days prior to analysis.

  20. Quality control in the year 2000.

    PubMed

    Schade, B

    1992-01-01

    'Just-in-time' production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial for the success might be advances in use of microelectronics for error checks, system recording, trouble shooting, etc. as well as creative new approaches (for example the use of redundant assay systems).

  1. Quality control in the year 2000

    PubMed Central

    Schade, Bernd

    1992-01-01

    ‘Just-in-time’ production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time has now become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be just-in-time release. Benefits will go beyond cost savings from lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems such as robots will become the workhorses in QC for high-volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial to success may be advances in the use of microelectronics for error checks, system recording, troubleshooting, etc., as well as creative new approaches (for example, the use of redundant assay systems). PMID:18924930

  2. Multicenter Cell Processing for Cardiovascular Regenerative Medicine Applications - The Cardiovascular Cell Therapy Research Network (CCTRN) Experience

    PubMed Central

    Gee, Adrian P.; Richman, Sara; Durett, April; McKenna, David; Traverse, Jay; Henry, Timothy; Fisk, Diann; Pepine, Carl; Bloom, Jeannette; Willerson, James; Prater, Karen; Zhao, David; Koç, Jane Reese; Ellis, Steven; Taylor, Doris; Cogle, Christopher; Moyé, Lemuel; Simari, Robert; Skarlatos, Sonia

    2013-01-01

    Background & Aims: Multi-center cellular therapy clinical trials require the establishment and implementation of standardized cell processing protocols and associated quality control mechanisms. The aims here were to develop such an infrastructure in support of the Cardiovascular Cell Therapy Research Network (CCTRN) and to report on the results of processing for the first 60 patients. Methods: Standardized cell preparations, consisting of autologous bone marrow mononuclear cells prepared using the Sepax device, were manufactured at each of the five processing facilities that supported the clinical treatment centers. Processing staff underwent centralized training that included proficiency evaluation. Quality was subsequently monitored by a central quality control program that included product evaluation by the CCTRN biorepositories. Results: Data from the first 60 procedures demonstrate that uniform products meeting all release criteria could be manufactured at all five sites within 7 hours of receipt of the bone marrow. Uniformity was facilitated by the use of automated systems (the Sepax for processing and the Endosafe device for endotoxin testing), standardized procedures and centralized quality control. Conclusions: Complex multicenter cell therapy and regenerative medicine protocols can, where necessary, successfully utilize local processing facilities once an effective infrastructure is in place to provide training and quality control. PMID:20524773

  3. Automated delineation and characterization of watersheds for more than 3,000 surface-water-quality monitoring stations active in 2010 in Texas

    USGS Publications Warehouse

    Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.

    2012-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.

  4. Improvement of Vivarium Biodecontamination through Data-acquisition Systems and Automation.

    PubMed

    Devan, Shakthi Rk; Vasu, Suresh; Mallikarjuna, Yogesha; Ponraj, Ramkumar; Kamath, Gireesh; Poosala, Suresh

    2018-03-01

    Biodecontamination is important for eliminating pathogens at research animal facilities, thereby preventing contamination within barrier systems. We enhanced our facility's standard biodecontamination method to replace the traditional foggers, and the new system was used effectively after bypass ducts were created in the HVAC units so that individual rooms could be isolated. The entire system was controlled by in-house-developed supervisory control and data-acquisition software that supported multiple decontamination cycles run by equipment with different decontamination capacities, operated in parallel, and using different agents, including H2O2 vapor and ClO2 gas. The process was validated according to facility mapping, and effectiveness was assessed by using biologic (Geobacillus stearothermophilus) and chemical indicator strips, which were positioned before decontamination, and by sampling contact plates after the completion of each cycle. The biologic indicators showed a 6-log reduction in microbial counts after successful decontamination cycles for both agents, and the process was found to be compatible with clean-room panels and with materials commonly used in a vivarium, such as racks, cages, trolleys, cage-changing stations, biosafety cabinets, refrigerators and other equipment in both procedure and animal rooms. In conclusion, the automated process enabled users to perform effective decontamination through multiple cycles with real-time documentation and provided additional capability to deal with potential outbreaks. Software integration of the automation improved the quality-control systems in our vivarium.

  5. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks

    PubMed Central

    Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen

    2014-01-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments. PMID:25530925
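
    The two headline statistics in this abstract, a Pearson correlation between automated and observer scores and an AUC for predicting cognitive health, can be reproduced on any such dataset with standard tools. The arrays below are random stand-ins for the study's data, included only to show the computation.

        import numpy as np
        from scipy.stats import pearsonr
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)

        # stand-ins for the study's variables (179 participants)
        observer_scores = rng.uniform(0, 10, 179)                    # direct-observation task quality
        automated_scores = observer_scores + rng.normal(0, 2, 179)   # smart-home estimate
        cognitive_label = (observer_scores > 5).astype(int)          # binary health label

        r, p = pearsonr(automated_scores, observer_scores)      # study reports r = 0.79
        auc = roc_auc_score(cognitive_label, automated_scores)  # study reports AUC = 0.64
        print(f"r = {r:.2f} (p = {p:.1e}), AUC = {auc:.2f}")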

  6. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks.

    PubMed

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen

    2013-11-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments.

  7. The role of optical flow in automated quality assessment of full-motion video

    NASA Astrophysics Data System (ADS)

    Harguess, Josh; Shafer, Scott; Marez, Diego

    2017-09-01

    In real-world video data, such as full-motion video (FMV) taken from unmanned vehicles, surveillance systems, and other sources, various corruptions of the raw data are inevitable. These can arise from the image acquisition process, noise, distortion, and compression artifacts, among other sources of error. However, we desire methods to analyze the quality of the video to determine whether the underlying content of the corrupted video can be analyzed by humans or machines, and to what extent. Previous approaches have shown that motion estimation, or optical flow, can be an important cue in automating this video quality assessment. However, there are many different optical flow algorithms in the literature, each with its own advantages and disadvantages. We examine the effect of the choice of optical flow algorithm (including baseline and state-of-the-art) on motion-based automated video quality assessment algorithms.
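
    As a concrete illustration of optical flow as a quality cue, the sketch below extracts simple flow-magnitude statistics from consecutive frames using OpenCV's Farneback estimator. Treating erratic flow statistics as a degradation signal is an assumption of this sketch, not the paper's specific method, and the file name is a placeholder.

        import cv2
        import numpy as np

        def flow_features(prev_gray, next_gray):
            """Dense Farneback optical flow, summarized as quality-related statistics."""
            flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                                pyr_scale=0.5, levels=3, winsize=15,
                                                iterations=3, poly_n=5, poly_sigma=1.2,
                                                flags=0)
            mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
            return {"mean_mag": float(mag.mean()),   # overall motion energy
                    "std_mag": float(mag.std())}     # erratic flow may flag corruption

        cap = cv2.VideoCapture("clip.mp4")           # placeholder file name
        ok, prev = cap.read()
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            print(flow_features(prev, gray))
            prev = gray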

  8. First Worldwide Proficiency Study on Variable-Number Tandem-Repeat Typing of Mycobacterium tuberculosis Complex Strains

    PubMed Central

    de Beer, Jessica L.; Kremer, Kristin; Ködmön, Csaba; Supply, Philip

    2012-01-01

    Although variable-number tandem-repeat (VNTR) typing has gained recognition as the new standard for the DNA fingerprinting of Mycobacterium tuberculosis complex (MTBC) isolates, external quality control programs have not yet been developed. Therefore, we organized the first multicenter proficiency study on 24-locus VNTR typing. Sets of 30 DNAs of MTBC strains, including 10 duplicate DNA samples, were distributed among 37 participating laboratories in 30 different countries worldwide. Twenty-four laboratories used an in-house-adapted method with fragment sizing by gel electrophoresis or an automated DNA analyzer, nine laboratories used a commercially available kit, and four laboratories used other methods. The intra- and interlaboratory reproducibilities of VNTR typing varied from 0% to 100%, with averages of 72% and 60%, respectively. Twenty of the 37 laboratories failed to amplify particular VNTR loci; if these missing results were ignored, the number of laboratories with 100% interlaboratory reproducibility increased from 1 to 5. The average interlaboratory reproducibility of VNTR typing using a commercial kit was better (88%) than that of in-house-adapted methods using a DNA analyzer (70%) or gel electrophoresis (50%). Eleven laboratories using in-house-adapted manual typing or automated typing scored inter- and intralaboratory reproducibilities of 80% or higher, which suggests that these approaches can be used in a reliable way. In conclusion, this first multicenter study has documented the worldwide quality of VNTR typing of MTBC strains and highlights the importance of international quality control to improve genotyping in the future. PMID:22170917

  9. ProDeGe: A computational protocol for fully automated decontamination of genomes

    DOE PAGES

    Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott; ...

    2015-06-09

    Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies. On average, 84% of sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). Lastly, the procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.
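
    ProDeGe's reported 84%/84% figures are sequence-level sensitivity and specificity; given a labeled set of contigs they are straightforward to compute, as in this sketch (the contig data are invented for illustration).

        def decontamination_scores(contigs):
            """contigs: list of (length_bp, is_target_truth, kept_by_classifier)."""
            kept_target = sum(l for l, t, k in contigs if t and k)
            total_target = sum(l for l, t, k in contigs if t)
            removed_contam = sum(l for l, t, k in contigs if not t and not k)
            total_contam = sum(l for l, t, k in contigs if not t)
            sensitivity = kept_target / total_target       # target sequence retained
            specificity = removed_contam / total_contam    # contaminant sequence removed
            return sensitivity, specificity

        # invented example: (length, is_target, kept)
        contigs = [(50_000, True, True), (12_000, True, False),
                   (8_000, False, False), (2_000, False, True)]
        sens, spec = decontamination_scores(contigs)
        print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")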

  10. The discriminatory power of ribotyping as automatable technique for differentiation of bacteria.

    PubMed

    Schumann, Peter; Pukall, Rüdiger

    2013-09-01

    Since the introduction of ribonucleic acid gene restriction patterns as taxonomic tools in 1986, ribotyping has become an established method for systematics and for epidemiological, ecological and population studies of microorganisms. In the last 25 years, several modifications have improved the convenience, reproducibility and turn-around time of this technique. The technological development culminated in the automation of ribotyping, which allowed high-throughput applications, e.g., in the quality control of food production, the pharmaceutical industry and culture collections. The capability of the fully automated RiboPrinter® System for the differentiation of bacteria below the species level is compared with the discriminatory power of traditional ribotyping, of molecular fingerprint techniques such as PFGE, MLST and MLVA, and of MALDI-TOF mass spectrometry. While automated RiboPrinting is advantageous with respect to standardization, ease and speed, PCR ribotyping has proved to be a highly discriminatory, flexible, robust and cost-efficient routine technique that also makes interlaboratory comparison and the building of ribotype databases possible. Copyright © 2013 Elsevier GmbH. All rights reserved.

  11. ProDeGe: A computational protocol for fully automated decontamination of genomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott

    Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies. On average, 84% of sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). Lastly, the procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.

  12. Effect of Automated Online Counseling on Clinical Outcomes and Quality of Life Among Adolescents With Acne Vulgaris

    PubMed Central

    Tuong, William; Wang, Audrey S.; Armstrong, April W.

    2016-01-01

    IMPORTANCE Effective patient education is necessary for treating patients with acne vulgaris. Automated online counseling simulates face-to-face encounters and may be a useful tool to deliver education. OBJECTIVE To compare the effectiveness of a standard educational website with that of an automated-counseling website in improving clinical outcomes and quality of life among adolescents with acne. DESIGN, SETTING, AND PARTICIPANTS Randomized clinical trial conducted between March 27, 2014, and June 27, 2014, including a 12-week follow-up in a local inner-city high school. Ninety-eight students aged at least 13 years with mild to moderate acne were eligible for participation. A per-protocol analysis of the evaluable population was conducted on clinical outcome data. INTERVENTIONS Participants viewed either a standard educational website or an automated-counseling website. MAIN OUTCOMES AND MEASURES The primary outcome was the total acne lesion count. Secondary measures included the Children’s Dermatology Life Quality Index (CDLQI) scores and general skin care behavior. RESULTS Forty-nine participants were randomized to each group. At baseline, the mean (SD) total acne lesion count was not significantly different between the standard-website group and the automated-counseling–website group (21.33 [10.81] vs 25.33 [12.45]; P = .10). Improvement in the mean (SD) acne lesion count was not significantly different between the standard-website group and the automated-counseling–website group (0.20 [9.26] vs 3.90 [12.19]; P = .10). The mean (SD) improvement in CDLQI score for the standard-website group was not significantly different from that of the automated-counseling–website group (0.17 [2.64] vs 0.39 [2.94]; P = .71). After 12 weeks, a greater proportion of participants in the automated-counseling–website group maintained or adopted a recommended anti-acne skin care routine compared with the standard-website group (43% vs 22%; P = .03). CONCLUSIONS AND RELEVANCE Internet-based acne education using automated counseling was not superior to standard-website education in improving acne severity and quality of life. However, a greater proportion of participants who viewed the automated-counseling website reported having maintained or adopted a recommended anti-acne skin care regimen. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT02031718 PMID:26017816

  13. SAMOS Surface Fluxes

    NASA Astrophysics Data System (ADS)

    Smith, Shawn; Bourassa, Mark

    2014-05-01

    The development of a new surface flux dataset based on underway meteorological observations from research vessels will be presented. The research vessel data center at the Florida State University routinely acquires, quality controls, and distributes underway surface meteorological and oceanographic observations from over 30 oceanographic vessels. These activities are coordinated by the Shipboard Automated Meteorological and Oceanographic System (SAMOS) initiative in partnership with the Rolling Deck to Repository (R2R) project. Recently, the SAMOS data center has used these underway observations to produce bulk flux estimates for each vessel along individual cruise tracks. A description of this new flux product, along with the underlying data quality control procedures applied to SAMOS observations, will be provided. Research vessels provide underway observations at high temporal frequency (1-min sampling interval) that include navigational (position, course, heading, and speed), meteorological (air temperature, humidity, wind, surface pressure, radiation, rainfall), and oceanographic (sea surface temperature and salinity) samples. Vessels recruited to the SAMOS initiative collect a high concentration of data within the U.S. continental shelf and also frequently operate well outside routine shipping lanes, capturing observations in extreme ocean environments (Southern, Arctic, South Atlantic, and South Pacific oceans). These observations are atypical in their spatial and temporal sampling, making them very useful for many applications, including validation of numerical models and satellite retrievals, as well as local assessments of natural variability. Individual SAMOS observations undergo routine automated quality control, and select vessels receive detailed visual data quality inspection. The result is a quality-flagged data set that is ideal for calculating turbulent flux estimates. We will describe the bulk flux algorithms that have been applied to the observations and the choices of constants that are used. Analysis of the preliminary SAMOS flux products will be presented, including spatial and temporal coverage for each derived parameter. The unique quality and sampling locations of research vessel observations, and their independence from many models and products, make them ideal for validation studies. The strengths and limitations of research observations for flux validation studies will be discussed. The authors welcome a discussion with the flux community regarding expansion of the SAMOS program to include additional international vessels, thus facilitating an expansion of this research vessel-based flux product.
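
    Bulk flux estimates of the kind described here follow standard bulk aerodynamic formulas. The sketch below uses constant exchange coefficients for clarity; operational algorithms (e.g., COARE) instead compute the coefficients iteratively from atmospheric stability, so the constants here are simplifying assumptions.

        RHO_AIR = 1.2      # air density, kg m^-3
        CP_AIR = 1004.0    # specific heat of air, J kg^-1 K^-1
        LV = 2.5e6         # latent heat of vaporization, J kg^-1
        CD, CH, CE = 1.2e-3, 1.0e-3, 1.2e-3   # constant bulk coefficients (simplification)

        def bulk_fluxes(wind_speed, t_air, t_sea, q_air, q_sea):
            """Bulk aerodynamic estimates from underway observations.

            wind_speed [m/s], temperatures [degC], specific humidities [kg/kg].
            """
            tau = RHO_AIR * CD * wind_speed**2                          # wind stress, N m^-2
            shf = RHO_AIR * CP_AIR * CH * wind_speed * (t_sea - t_air)  # sensible heat, W m^-2
            lhf = RHO_AIR * LV * CE * wind_speed * (q_sea - q_air)      # latent heat, W m^-2
            return tau, shf, lhf

        print(bulk_fluxes(8.0, 24.0, 26.0, 0.015, 0.021))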

  14. Automated urinalysis: first experiences and a comparison between the Iris iQ200 urine microscopy system, the Sysmex UF-100 flow cytometer and manual microscopic particle counting.

    PubMed

    Shayanfar, Noushin; Tobler, Ulrich; von Eckardstein, Arnold; Bestmann, Lukas

    2007-01-01

    Automated analysis of insoluble urine components can reduce the workload of conventional microscopic examination of urine sediment and may help with standardization. We compared the diagnostic performance of two automated urine sediment analyzers, and of combined dipstick/automated urine analysis, with that of the traditional dipstick/microscopy algorithm. A total of 332 specimens were collected and analyzed for insoluble urine components by microscopy and by the automated analyzers, namely the Iris iQ200 (Iris Diagnostics) and the UF-100 flow cytometer (Sysmex). The coefficients of variation for day-to-day quality control of the iQ200 and UF-100 analyzers were 6.5% and 5.5%, respectively, for red blood cells. Accuracy ranged from 68% (bacteria) to 97% (yeast) for the iQ200 and from 42% (bacteria) to 93% (yeast) for the UF-100. The combination of dipstick and automated urine sediment analysis increased the sensitivity of screening to approximately 98%. We conclude that automated urine sediment analysis is sufficiently precise and improves the workflow in a routine laboratory. In addition, it allows sediment analysis of all urine samples and thereby helps to detect pathological samples that would have been missed in the conventional two-step procedure according to the European guidelines. Although it is not a substitute for microscopic sediment examination, it can, when combined with dipstick testing, reduce the number of specimens submitted to microscopy. Visual microscopy is still required for some samples, namely those involving dysmorphic erythrocytes, yeasts, Trichomonas, oval fat bodies, and the differentiation of casts and certain crystals.

  15. Transitioning Resolution Responsibility between the Controller and Automation Team in Simulated NextGen Separation Assurance

    NASA Technical Reports Server (NTRS)

    Cabrall, C.; Gomez, A.; Homola, J.; Hunt, S.; Martin, L.; Mercer, J.; Prevot, T.

    2013-01-01

    As part of an ongoing research effort on separation assurance and functional allocation in NextGen, a controller-in-the-loop study with ground-based automation was conducted at NASA Ames' Airspace Operations Laboratory in August 2012 to investigate the potential impact of introducing self-separating aircraft in progressively advanced NextGen timeframes. From this larger study, the current exploratory analysis of controller-automation interaction styles focuses on the last and most far-term time frame. The measurements recorded first verified the continued operational validity of this iteration of the ground-based functional allocation automation concept at forecast traffic densities up to 2x that of current-day high-altitude en-route sectors. Additionally, with greater levels of fully automated conflict detection and resolution as well as the introduction of intervention functionality, objective and subjective analyses showed a range of passive to active controller-automation interaction styles among the participants. Not only did the controllers work with the automation to meet their safety and capacity goals in the simulated future NextGen timeframe, they did so in different ways and with different attitudes of trust in and use of the automation. Taken as a whole, the results showed that the prototyped controller-automation functional allocation framework was very flexible and successful overall.

  16. High Performance Building Mockup in FLEXLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeil, Andrew; Kohler, Christian; Lee, Eleanor S.

    Genentech has ambitious energy and indoor environmental quality performance goals for Building 35 (B35) being constructed by Webcor at the South San Francisco campus. Genentech and Webcor contracted with the Lawrence Berkeley National Laboratory (LBNL) to test building systems including lighting, lighting controls, shade fabric, and automated shading controls in LBNL’s new FLEXLAB facility. The goal of the testing is to ensure that the systems installed in the new office building will function in a way that reduces energy consumption and provides a comfortable work environment for employees.

  17. Development and Validation of a Laboratory-Developed Multiplex Real-Time PCR Assay on the BD Max System for Detection of Herpes Simplex Virus and Varicella-Zoster Virus DNA in Various Clinical Specimens.

    PubMed

    Pillet, Sylvie; Verhoeven, Paul O; Epercieux, Amélie; Bourlet, Thomas; Pozzetto, Bruno

    2015-06-01

    A multiplex real-time PCR (quantitative PCR [qPCR]) assay detecting herpes simplex virus (HSV) and varicella-zoster virus (VZV) DNA together with an internal control was developed on the BD Max platform, combining automated DNA extraction and an open amplification procedure. Its performance was compared to that of the PCR assays routinely used in the laboratory, namely a laboratory-developed test for HSV DNA on the LightCycler instrument and a test using a commercial master mix for VZV DNA on the ABI7500fast system. Using a pool of negative cerebrospinal fluid (CSF) samples spiked with either calibrated controls for HSV-1 and VZV or dilutions of a clinical strain previously quantified for HSV-2, the empirical limit of detection of the BD Max assay was 195.65, 91.80, and 414.07 copies/ml for HSV-1, HSV-2, and VZV, respectively. All the samples from the HSV and VZV DNA quality control panels (Quality Control for Molecular Diagnostics [QCMD], 2013, Glasgow, United Kingdom) were correctly identified by the BD Max assay. Of 180 clinical specimens of various origins, 2 CSF samples were found invalid by the BD Max assay because the internal control was not detected; a concordance of 100% was observed between the BD Max assay and the corresponding routine tests. The BD Max assay detected the PCR signal 3 to 4 cycles earlier than the routine methods did. With results available within 2 h on a wide range of specimens, this sensitive and fully automated PCR assay exhibited the qualities required for simultaneously detecting HSV and VZV DNA on a routine basis. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  18. Automation Applications in an Advanced Air Traffic Management System : Volume 4A. Automation Requirements.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  19. Human-centered automation: Development of a philosophy

    NASA Technical Reports Server (NTRS)

    Graeber, Curtis; Billings, Charles E.

    1990-01-01

    Information on human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant the human management and control function in civil air transport.

  20. Sharing control between humans and automation using haptic interface: primary and secondary task performance benefits.

    PubMed

    Griffiths, Paul G; Gillespie, R Brent

    2005-01-01

    This paper describes a paradigm for human/automation control sharing in which the automation acts through a motor coupled to a machine's manual control interface. The manual interface becomes a haptic display, continually informing the human about automation actions. While monitoring by feel, users may choose either to conform to the automation or to override it and express their own control intentions. This paper's objective is to demonstrate that adding automation through haptic display can be used not only to improve performance on a primary task but also to reduce perceptual demands or free attention for a secondary task. Results are presented from three experiments in which 11 participants completed a lane-following task using a motorized steering wheel on a fixed-base driving simulator. The automation behaved like a copilot, assisting with lane following by applying torques to the steering wheel. Results indicate that haptic assist improves lane following by at least 30%, p < .0001, while reducing visual demand by 29%, p < .0001, or improving reaction time in a secondary tone-localization task by 18 ms, p = .0009. Potential applications of this research include the design of automation interfaces based on haptics that support human/automation control sharing better than traditional push-button automation interfaces.
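
    The shared-control scheme described above can be sketched as a torque summation at the steering wheel: the automation applies a corrective torque toward the lane center through the motor, the driver feels it through the haptic interface, and the driver may add torque to conform or override. The gains and saturation limit below are illustrative assumptions, not the study's tuned values.

        import numpy as np

        K_LANE = 4.0        # N*m per meter of lateral error (assumed gain)
        K_HEADING = 1.5     # N*m per radian of heading error (assumed gain)
        TAU_MAX = 3.0       # saturation so the driver can always override, N*m

        def assist_torque(lateral_error, heading_error):
            """Copilot torque applied through the steering-wheel motor."""
            tau = -(K_LANE * lateral_error + K_HEADING * heading_error)
            return float(np.clip(tau, -TAU_MAX, TAU_MAX))

        def wheel_torque(driver_torque, lateral_error, heading_error):
            """Net torque at the wheel: driver and automation act on the same interface."""
            return driver_torque + assist_torque(lateral_error, heading_error)

        print(wheel_torque(driver_torque=0.5, lateral_error=0.4, heading_error=0.02))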

  1. Optimizing the balance between task automation and human manual control in simulated submarine track management.

    PubMed

    Chen, Stephanie I; Visser, Troy A W; Huf, Samuel; Loft, Shayne

    2017-09-01

    Automation can improve operator performance and reduce workload, but can also degrade operator situation awareness (SA) and the ability to regain manual control. In 3 experiments, we examined the extent to which automation could be designed to benefit performance while ensuring that individuals maintained SA and could regain manual control. Participants completed a simulated submarine track management task under varying task load. The automation was designed to facilitate information acquisition and analysis, but did not make task decisions. Relative to a condition with no automation, the continuous use of automation improved performance and reduced subjective workload, but degraded SA. Automation that was engaged and disengaged by participants as required (adaptable automation) moderately improved performance and reduced workload relative to no automation, but degraded SA. Automation engaged and disengaged based on task load (adaptive automation) provided no benefit to performance or workload, and degraded SA relative to no automation. Automation never led to significant return-to-manual deficits. However, all types of automation led to degraded performance on a nonautomated task that shared information processing requirements with automated tasks. Given these outcomes, further research is urgently required to establish how to design automation to maximize performance while keeping operators cognitively engaged. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Bayesian deconvolution and quantification of metabolites in complex 1D NMR spectra using BATMAN.

    PubMed

    Hao, Jie; Liebeke, Manuel; Astle, William; De Iorio, Maria; Bundy, Jacob G; Ebbels, Timothy M D

    2014-01-01

    Data processing for 1D NMR spectra is a key bottleneck for metabolomic and other complex-mixture studies, particularly where quantitative data on individual metabolites are required. We present a protocol for automated metabolite deconvolution and quantification from complex NMR spectra by using the Bayesian automated metabolite analyzer for NMR (BATMAN) R package. BATMAN models resonances on the basis of a user-controllable set of templates, each of which specifies the chemical shifts, J-couplings and relative peak intensities for a single metabolite. Peaks are allowed to shift position slightly between spectra, and peak widths are allowed to vary by user-specified amounts. NMR signals not captured by the templates are modeled non-parametrically by using wavelets. The protocol covers setting up user template libraries, optimizing algorithmic input parameters, improving prior information on peak positions, quality control and evaluation of outputs. The outputs include relative concentration estimates for named metabolites together with associated Bayesian uncertainty estimates, as well as the fit of the remainder of the spectrum using wavelets. Graphical diagnostics allow the user to examine the quality of the fit for multiple spectra simultaneously. This approach offers a workflow to analyze large numbers of spectra and is expected to be useful in a wide range of metabolomics studies.
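
    BATMAN's full Bayesian model is beyond a short snippet, but its core idea, fitting a spectrum as a sum of metabolite templates with fixed multiplet patterns and free concentrations, can be sketched with ordinary least squares. The templates and spectrum below are synthetic, and BATMAN itself additionally models peak shifts, priors and a wavelet term for unassigned signal.

        import numpy as np

        ppm = np.linspace(0.5, 4.5, 2000)

        def lorentzian(ppm, center, width=0.005):
            return width**2 / ((ppm - center)**2 + width**2)

        def template(peaks):
            """A metabolite template: (chemical shift, relative intensity) pairs."""
            return sum(h * lorentzian(ppm, c) for c, h in peaks)

        # synthetic two-metabolite library (shifts and intensities are invented)
        T = np.column_stack([
            template([(1.33, 1.0), (4.11, 0.33)]),   # "metabolite A"
            template([(3.03, 1.0), (3.92, 0.5)]),    # "metabolite B"
        ])

        true_conc = np.array([2.0, 0.7])
        spectrum = T @ true_conc + np.random.default_rng(1).normal(0, 0.01, ppm.size)

        # least-squares estimate of relative concentrations
        conc, *_ = np.linalg.lstsq(T, spectrum, rcond=None)
        print(conc)   # approximately [2.0, 0.7]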

  3. Impedance-based cellular assays for regenerative medicine.

    PubMed

    Gamal, W; Wu, H; Underwood, I; Jia, J; Smith, S; Bagnaninchi, P O

    2018-07-05

    Therapies based on regenerative techniques have the potential to radically improve healthcare in the coming years. As a result, there is an emerging need for non-destructive and label-free technologies to assess the quality of engineered tissues and cell-based products prior to their use in the clinic. In parallel, the emerging regenerative medicine industry that aims to produce stem cells and their progeny on a large scale will benefit from moving away from existing destructive biochemical assays towards data-driven automation and control at the industrial scale. Impedance-based cellular assays (IBCA) have emerged as an alternative approach to study stem-cell properties and cumulative studies, reviewed here, have shown their potential to monitor stem-cell renewal, differentiation and maturation. They offer a novel method to non-destructively assess and quality-control stem-cell cultures. In addition, when combined with in vitro disease models they provide complementary insights as label-free phenotypic assays. IBCA provide quantitative and very sensitive results that can easily be automated and up-scaled in multi-well format. When facing the emerging challenge of real-time monitoring of three-dimensional cell culture dielectric spectroscopy and electrical impedance tomography represent viable alternatives to two-dimensional impedance sensing.This article is part of the theme issue 'Designer human tissue: coming to a lab near you'. © 2018 The Author(s).

  4. Quantitative analysis of ex vivo colorectal epithelium using an automated feature extraction algorithm for microendoscopy image data

    PubMed Central

    Prieto, Sandra P.; Lai, Keith K.; Laryea, Jonathan A.; Mizell, Jason S.; Muldoon, Timothy J.

    2016-01-01

    Qualitative screening for colorectal polyps via fiber bundle microendoscopy imaging has shown promising results, with studies reporting high rates of sensitivity and specificity, as well as low interobserver variability with trained clinicians. A quantitative image quality control and image feature extraction algorithm (QFEA) was designed to lessen the burden of training and provide objective data for improved clinical efficacy of this method. After a quantitative image quality control step, QFEA extracts field-of-view area, crypt area, crypt circularity, and crypt number per image. To develop and validate this QFEA, a training set of microendoscopy images was collected from freshly resected porcine colon epithelium. The algorithm was then further validated on ex vivo image data collected from eight human subjects, selected from clinically normal-appearing regions distant from grossly visible tumor in surgically resected colorectal tissue. QFEA has proven flexible in application to both mosaics and individual images, and its automated crypt detection sensitivity ranges from 71 to 94% despite intensity and contrast variation within the field of view. It also demonstrates the ability to detect and quantify differences in grossly normal regions among different subjects, suggesting the potential efficacy of this approach in detecting occult regions of dysplasia. PMID:27335893
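
    Of the features QFEA extracts, crypt area, circularity and count are standard region properties. A hedged sketch with scikit-image follows; the synthetic mask stands in for a segmented microendoscopy frame, and the segmentation step itself is omitted.

        import numpy as np
        from skimage import draw, measure

        # synthetic binary "crypt" mask standing in for a segmented frame
        img = np.zeros((200, 200), dtype=bool)
        rr, cc = draw.disk((60, 60), 20)
        img[rr, cc] = True
        rr, cc = draw.ellipse(140, 120, 15, 30)
        img[rr, cc] = True

        labels = measure.label(img)
        for region in measure.regionprops(labels):
            area = region.area
            perim = region.perimeter
            circularity = 4 * np.pi * area / perim**2   # 1.0 for a perfect circle
            print(f"crypt area={area}px, circularity={circularity:.2f}")
        print("crypt count:", labels.max())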

  5. An automated testing tool for traffic signal controller functionalities.

    DOT National Transportation Integrated Search

    2010-03-01

    The purpose of this project was to develop an automated tool that facilitates testing of traffic controller functionality using controller interface device (CID) technology. Benefits of such automated testers to traffic engineers include reduced test...

  6. Automating the Exchange of Military Personnel Data Among Selected Army Organizations. Volume II. Appendices,

    DTIC Science & Technology

    1981-06-30

    ...manpower needs as to quantity, quality and timing; all the internal functions of the personnel service are tapped to help meet these ends. Manpower... The remainder of the record preserves fragments of the report's acronym glossary: ACOS - Automated Computation of Service; ACQ - Acquisition; ACSAC - Assistant Chief of Staff for Automation and Communications; ACT - Automated...; ARSTAF - Army Staff; ARTEP - Army Training and Evaluation Program; ASI - Additional Skill Identifier; ASVAB - Armed Services...

  7. Automated Assessment of the Quality of Depression Websites

    PubMed Central

    Tang, Thanh Tin; Hawking, David; Christensen, Helen

    2005-01-01

    Background Since health information on the World Wide Web is of variable quality, methods are needed to assist consumers to identify health websites containing evidence-based information. Manual assessment tools may assist consumers to evaluate the quality of sites. However, these tools are poorly validated and often impractical. There is a need to develop better consumer tools, and in particular to explore the potential of automated procedures for evaluating the quality of health information on the web. Objective This study (1) describes the development of an automated quality assessment procedure (AQA) designed to automatically rank depression websites according to their evidence-based quality; (2) evaluates the validity of the AQA relative to human rated evidence-based quality scores; and (3) compares the validity of Google PageRank and the AQA as indicators of evidence-based quality. Method The AQA was developed using a quality feedback technique and a set of training websites previously rated manually according to their concordance with statements in the Oxford University Centre for Evidence-Based Mental Health’s guidelines for treating depression. The validation phase involved 30 websites compiled from the DMOZ, Yahoo! and LookSmart Depression Directories by randomly selecting six sites from each of the Google PageRank bands of 0, 1-2, 3-4, 5-6 and 7-8. Evidence-based ratings from two independent raters (based on concordance with the Oxford guidelines) were then compared with scores derived from the automated AQA and Google algorithms. There was no overlap in the websites used in the training and validation phases of the study. Results The correlation between the AQA score and the evidence-based ratings was high and significant (r=0.85, P<.001). Addition of a quadratic component improved the fit, the combined linear and quadratic model explaining 82 percent of the variance. The correlation between Google PageRank and the evidence-based score was lower than that for the AQA. When sites with zero PageRanks were included the association was weak and non-significant (r=0.23, P=.22). When sites with zero PageRanks were excluded, the correlation was moderate (r=.61, P=.002). Conclusions Depression websites of different evidence-based quality can be differentiated using an automated system. If replicable, generalizable to other health conditions and deployed in a consumer-friendly form, the automated procedure described here could represent an important advance for consumers of Internet medical information. PMID:16403723
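
    The validation reported here is a pair of correlations against human evidence-based ratings; with the scores in hand it is a few lines of analysis. The arrays below are random placeholders for the 30 rated sites, included only to show the computation.

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(7)
        human = rng.uniform(0, 30, 30)                 # evidence-based ratings
        aqa = 0.9 * human + rng.normal(0, 3, 30)       # automated AQA scores
        pagerank = np.round(rng.uniform(0, 8, 30))     # Google PageRank values

        r_aqa, _ = pearsonr(aqa, human)                # study reports r = 0.85
        r_pr, _ = pearsonr(pagerank, human)            # study reports r = 0.23 (all sites)
        nz = pagerank > 0
        r_pr_nz, _ = pearsonr(pagerank[nz], human[nz]) # study reports r = 0.61 (PageRank > 0)
        print(r_aqa, r_pr, r_pr_nz)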

  8. Automated power distribution system hardware. [for space station power supplies

    NASA Technical Reports Server (NTRS)

    Anderson, Paul M.; Martin, James A.; Thomason, Cindy

    1989-01-01

    An automated power distribution system testbed for the space station common modules has been developed. It incorporates automated control and monitoring of a utility-type power system. Automated power system switchgear, control and sensor hardware requirements, hardware design, test results, and potential applications are discussed. The system is designed so that the automated control and monitoring of the power system is compatible with both a 208-V, 20-kHz single-phase AC system and a high-voltage (120 to 150 V) DC system.

  9. Comparison of Manual Versus Automated Data Collection Method for an Evidence-Based Nursing Practice Study

    PubMed Central

    Byrne, M.D.; Jordan, T.R.; Welle, T.

    2013-01-01

    Objective The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. Methods A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually collected data set from the EBP project. Results Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 “false negative” patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error, including computational and transcription errors as well as incomplete selection of eligible patients. Conclusion Automated data collection for the analysis of nursing-specific phenomena is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment, with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare. PMID:23650488

  10. [A comparative study between the Vitek YBC and Microscan Walk Away RYID automated systems with conventional phenotypic methods for the identification of yeasts of clinical interest].

    PubMed

    Ferrara, Giuseppe; Mercedes Panizol, Maria; Mazzone, Marja; Delia Pequeneze, Maria; Reviakina, Vera

    2014-12-01

    The aim of this study was to compare the identification of clinically relevant yeasts by the Vitek YBC and Microscan Walk Away RYID automated methods with conventional phenotypic methods. One hundred ninety-three yeast strains isolated from clinical samples and five control strains were used. All the yeasts were identified by the automated methods previously mentioned and by conventional phenotypic methods such as carbohydrate assimilation, visualization of microscopic morphology on corn meal agar and the use of chromogenic agar. Variables were assessed by 2 x 2 contingency tables and McNemar's chi-square test; the Kappa index and concordance values were calculated, as well as major and minor errors for the automated methods. Yeasts were divided into two groups: (1) frequent isolation and (2) rare isolation. The Vitek YBC and Microscan Walk Away RYID systems were concordant with conventional phenotypic methods in 88.4% and 85.9% of cases, respectively. Although both automated systems can be used for yeast identification, the presence of major and minor errors indicates the possibility of misidentifications; therefore, the operator of this equipment must use phenotypic tests in parallel, such as visualization of microscopic morphology on corn meal agar and chromogenic agar, especially for infrequently isolated yeasts. Automated systems are a valuable tool; however, the expertise and judgment of the microbiologist are an important strength to ensure the quality of the results.

  11. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yung, J; Stefan, W; Reeve, D

    2015-06-15

    Purpose: Phantom measurements allow the performance of magnetic resonance (MR) systems to be evaluated. American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities; American College of Radiology (ACR) MR Accreditation Program phantom testing; and the ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allows easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open-source tools, employing Linux, Apache, a MySQL database and the Python programming language for the front end and back end. Results: Processing time for each image is <2 seconds. Figures are produced to show the regions of interest (ROIs) used for analysis. Historical data can be reviewed to compare against previous years and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high-quality, ACR-accredited MRI programs. A reviewable database of phantom measurements assists medical physicists with the processing and monitoring of large datasets. Longitudinal data can reveal trends that, although within passing criteria, indicate underlying system issues.
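
    One of the listed measurements, image uniformity, is conventionally reported as the percent integral uniformity (PIU) over a large region of interest in a homogeneous phantom. A minimal sketch follows; the single-ROI max/min and the synthetic image are simplifications of the full ACR procedure.

        import numpy as np

        def percent_integral_uniformity(image, center, radius):
            """PIU = 100 * (1 - (Smax - Smin)/(Smax + Smin)) over a circular ROI."""
            yy, xx = np.indices(image.shape)
            roi = (yy - center[0])**2 + (xx - center[1])**2 <= radius**2
            s = image[roi].astype(float)
            smax, smin = s.max(), s.min()
            return 100.0 * (1.0 - (smax - smin) / (smax + smin))

        # synthetic homogeneous-phantom slice with mild shading and noise
        rng = np.random.default_rng(3)
        img = 1000 + 20 * np.linspace(0, 1, 256)[None, :] + rng.normal(0, 5, (256, 256))
        print(f"PIU = {percent_integral_uniformity(img, (128, 128), 80):.1f}%")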

  12. Automation: how much is too much?

    PubMed

    Hancock, P A

    2014-01-01

    The headlong rush to automate continues apace. The dominant question remains whether we can automate, not whether we should automate. However, it is this latter question that is featured and considered explicitly here. The suggestion offered is that unlimited automation of all technical functions will eventually prove anathema to the fundamental quality of human life. Examples of tasks, pursuits and pastimes that should potentially be excused from the automation imperative are discussed. This deliberation leads us back to the question of balance in the cooperation, coordination and potential conflict between humans and the machines they create.

  13. Automation Applications in an Advanced Air Traffic Management System : Volume 4B. Automation Requirements (Concluded)

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  14. Quality Controlling CMIP datasets at GFDL

    NASA Astrophysics Data System (ADS)

    Horowitz, L. W.; Radhakrishnan, A.; Balaji, V.; Adcroft, A.; Krasting, J. P.; Nikonov, S.; Mason, E. E.; Schweitzer, R.; Nadeau, D.

    2017-12-01

    As GFDL makes the switch from model development to production in light of the Coupled Model Intercomparison Project (CMIP), GFDL's efforts have shifted to testing and, more importantly, to establishing guidelines and protocols for quality control and semi-automated data publishing. Every CMIP cycle introduces key challenges, and the upcoming CMIP6 is no exception. The new CMIP experimental design comprises multiple MIPs facilitating research in different focus areas. This paradigm has implications not only for the groups that develop the models and conduct the runs, but also for the groups that monitor, analyze and quality control the datasets before they are published and their findings make their way into reports such as the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports. In this talk, we discuss some of the paths taken at GFDL to quality control the CMIP-ready datasets, including: Jupyter notebooks; PrePARE; and a LAMP (Linux, Apache, MySQL, PHP/Python/Perl) technology-driven tracker system to monitor the status of experiments qualitatively and quantitatively and to provide additional metadata and analysis services, along with some built-in controlled-vocabulary validations in the workflow. In addition, we discuss the integration of community-based model evaluation software (ESMValTool, PCMDI Metrics Package, and ILAMB) as part of our CMIP6 workflow.
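
    Much of this kind of pre-publication QC reduces to checking that each output file carries the controlled-vocabulary attributes CMIP requires. A minimal sketch with the netCDF4 library follows; the required-attribute list is an abbreviated assumption, the file name is a placeholder, and PrePARE performs far more thorough table-based validation than this.

        from netCDF4 import Dataset

        REQUIRED_GLOBALS = ["experiment_id", "source_id", "variant_label",
                            "table_id", "grid_label"]   # abbreviated CMIP6 subset

        def check_cmip_file(path):
            """Report missing required global attributes in one CMIP output file."""
            with Dataset(path) as nc:
                present = set(nc.ncattrs())
            return [a for a in REQUIRED_GLOBALS if a not in present]

        missing = check_cmip_file("tas_Amon_model_historical_r1i1p1f1_gr.nc")
        print("missing attributes:", missing or "none")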

  15. Automated Adaptive Brightness in Wireless Capsule Endoscopy Using Image Segmentation and Sigmoid Function.

    PubMed

    Shrestha, Ravi; Mohammed, Shahed K; Hasan, Md Mehedi; Zhang, Xuechao; Wahid, Khan A

    2016-08-01

    Wireless capsule endoscopy (WCE) plays an important role in the diagnosis of gastrointestinal (GI) diseases by capturing images of the human small intestine. Accurate diagnosis of endoscopic images depends heavily on the quality of the captured images. Along with image resolution and frame rate, the brightness of the image is an important parameter influencing image quality, which motivates the design of an efficient illumination system. Such a design involves the choice and placement of a proper light source and its ability to illuminate the GI surface with proper brightness. Light-emitting diodes (LEDs) are normally used as sources, with modulated pulses controlling the LEDs' brightness. In practice, instances of under- and over-illumination are very common in WCE: the former produces dark images and the latter produces bright images with high power consumption. In this paper, we propose a low-power and efficient illumination system based on an automated brightness algorithm. The scheme is adaptive in nature, i.e., the brightness level is controlled automatically in real time while the images are being captured. Each captured image is segmented into four equal regions and the brightness level of each region is calculated. An adaptive sigmoid function is then used to find the optimized brightness level, and accordingly a new duty cycle of the modulated pulse is generated to capture future images. The algorithm is fully implemented in a capsule prototype and tested with endoscopic images. Commercial capsules such as Pillcam and Mirocam were also used in the experiment. The results show that the proposed algorithm works well in controlling the brightness level according to the environmental conditions, and as a result, good-quality images are captured at an average brightness level of 40%, which reduces the power consumption of the capsule.
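
    The algorithm as summarized (quadrant brightness averaging, a sigmoid mapping, and a new LED duty cycle) can be sketched as below. The gain, set point and duty-cycle bounds are illustrative assumptions, not the paper's tuned values.

        import numpy as np

        TARGET = 0.40        # desired mean brightness (assumed set point)
        GAIN = 8.0           # sigmoid steepness (assumed)
        DUTY_MIN, DUTY_MAX = 0.05, 0.95

        def region_brightness(frame):
            """Mean brightness of the four equal quadrants of a grayscale frame in [0, 1]."""
            h, w = frame.shape
            return np.array([frame[:h//2, :w//2].mean(), frame[:h//2, w//2:].mean(),
                             frame[h//2:, :w//2].mean(), frame[h//2:, w//2:].mean()])

        def next_duty_cycle(frame):
            """Sigmoid map from brightness error to the LED duty cycle for the next frame."""
            err = TARGET - region_brightness(frame).mean()   # >0 means under-illuminated
            duty = 1.0 / (1.0 + np.exp(-GAIN * err))         # sigmoid, in (0, 1)
            return float(np.clip(duty, DUTY_MIN, DUTY_MAX))

        dark = np.full((240, 240), 0.15)    # synthetic under-illuminated frame
        print(next_duty_cycle(dark))        # raises the duty cycle above 0.5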

  16. Coordinated joint motion control system with position error correction

    DOEpatents

    Danko, George [Reno, NV

    2011-11-22

    Disclosed are an articulated hydraulic machine supporting, control system and control method for same. The articulated hydraulic machine has an end effector for performing useful work. The control system is capable of controlling the end effector for automated movement along a preselected trajectory. The control system has a position error correction system to correct discrepancies between an actual end effector trajectory and a desired end effector trajectory. The correction system can employ one or more absolute position signals provided by one or more acceleration sensors supported by one or more movable machine elements. Good trajectory positioning and repeatability can be obtained. A two-joystick controller system is enabled, which can in some cases facilitate the operator's task and enhance their work quality and productivity.

  17. Coordinated joint motion control system with position error correction

    DOEpatents

    Danko, George L.

    2016-04-05

    Disclosed are an articulated hydraulic machine supporting, control system and control method for same. The articulated hydraulic machine has an end effector for performing useful work. The control system is capable of controlling the end effector for automated movement along a preselected trajectory. The control system has a position error correction system to correct discrepancies between an actual end effector trajectory and a desired end effector trajectory. The correction system can employ one or more absolute position signals provided by one or more acceleration sensors supported by one or more movable machine elements. Good trajectory positioning and repeatability can be obtained. A two joystick controller system is enabled, which can in some cases facilitate the operator's task and enhance their work quality and productivity.

  18. Toward a perceptual video-quality metric

    NASA Astrophysics Data System (ADS)

    Watson, Andrew B.

    1998-07-01

    The advent of widespread distribution of digital video creates a need for automated methods of evaluating the visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics, and the economic need to reduce bit rate to the lowest level that yields acceptable quality. In previous work, we developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. Here I describe a new video quality metric that is an extension of these still-image metrics into the time domain. Like the still-image metrics, it is based on the Discrete Cosine Transform. An effort has been made to minimize the amount of memory and computation required by the metric, so that it might be applied in the widest range of applications. To calibrate the basic sensitivity of this metric to spatial and temporal signals, we have measured visual thresholds for temporally varying samples of DCT quantization noise.
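
    A DCT-domain quality metric of the general kind described compares reference and test frames coefficient by coefficient under a visibility weighting. The sketch below uses a crude constant weighting matrix as a stand-in for the metric's human-vision model, so it illustrates the structure rather than the published metric.

        import numpy as np
        from scipy.fft import dctn

        def blockwise_dct(frame, n=8):
            """8x8 block DCT of a grayscale frame (dimensions divisible by n)."""
            h, w = frame.shape
            blocks = frame.reshape(h//n, n, w//n, n).swapaxes(1, 2)
            return dctn(blocks, axes=(2, 3), norm="ortho")

        def dct_quality_score(ref, test, weight=None):
            """Weighted DCT-coefficient error; lower means fewer visible artifacts."""
            d = blockwise_dct(test) - blockwise_dct(ref)
            if weight is None:
                weight = np.ones((8, 8))   # stand-in for a contrast-sensitivity weighting
            return float(np.sqrt(np.mean((weight * d) ** 2)))

        rng = np.random.default_rng(5)
        ref = rng.uniform(0, 255, (64, 64))
        test = ref + rng.normal(0, 4, ref.shape)   # simulated compression noise
        print(dct_quality_score(ref, test))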

  19. Health care quality measures for children and adolescents in Foster Care: feasibility testing in electronic records.

    PubMed

    Deans, Katherine J; Minneci, Peter C; Nacion, Kristine M; Leonhart, Karen; Cooper, Jennifer N; Scholle, Sarah Hudson; Kelleher, Kelly J

    2018-02-22

    Preventive quality measures for the foster care population are largely untested. The objective of this study was to identify healthcare quality measures for young children and adolescents in foster care and to test whether the data required to calculate these measures can be feasibly extracted and interpreted within electronic health records or within the Statewide Automated Child Welfare Information System. The AAP Recommendations for Preventive Pediatric Health Care served as the guideline for determining quality measures. Quality measures related to well-child visits, developmental screenings, immunizations, trauma-related care, BMI measurements, sexually transmitted infections and depression were defined. Retrospective chart reviews were performed on a cohort of children in foster care from a single large pediatric institution and the related county. Data available in the Ohio Statewide Automated Child Welfare Information System were compared with data for the same population studied in the electronic health record review. Quality measures were calculated as observed-to-expected ratios (O/E ratios) of received to recommended care, describing the actual quantity of recommended health care received by individual children. Electronic health records and Statewide Automated Child Welfare Information System data frequently lacked important information on foster care youth essential for calculating the measures. Although electronic health records were rich in encounter-specific clinical data, they often lacked custodial information such as the dates of entry into and exit from foster care. In contrast, the Statewide Automated Child Welfare Information System included robust data on custodial arrangements but lacked detailed medical information. Despite these limitations, several quality measures were devised that attempted to accommodate them. In this feasibility testing, neither the electronic health records at a single institution nor the county-level Statewide Automated Child Welfare Information System was able to independently serve as a reliable source of data for healthcare quality measures for foster care youth. However, the ability to leverage both sources by matching them at the individual level may provide the complement of data necessary to assess the quality of healthcare.
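
    The observed-to-expected ratios used here are simple per-child quotients of received to recommended care; a sketch follows, with invented visit counts.

        def oe_ratio(received, recommended):
            """Observed/expected ratio: share of recommended care actually received."""
            return received / recommended if recommended else float("nan")

        # invented example: well-child visits received vs. recommended while in care
        children = {"child_a": (3, 6), "child_b": (5, 5), "child_c": (1, 4)}
        for cid, (obs, exp) in children.items():
            print(cid, f"O/E = {oe_ratio(obs, exp):.2f}")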

  20. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.

  1. Intelligent robot trends for 1998

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    1998-10-01

    An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent technical and economic trends. Technically, the machines are faster, cheaper, more repeatable, more reliable and safer. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision and motion controls. New areas of usage are emerging for service robots, remote manipulators and automated guided vehicles. Economically, the robotics industry now has a 1.1 billion-dollar market in the U.S. and is growing. Feasibility study results are presented which also show decreasing costs for robots and healthy (if unaudited) rates of return for a variety of robotic applications. However, the road from inspiration to successful application can be long and difficult, often taking decades to achieve a new product. A greater emphasis on mechatronics is needed in our universities. Certainly, more cooperation between government, industry and universities is needed to speed the development of intelligent robots that will benefit industry and society.

  2. Psychological Effects of Automated External Defibrillator Training: A Randomized Trial

    PubMed Central

    Meischke, Hendrika; Diehr, Paula; Phelps, Randi; Damon, Susan; Rea, Tom

    2011-01-01

    Objectives The objective of this study was to test whether an Automated External Defibrillator (AED) training program would positively affect the mental health of family members of high-risk patients. Methods 305 ischemic heart disease patients and their family members were randomized to one of four AED training programs: two video-based training programs and two face-to-face training programs that emphasized self-efficacy and perceived control. Patients and family members were surveyed at baseline and at 3 and 9 months post ischemic event on demographic characteristics, measures of quality of life (SF-36), self-efficacy and perceived control. For this study, family members were the focus rather than the patients. Results Regression analyses showed that family members in the face-to-face training programs did not score better on any of the mental health status variables than family members who participated in the other training programs, except for an increase in self-efficacy beliefs at 3 months post training. Conclusion The findings suggest that a specifically designed AED training program emphasizing self-efficacy and perceived control beliefs is not likely to enhance family member mental health. PMID:21411144

  3. Adaptive automation of human-machine system information-processing functions.

    PubMed

    Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P

    2005-01-01

    The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.

  4. 1366 Project Automate: Enabling Automation for <$0.10/W High-Efficiency Kerfless Wafers Manufactured in the US

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenz, Adam

    For photovoltaic (PV) manufacturing to thrive in the U.S., there must be an innovative core to the technology. Project Automate builds on 1366's proprietary Direct Wafer® kerfless wafer technology and aims to unlock the cost and efficiency advantages of thin kerfless wafers. Direct Wafer is an innovative, U.S.-friendly (efficient, low-labor content) manufacturing process that addresses the main cost barrier limiting silicon PV cost reductions: the 35-year-old grand challenge of manufacturing quality wafers (40% of the cost of modules) without the cost and waste of sawing. This simple, scalable process will allow 1366 to manufacture "drop-in" replacement wafers for the $10 billion silicon PV wafer market at 50% of the cost, 60% of the capital, and 30% of the electricity of conventional casting and sawing manufacturing processes. This SolarMat project developed the Direct Wafer process's unique capability to tailor the shape of wafers to simultaneously make thinner AND stronger wafers (with lower silicon usage) that enable high-efficiency cell architectures. By producing wafers with a unique target geometry including a thick border (which determines handling characteristics) and thin interior regions (which control light capture and electron transport and therefore determine efficiency), 1366 can simultaneously improve quality and lower cost (using less silicon).

  5. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives.

    PubMed

    Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini

    2018-08-01

    Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will become more critical to maintain product quality and consistency for downstream processing in a biorefinery and to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity (the capability of monitoring several analytes simultaneously), in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation, to aid and motivate development.

  6. Robotic voltammetry with carbon nanotube-based sensors: a superb blend for convenient high-quality antimicrobial trace analysis

    PubMed Central

    Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert

    2015-01-01

    A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification with electrocatalytic activity for analyte detection with the merits of a robotic electrochemical device that is capable of sequential nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV), and linear ranges of 1–10 μM and 2–100 μM were achieved for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples with added NFX. The use of simple instrumentation, convenience in execution, and high effectiveness in analyte quantitation recommend the merger of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection as a novel methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories. PMID:25670899
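
    The calibration step described above is a linear least-squares fit over each drug's linear range. A hedged sketch, assuming the common 3*sigma/slope criterion for the lowest detectable level (the abstract does not state which estimator was used, and the calibration points below are invented):

        import numpy as np

        def dpv_calibration(conc_uM, peak_current_uA):
            """Calibration line and estimated detection limit for DPV peak currents."""
            conc = np.asarray(conc_uM, float)
            i_p = np.asarray(peak_current_uA, float)
            slope, intercept = np.polyfit(conc, i_p, 1)
            residual_sd = np.std(i_p - (slope * conc + intercept), ddof=2)
            lod = 3.0 * residual_sd / slope   # 3*sigma/slope convention (assumed)
            return slope, intercept, lod

        # hypothetical norfloxacin points within the reported 1-10 uM linear range
        slope, intercept, lod = dpv_calibration([1, 2, 4, 6, 8, 10],
                                                [0.9, 2.1, 4.0, 6.2, 7.9, 10.1])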

  7. Associated Effects of Automated Essay Evaluation Software on Growth in Writing Quality for Students with and without Disabilities

    ERIC Educational Resources Information Center

    Wilson, Joshua

    2017-01-01

    The present study examined growth in writing quality associated with feedback provided by an automated essay evaluation system called PEG Writing. Equal numbers of students with disabilities (SWD) and typically-developing students (TD) matched on prior writing achievement were sampled (n = 1196 total). Data from a subsample of students (n = 655)…

  8. Hospital adoption of automated surveillance technology and the implementation of infection prevention and control programs.

    PubMed

    Halpin, Helen; Shortell, Stephen M; Milstein, Arnold; Vanneman, Megan

    2011-05-01

    This research analyzes the relationship between hospital use of automated surveillance technology (AST) for identification and control of hospital-acquired infections (HAI) and implementation of evidence-based infection control practices. Our hypothesis is that hospitals that use AST have made more progress implementing infection control practices than hospitals that rely on manual surveillance. A survey of all acute general care hospitals in California was conducted from October 2008 through January 2009. A structured computer-assisted telephone interview was conducted with the quality director of each hospital. The final sample includes 241 general acute care hospitals (response rate, 83%). Approximately one third (32.4%) of California's hospitals use AST for monitoring HAI. Adoption of AST is positively and statistically significantly associated with the depth of implementation of evidence-based practices for methicillin-resistant Staphylococcus aureus and ventilator-associated pneumonia and with adoption of contact precautions and surgical care infection practices. Use of AST is also statistically significantly associated with the breadth of hospital implementation of evidence-based practices across all 5 targeted HAI. Our findings suggest that hospitals using AST can achieve greater depth and breadth in implementing evidence-based infection control practices. Copyright © 2011 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.

  9. Mapping the Recent US Hurricanes Triggered Flood Events in Near Real Time

    NASA Astrophysics Data System (ADS)

    Shen, X.; Lazin, R.; Anagnostou, E. N.; Wanik, D. W.; Brakenridge, G. R.

    2017-12-01

    Synthetic Aperture Radar (SAR) observations are the only reliable remote sensing data source for mapping flood inundation during severe weather events. Unfortunately, since state-of-the-art data processing algorithms cannot meet the automation and quality standards of a near-real-time (NRT) system, quality-controlled inundation mapping by SAR currently depends heavily on manual processing, which limits our capability to quickly issue flood inundation maps at global scale. Specifically, most SAR-based inundation mapping algorithms are not fully automated, while those that are automated exhibit severe over- and/or under-detection errors that limit their potential. These detection errors are primarily caused by the strong overlap among the SAR backscattering probability density functions (PDF) of different land cover types. In this study, we tested a newly developed NRT SAR-based inundation mapping system, named Radar Produced Inundation Diary (RAPID), using Sentinel-1 dual polarized SAR data over recent flood events caused by Hurricanes Harvey, Irma, and Maria (2017). The system consists of 1) self-optimized multi-threshold classification, 2) over-detection removal using land-cover information and change detection, 3) under-detection compensation, and 4) machine-learning based correction. Algorithm details are introduced in another poster, H53J-1603. Good agreement was obtained by comparing the results from RAPID with visual interpretation of SAR images and manual processing from the Dartmouth Flood Observatory (DFO) (See Figure 1). Specifically, the over- and under-detections that are typically noted in automated methods are reduced to negligible levels. This performance indicates that RAPID can address the automation and accuracy issues of current state-of-the-art algorithms and has the potential to be applied operationally to a number of satellite SAR missions, such as SWOT, ALOS, and Sentinel. RAPID data can support many applications such as rapid assessment of damage losses and disaster alleviation/rescue at global scale.
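
    RAPID's self-optimized multi-threshold classification is considerably more elaborate, but the core idea of splitting a backscatter histogram into water and non-water classes can be illustrated with a single global Otsu threshold (a simplified stand-in, not the RAPID algorithm):

        import numpy as np

        def otsu_threshold(backscatter_db, bins=256):
            """Threshold maximizing the between-class variance of the histogram."""
            hist, edges = np.histogram(backscatter_db, bins=bins)
            p = hist.astype(float) / hist.sum()
            centers = 0.5 * (edges[:-1] + edges[1:])
            w0 = np.cumsum(p)             # probability of the low (water) class
            mu = np.cumsum(p * centers)   # cumulative mean
            with np.errstate(divide='ignore', invalid='ignore'):
                between = (mu[-1] * w0 - mu) ** 2 / (w0 * (1.0 - w0))
            return centers[np.nanargmax(between)]

        def map_water(backscatter_db):
            # open water is usually darker (lower dB) than surrounding land
            return backscatter_db < otsu_threshold(backscatter_db)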

  10. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include balancing automated and manual processing, bridging legacy data management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.
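
    As an illustration of the simplest kind of automated data QA/QC check mentioned above, a plausible-range check might look like the following sketch (the variable names follow the FP-In convention, but the limits are invented, and the production pipeline applies many further checks: units, time drift, multivariate correlations, radiation shadows):

        import numpy as np

        # hypothetical plausible ranges per variable
        PLAUSIBLE_RANGE = {
            "TA": (-50.0, 50.0),      # air temperature, deg C
            "RH": (0.0, 100.0),       # relative humidity, %
            "SW_IN": (0.0, 1400.0),   # incoming shortwave, W m-2
        }

        def range_check(name, values):
            """Flag observations outside the plausible physical range."""
            lo, hi = PLAUSIBLE_RANGE[name]
            values = np.asarray(values, float)
            bad = (values < lo) | (values > hi)
            return {"variable": name,
                    "n_flagged": int(bad.sum()),
                    "flagged_idx": np.flatnonzero(bad).tolist()}

        print(range_check("TA", [12.3, 14.1, 999.0, 13.8]))  # flags 999.0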

  11. Current Level of Mission Control Automation at NASA/Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Maks, Lori; Breed, Julie; Rackley, Michael; Powers, Edward I. (Technical Monitor)

    2001-01-01

    NASA is particularly concerned with reducing mission operations costs through increased automation. This paper examines the operations procedures within NASA Mission Control Centers in order to uncover the level of automation that currently exists within them. Based on an assessment of mission operations procedures within three representative control centers, this paper recommends specific areas where there is potential for mission cost reduction through increased automation.

  12. In situ measurement and control of processing properties of composite resins in a production tool

    NASA Technical Reports Server (NTRS)

    Kranbuehl, D.; Hoff, M.; Haverty, P.; Loos, A.; Freeman, T.

    1988-01-01

    An in situ measuring technique for use in automated composite processing and quality control is discussed. Frequency-dependent electromagnetic sensors are used to measure processing parameters at four ply positions inside a thick-section 192-ply graphite-epoxy composite during cure in an 8 x 4 in. autoclave. Viscosity measurements obtained using the sensors are compared with the viscosities calculated using the Loos-Springer cure process model. Good overall agreement is obtained. In a subsequent autoclave run, the output from the four sensors was used to control the autoclave temperature. Using this closed-loop, sensor-controlled autoclave temperature resulted in a more uniform and more rapid cure cycle.
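
    As an illustration of what such closed-loop control might look like, below is a hypothetical proportional update of the autoclave setpoint driven by the four sensor readings; the paper's actual control law is not specified in the abstract, so every name and constant here is an assumption:

        def autoclave_setpoint(sensor_viscosities, t_current,
                               visc_target=100.0, gain=0.05,
                               t_min=120.0, t_max=180.0):
            """One hypothetical control step: nudge the setpoint (deg C) to keep
            the slowest-curing ply's viscosity (Pa*s) on target."""
            worst = max(sensor_viscosities)       # slowest-curing ply dominates
            error = worst - visc_target
            t_new = t_current + gain * error      # heat more if viscosity is too high
            return min(max(t_new, t_min), t_max)  # clamp to a safe range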

  13. Driving Performance After Self-Regulated Control Transitions in Highly Automated Vehicles.

    PubMed

    Eriksson, Alexander; Stanton, Neville A

    2017-12-01

    This study aims to explore whether driver-paced, noncritical transitions of control may counteract some of the aftereffects observed in the contemporary literature, resulting in higher levels of vehicle control. Research into control transitions in highly automated driving has focused on urgent scenarios where drivers are given a relatively short time span to respond to a request to resume manual control, resulting in seemingly scrambled control when manual control is resumed. Twenty-six drivers drove two scenarios with an automated driving feature activated. Drivers were asked to read a newspaper or monitor the system and to relinquish or resume control from the automation when prompted by vehicle systems. Driving performance in terms of lane positioning and steering behavior was assessed for the 20 seconds after control was resumed, to capture the resulting level of control. It was found that lane positioning was virtually unaffected for the duration of the 20-second time span in both automated conditions compared to the manual baseline when drivers resumed manual control; however, significant increases in the standard deviation of steering input were found for both automated conditions compared to baseline. No significant differences were found between the two automated conditions. The results indicate that when drivers self-paced the transfer back to manual control they exhibited fewer of the detrimental effects observed in system-paced conditions. It was shown that self-paced transitions could reduce the risk of accidents near the edge of the operational design domain. Vehicle manufacturers must consider these benefits when designing contemporary systems.

  14. Performance of automated scoring of ER, PR, HER2, CK5/6 and EGFR in breast cancer tissue microarrays in the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Blows, Fiona M; Provenzano, Elena; Brook, Mark N; Morris, Lorna; Gazinska, Patrycja; Johnson, Nicola; McDuffus, Leigh‐Anne; Miller, Jodi; Sawyer, Elinor J; Pinder, Sarah; van Deurzen, Carolien H M; Jones, Louise; Sironen, Reijo; Visscher, Daniel; Caldas, Carlos; Daley, Frances; Coulson, Penny; Broeks, Annegien; Sanders, Joyce; Wesseling, Jelle; Nevanlinna, Heli; Fagerholm, Rainer; Blomqvist, Carl; Heikkilä, Päivi; Ali, H Raza; Dawson, Sarah‐Jane; Figueroa, Jonine; Lissowska, Jolanta; Brinton, Louise; Mannermaa, Arto; Kataja, Vesa; Kosma, Veli‐Matti; Cox, Angela; Brock, Ian W; Cross, Simon S; Reed, Malcolm W; Couch, Fergus J; Olson, Janet E; Devillee, Peter; Mesker, Wilma E; Seyaneve, Caroline M; Hollestelle, Antoinette; Benitez, Javier; Perez, Jose Ignacio Arias; Menéndez, Primitiva; Bolla, Manjeet K; Easton, Douglas F; Schmidt, Marjanka K; Pharoah, Paul D; Sherman, Mark E

    2014-01-01

    Breast cancer risk factors and clinical outcomes vary by tumour marker expression. However, individual studies often lack the power required to assess these relationships, and large‐scale analyses are limited by the need for high throughput, standardized scoring methods. To address these limitations, we assessed whether automated image analysis of immunohistochemically stained tissue microarrays can permit rapid, standardized scoring of tumour markers from multiple studies. Tissue microarray sections prepared in nine studies containing 20 263 cores from 8267 breast cancers stained for two nuclear (oestrogen receptor, progesterone receptor), two membranous (human epidermal growth factor receptor 2 and epidermal growth factor receptor) and one cytoplasmic (cytokeratin 5/6) marker were scanned as digital images. Automated algorithms were used to score markers in tumour cells using the Ariol system. We compared automated scores against visual reads, and their associations with breast cancer survival. Approximately 65–70% of tissue microarray cores were satisfactory for scoring. Among satisfactory cores, agreement between dichotomous automated and visual scores was highest for oestrogen receptor (Kappa = 0.76), followed by human epidermal growth factor receptor 2 (Kappa = 0.69) and progesterone receptor (Kappa = 0.67). Automated quantitative scores for these markers were associated with hazard ratios for breast cancer mortality in a dose‐response manner. Considering visual scores of epidermal growth factor receptor or cytokeratin 5/6 as the reference, automated scoring achieved excellent negative predictive value (96–98%), but yielded many false positives (positive predictive value = 30–32%). For all markers, we observed substantial heterogeneity in automated scoring performance across tissue microarrays. Automated analysis is a potentially useful tool for large‐scale, quantitative scoring of immunohistochemically stained tissue microarrays available in consortia. However, continued optimization, rigorous marker‐specific quality control measures and standardization of tissue microarray designs, staining and scoring protocols are needed to enhance results. PMID:27499890
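
    Agreement between the dichotomous automated and visual scores above is reported as Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch of the computation (the example calls are invented):

        import numpy as np

        def cohens_kappa(rater_a, rater_b):
            """Cohen's kappa for two dichotomous scorers (e.g. automated vs visual)."""
            a = np.asarray(rater_a, bool)
            b = np.asarray(rater_b, bool)
            p_observed = np.mean(a == b)
            # chance agreement from the raters' marginal positive rates
            p_a, p_b = a.mean(), b.mean()
            p_chance = p_a * p_b + (1 - p_a) * (1 - p_b)
            return (p_observed - p_chance) / (1 - p_chance)

        print(cohens_kappa([1, 1, 0, 1, 0, 0], [1, 1, 0, 0, 0, 0]))  # ~0.67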

  15. Benefits of an automated GLP final report preparation software solution.

    PubMed

    Elvebak, Larry E

    2011-07-01

    The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.

  16. Low-Dose, High-Frequency CPR Training Improves Skill Retention of In-Hospital Pediatric Providers

    PubMed Central

    Niles, Dana; Meaney, Peter A.; Aplenc, Richard; French, Benjamin; Abella, Benjamin S.; Lengetti, Evelyn L.; Berg, Robert A.; Helfaer, Mark A.; Nadkarni, Vinay

    2011-01-01

    OBJECTIVE: To investigate the effectiveness of brief bedside cardiopulmonary resuscitation (CPR) training to improve the skill retention of hospital-based pediatric providers. We hypothesized that a low-dose, high-frequency training program (booster training) would improve CPR skill retention. PATIENTS AND METHODS: CPR recording/feedback defibrillators were used to evaluate CPR quality during simulated arrest. Basic life support–certified, hospital-based providers were randomly assigned to 1 of 4 study arms: (1) instructor-only training; (2) automated defibrillator feedback only; (3) instructor training combined with automated feedback; and (4) control (no structured training). Each session (time: 0, 1, 3, and 6 months after training) consisted of a pretraining evaluation (60 seconds), booster training (120 seconds), and a posttraining evaluation (60 seconds). Excellent CPR was defined as chest compression (CC) depth ≥ one-third anterior-posterior chest depth, rate ≥ 90 and ≤120 CC per minute, ≤20% of CCs with incomplete release (>2500 g), and no-flow fraction ≤ 0.30. MEASUREMENTS AND MAIN RESULTS: Eighty-nine providers were randomly assigned; 74 (83%) completed all sessions. Retention of CPR skills was 2.3 times (95% confidence interval [CI]: 1.1–4.5; P = .02) more likely after 2 trainings and 2.9 times (95% CI: 1.4–6.2; P = .005) more likely after 3 trainings. The automated defibrillator feedback only group had lower retention rates compared with the instructor-only training group (odds ratio: 0.41 [95% CI: 0.17–0.97]; P = .043). CONCLUSIONS: Brief bedside booster CPR training improves CPR skill retention. Our data reveal that instructor-led training improves retention compared with automated feedback training alone. Future studies should investigate whether bedside training improves CPR quality during actual pediatric arrests. PMID:21646262
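
    The "excellent CPR" definition above is a conjunction of four quantitative criteria, so it reduces to a small predicate. A sketch with hypothetical argument names (the recording defibrillators compute these summary values internally; treating the depth criterion as a mean over the evaluation is an assumption):

        def excellent_cpr(mean_depth_mm, chest_depth_mm, rate_cpm,
                          frac_incomplete_release, no_flow_fraction):
            """True if one 60 s evaluation meets all four study criteria."""
            depth_ok = mean_depth_mm >= chest_depth_mm / 3.0
            rate_ok = 90 <= rate_cpm <= 120
            release_ok = frac_incomplete_release <= 0.20  # release force > 2500 g
            flow_ok = no_flow_fraction <= 0.30
            return depth_ok and rate_ok and release_ok and flow_ok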

  17. Automated MAD and MIR structure solution

    PubMed Central

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These criteria have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations. PMID:10089316

  18. Centralized automated quality assurance for large scale health care systems. A pilot method for some aspects of dental radiography.

    PubMed

    Benn, D K; Minden, N J; Pettigrew, J C; Shim, M

    1994-08-01

    President Clinton's Health Security Act proposes the formation of large scale health plans with improved quality assurance. Dental radiography consumes 4% ($1.2 billion in 1990) of total dental expenditure yet regular systematic office quality assurance is not performed. A pilot automated method is described for assessing density of exposed film and fogging of unexposed processed film. A workstation and camera were used to input intraoral radiographs. Test images were produced from a phantom jaw with increasing exposure times. Two radiologists subjectively classified the images as too light, acceptable, or too dark. A computer program automatically classified global grey level histograms from the test images as too light, acceptable, or too dark. The program correctly classified 95% of 88 clinical films. Optical density of unexposed film in the range 0.15 to 0.52 measured by computer was reliable to better than 0.01. Further work is needed to see if comprehensive centralized automated radiographic quality assurance systems with feedback to dentists are feasible, are able to improve quality, and are significantly cheaper than conventional clerical methods.
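
    The automated classification described above reduces to summarizing the global grey-level histogram and comparing it with cut-points learned from the radiologists' gradings. A hedged sketch with invented cut-points:

        import numpy as np

        def classify_film(image, light_cut=0.35, dark_cut=0.65):
            """Classify a digitized radiograph from its grey-level histogram.

            image : 2-D array of grey levels in [0, 1], 0 = clear film, 1 = black.
            The cut-points are illustrative; the pilot study tuned its decision
            rule against phantom images graded by two radiologists.
            """
            hist, edges = np.histogram(image, bins=64, range=(0.0, 1.0))
            centers = 0.5 * (edges[:-1] + edges[1:])
            mean_grey = np.average(centers, weights=hist)
            if mean_grey < light_cut:
                return "too light"
            if mean_grey > dark_cut:
                return "too dark"
            return "acceptable"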

  19. Explicit control of adaptive automation under different levels of environmental stress.

    PubMed

    Sauer, Jürgen; Kao, Chung-Shan; Wastell, David; Nickel, Peter

    2011-08-01

    This article examines the effectiveness of three different forms of explicit control of adaptive automation under low- and high-stress conditions, operationalised by different levels of noise. In total, 60 participants were assigned to one of three types of automation design (free, prompted and forced choice). They were trained for 4 h on a highly automated simulation of a process control environment, called AutoCAMS. This was followed by a 4-h testing session under noise exposure and quiet conditions. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that all three modes of explicit control of adaptive automation modes were able to attenuate the negative effects of noise. This was partly due to the fact that operators opted for higher levels of automation under noise. It also emerged that forced choice showed marginal advantages over the two other automation modes. Statement of Relevance: This work is relevant to the design of adaptive automation since it emphasises the need to consider the impact of work-related stressors during task completion. During the presence of stressors, different forms of operator support through automation may be required than under more favourable working conditions.

  20. Implementation of Quality Management in Core Service Laboratories

    PubMed Central

    Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.

    2010-01-01

    CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.

  1. Management of laser welding based on analysis informative signals

    NASA Astrophysics Data System (ADS)

    Zvezdin, V. V.; Rakhimov, R. R.; Saubanov, Ruz R.; Israfilov, I. H.; Akhtiamov, R. F.

    2017-09-01

    Features of the formation of precision welds in metal are presented. It is shown that the quality of the welding process depends not only on the energy characteristics of the laser processing facility and the temperature of the surface layer, but also on the accuracy of positioning of the laser focus relative to the seam and the workpiece surface. The laser focus positioning accuracy therefore serves as an estimate of the quality of the welding process. This approach makes it possible to build an automated control system for the laser technological complex that stabilizes the positioning accuracy of the laser beam relative to the workpiece surface.

  2. Towards cooperative guidance and control of highly automated vehicles: H-Mode and Conduct-by-Wire.

    PubMed

    Flemisch, Frank Ole; Bengler, Klaus; Bubb, Heiner; Winner, Hermann; Bruder, Ralph

    2014-01-01

    This article provides a general ergonomic framework of cooperative guidance and control for vehicles, with an emphasis on the cooperation between a human and a highly automated vehicle. In the twenty-first century, mobility and automation technologies are increasingly fused. In the sky, highly automated aircraft are flying with a high safety record. On the ground, a variety of driver assistance systems are being developed, and highly automated vehicles with increasingly autonomous capabilities are becoming possible. Human-centred automation has paved the way for a better cooperation between automation and humans. How can these highly automated systems be structured so that they can be easily understood, and how will they cooperate with the human? The presented research was conducted using the method of iterative build-up and refinement of a framework by triangulation, i.e. by instantiating and testing the framework with at least two derived concepts and prototypes. This article sketches a general, conceptual ergonomic framework of cooperative guidance and control of highly automated vehicles, two concepts derived from the framework, prototypes and pilot data. Cooperation is exemplified in a list of aspects and related to levels of the driving task. With the concept 'Conduct-by-Wire', cooperation happens mainly on the guidance level, where the driver can delegate manoeuvres to the automation with a specialised manoeuvre interface. With H-Mode, a haptic-multimodal interaction with highly automated vehicles based on the H(orse)-Metaphor, cooperation happens mainly on the guidance and control levels through a haptically active interface. Cooperativeness should be a key aspect for future human-automation systems. Especially for highly automated vehicles, cooperative guidance and control is a research direction with already promising concepts and prototypes that should be further explored. The presented approach applies to any human-machine system that moves and includes high levels of assistance/automation.

  3. Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.

    PubMed

    Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S

    2013-03-01

    Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform, which enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.

  4. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality.

    PubMed

    Albert, Océane; Reintsch, Wolfgang E; Chan, Peter; Robaire, Bernard

    2016-05-01

    Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. The study comprised two distinct components: an HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3-5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; and on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi-manual analysis software. Using this method, a cross-sectional study on 123 men showed no significant correlation between sperm concentration and sperm DNA damage, confirming the existence of hidden chromatin damage in men with apparently normal semen characteristics, and a significant correlation between percentage DNA in the tail and percentage of progressively motile spermatozoa. Finally, the use of DNA damage profiles helped to distinguish subjects between and within sperm concentration categories, and allowed a determination of the proportion of highly damaged cells. The main limitations of the HT-COMET are the high, yet indispensable, investment in an automated liquid handling system and heating block to ensure accuracy, and the availability of an automated plate reading microscope and analysis software. This standardized HT-COMET assay offers many advantages, including higher accuracy and evenness due to automation of sensitive steps, a 14.4-fold increase in sample analysis capacity, and an imaging and scoring time of 1 min/well. Overall, HT-COMET offers a decrease in total experimental time of more than 90%. Hence, this assay constitutes a more efficient option to assess sperm chromatin quality, paves the way to using this assay to screen large cohorts, and holds prognostic value for infertile patients. Funded by the CIHR Institute of Human Development, Child and Youth Health (IHDCYH; RHF 100625). O.A. is a fellow supported by the Fonds de la Recherche du Québec - Santé (FRQS) and the CIHR Training Program in Reproduction, Early Development, and the Impact on Health (REDIH). B.R. is a James McGill Professor. The authors declare no conflicts of interest. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Evaporation from weighing precipitation gauges: impacts on automated gauge measurements and quality assurance methods

    NASA Astrophysics Data System (ADS)

    Leeper, R. D.; Kochendorfer, J.

    2015-06-01

    Evaporation from a precipitation gauge can cause errors in the amount of measured precipitation. For automated weighing-bucket gauges, the World Meteorological Organization (WMO) suggests the use of evaporative suppressants and frequent observations to limit these biases. However, the use of evaporation suppressants is not always feasible due to environmental hazards and the added cost of maintenance, transport, and disposal of the gauge additive. In addition, research has suggested that evaporation prior to precipitation may affect precipitation measurements from auto-recording gauges operating at sub-hourly frequencies. For further evaluation, a field campaign was conducted to monitor evaporation and its impacts on the quality of precipitation measurements from gauges used at U.S. Climate Reference Network (USCRN) stations. Two Geonor gauges were collocated, one using an evaporative suppressant (referred to as Geonor-NonEvap) and the other with no suppressant (referred to as Geonor-Evap), to evaluate evaporative losses and evaporation biases on precipitation measurements. From June to August, evaporative losses from the Geonor-Evap gauge exceeded accumulated precipitation, with an average loss of 0.12 mm/h. The impact of evaporation on precipitation measurements was sensitive to the choice of calculation method. In general, the pairwise method that utilized a longer time series to smooth out sensor noise was more sensitive to gauge evaporation (-4.6% bias with respect to control) than the weighted-average method that calculated depth change over a smaller window (<+1% bias). These results indicate that while climate and gauge design affect gauge evaporation rates, computational methods also influence the magnitude of evaporation biases on precipitation measurements. This study can be used to advance quality assurance (QA) techniques used in other automated networks to mitigate the impact of evaporation biases on precipitation measurements.
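
    The two calculation methods compared above differ mainly in how much of the bucket-depth time series each precipitation estimate spans. A simplified sketch of both (the lag, window length and simple averaging are assumptions, not the USCRN implementation):

        import numpy as np

        def depth_change_pairwise(depth_mm, lag=12):
            """Pairwise method: difference the depth across a long lag (e.g. 12
            five-minute samples) to smooth sensor noise; slow evaporative losses
            between events accumulate into the longer difference."""
            d = np.asarray(depth_mm, float)
            return d[lag:] - d[:-lag]

        def depth_change_weighted(depth_mm, window=3):
            """Weighted-average method: smooth over a short moving window, then
            difference adjacent values, limiting exposure to evaporation."""
            d = np.asarray(depth_mm, float)
            smoothed = np.convolve(d, np.ones(window) / window, mode='valid')
            return np.diff(smoothed)

        def precipitation(delta_mm):
            # only positive depth changes are counted as precipitation
            return np.clip(delta_mm, 0.0, None)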

  6. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely used as yet for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  7. Boredom and monotony as a consequence of automation : a consideration of the evidence relating boredom and monotony to stress.

    DOT National Transportation Integrated Search

    1980-02-01

    As air traffic control becomes increasingly automated, the various implications of this trend should be considered. One of the likely byproducts of highly automated air traffic control systems is an increase in boredom and monotony among controllers ...

  8. AIMBAT: A Python/Matplotlib Tool for Measuring Teleseismic Arrival Times

    NASA Astrophysics Data System (ADS)

    Lou, X.; van der Lee, S.; Lloyd, S.

    2013-12-01

    Python is an open-source, platform-independent, and object-oriented scripting language. It has become more popular in the seismology community since the appearance of ObsPy (Beyreuther et al. 2010, Megies et al. 2011), which provides a powerful framework for seismic data access and processing. This study introduces a new Python-based tool named AIMBAT (Automated and Interactive Measurement of Body-wave Arrival Times) for measuring teleseismic body-wave arrival times on large-scale seismic event data (Lou et al. 2013). Compared to ObsPy, AIMBAT is a lighter tool that is more focused on a particular aspect of seismic data processing. It originates from the widely used MCCC (Multi-Channel Cross-Correlation) method developed by VanDecar and Crosson (1990). On top of the original MCCC procedure, AIMBAT automates the initial phase picking and is interactive in quality control. The core cross-correlation function is implemented in Fortran to boost performance. The GUI (graphical user interface) of AIMBAT depends on Matplotlib's GUI-neutral widgets and event-handling API. A number of sorting and (de)selecting options are designed to facilitate the quality control of seismograms. Using AIMBAT, both relative and absolute teleseismic body-wave arrival times are measured. AIMBAT significantly improves the efficiency and quality of the measurements. User interaction is needed only to pick the target phase arrival and to set a time window on the array stack. The package is easy to install and use, open-source, and publicly available. (Figure: graphical user interface of AIMBAT.)
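
    The pairwise building block of the MCCC approach is a cross-correlation delay measurement between two windowed seismograms. A minimal sketch (AIMBAT's Fortran core and the multi-channel least-squares step that reconciles all pairwise lags are omitted):

        import numpy as np

        def xcorr_delay(trace_a, trace_b, dt):
            """Time by which trace_b leads trace_a, from the lag that maximizes
            the cross-correlation; positive means b's phase arrives earlier."""
            a = (trace_a - trace_a.mean()) / trace_a.std()
            b = (trace_b - trace_b.mean()) / trace_b.std()
            cc = np.correlate(a, b, mode='full')
            lag = np.argmax(cc) - (len(b) - 1)
            return lag * dt

        # e.g. two windowed P-wave records sampled at 20 Hz would use dt = 0.05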

  9. Effective Materials Property Information Management for the 21st Century

    NASA Technical Reports Server (NTRS)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2009-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fueled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are driven by the need for consistency, quality and traceability of data, as well as control of access to sensitive information such as proprietary data. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of the economy often generates a great need for sharing a single "gold source" of materials information between members of global engineering teams in extended supply chains. Fortunately, material property management systems have kept pace with growing user demands and evolved into versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data "pedigree" traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems and future challenges and opportunities, such as automated error checking, data quality characterization, identification of gaps in datasets, and functionalities and business models to fuel database growth and maintenance, are discussed.

  10. Quantitative diagnostic performance of myocardial perfusion SPECT with attenuation correction in women.

    PubMed

    Wolak, Arik; Slomka, Piotr J; Fish, Mathews B; Lorenzo, Santiago; Berman, Daniel S; Germano, Guido

    2008-06-01

    Attenuation correction (AC) for myocardial perfusion SPECT (MPS) has not been evaluated separately in women, despite specific considerations in this group because of breast photon attenuation. We aimed to evaluate the performance of AC in women by using automated quantitative analysis of MPS to avoid any bias. Consecutive female patients--134 with a low likelihood (LLk) of coronary artery disease (CAD) and 114 with coronary angiography performed within less than 3 mo of MPS--who were referred for rest-stress electrocardiography-gated 99mTc-sestamibi MPS with AC were considered. Imaging data were evaluated for contour quality control. An additional 50 LLk studies in women were used to create equivalent normal limits for studies with AC and with no correction (NC). An experienced technologist unaware of the angiography and other results performed the contour quality control. All other processing was performed in a fully automated manner. Quantitative analysis was performed with the Cedars-Sinai myocardial perfusion analysis package. All automated segmental analyses were performed with the 17-segment, 5-point American Heart Association model. Summed stress scores (SSS) of > or =3 were considered abnormal. CAD (> or =70% stenosis) was present in 69 of 114 patients (60%). The normalcy rates were 93% for both NC and AC studies. The SSS for patients with CAD and without CAD for NC versus AC were 10.0 +/- 9.0 (mean +/- SD) versus 10.2 +/- 8.5 and 1.6 +/- 2.3 versus 1.8 +/- 2.5, respectively; P was not significant (NS) for all comparisons of NC versus AC. The SSS for LLk patients for NC versus AC were 0.51 +/- 1.0 versus 0.6 +/- 1.1, respectively; P was NS. The specificity for both NC and AC was 73%. The sensitivities for NC and AC were 80% and 81%, respectively, and the accuracies for NC and AC were 77% and 78%, respectively; P was NS for both comparisons. There are no significant diagnostic differences between automated quantitative MPS analyses performed with and without AC in women.

  11. Acoustic Sample Deposition MALDI-MS (ASD-MALDI-MS): A Novel Process Flow for Quality Control Screening of Compound Libraries.

    PubMed

    Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M

    2016-02-01

    In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.

  12. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. The "hospital central laboratory": automation, integration and clinical usefulness.

    PubMed

    Zaninotto, Martina; Plebani, Mario

    2010-07-01

    Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the search for efficiency through automation and consolidation and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance, total laboratory automation being the paradigm of the idea that "human-less" robotic laboratories may allow for better operation and ensure fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operating procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing. This should allow the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.

  14. Control task substitution in semiautomated driving: does it matter what aspects are automated?

    PubMed

    Carsten, Oliver; Lai, Frank C H; Barnard, Yvonne; Jamson, A Hamish; Merat, Natasha

    2012-10-01

    The study was designed to show how driver attention to the road scene and engagement of a choice of secondary tasks are affected by the level of automation provided to assist or take over the basic task of vehicle control. It was also designed to investigate the difference between support in longitudinal control and support in lateral control. There is comparatively little literature on the implications of automation for drivers' engagement in the driving task and for their willingness to engage in non-driving-related activities. A study was carried out on a high-level driving simulator in which drivers experienced three levels of automation: manual driving, semiautomated driving with either longitudinal or lateral control provided, and highly automated driving with both longitudinal and lateral control provided. Drivers were free to pay attention to the roadway and traffic or to engage in a range of entertainment and grooming tasks. Engagement in the nondriving tasks increased from manual to semiautomated driving and increased further with highly automated driving. There were substantial differences in attention to the road and traffic between the two types of semiautomated driving. The literature on automation and the various task analyses of driving do not currently help to explain the effects that were found. Lateral support and longitudinal support may be the same in terms of levels of automation but appear to be regarded rather differently by drivers.

  15. SU-G-BRB-05: Automation of the Photon Dosimetric Quality Assurance Program of a Linear Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lebron, S; Lu, B; Yan, G

    Purpose: To develop an automated method to calculate a linear accelerator (LINAC) photon radiation field size, flatness, symmetry, output and beam quality in a single delivery for flattened (FF) and flattening-filter-free (FFF) beams using an ionization chamber array. Methods: The proposed method consists of three control points that deliver 30×30, 10×10 and 5×5 cm² fields (FF or FFF) in a step-and-shoot sequence in which the number of monitor units is weighted for each field size. The IC Profiler (Sun Nuclear Inc.) with 5 mm detector spacing was used for this study. The corrected counts (CCs) were calculated, and the locations of the maxima and minima of the first-order gradient determined the data belonging to each subfield. All CCs for each field size were then summed to obtain the final profiles. For each profile, the radiation field size, symmetry, flatness, output factor and beam quality were calculated; for the field size calculation, a parameterized gradient method was used. For method validation, profiles were collected in the detector array, both individually and as part of the step-and-shoot plan, with 9.9 cm buildup for FF and FFF beams at 90 cm source-to-surface distance. The same data were collected with the device (plus buildup) placed on a movable platform to achieve a 1 mm resolution. Results: The differences between the dosimetric quantities calculated from the two deliveries, individual and step-and-shoot, were within 0.31±0.20% and 0.04±0.02 mm. The differences between the field sizes calculated at 5 mm and 1 mm resolution were within ±0.1 mm. Conclusion: The proposed single-delivery method proved to be simple and efficient in automating the photon dosimetric monthly and annual quality assurance.
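
    The edge-detection step lends itself to a compact illustration. Below is a minimal Python sketch (function and variable names are hypothetical; the published method refines edges with a parameterized gradient model rather than raw gradient extrema) that derives field size, flatness, and symmetry from one summed corrected-count profile:

        import numpy as np

        def profile_metrics(cc, spacing_mm=5.0):
            """Estimate field size, flatness and symmetry from a 1-D CC profile."""
            cc = np.asarray(cc, dtype=float)
            grad = np.gradient(cc, spacing_mm)
            left = np.argmax(grad) * spacing_mm    # steepest rising edge
            right = np.argmin(grad) * spacing_mm   # steepest falling edge
            field_size = right - left

            # Flatness and symmetry over the central 80% of the field
            # (a common convention; the abstract does not state the region).
            x = np.arange(len(cc)) * spacing_mm
            centre = 0.5 * (left + right)
            infield = cc[np.abs(x - centre) <= 0.4 * field_size]
            flatness = 100.0 * (infield.max() - infield.min()) / (infield.max() + infield.min())
            half = len(infield) // 2
            lo, hi = infield[:half], infield[::-1][:half]
            symmetry = 100.0 * np.max(np.abs(lo - hi) / (lo + hi))
            return field_size, flatness, symmetry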

  16. Coming back into the loop: Drivers' perceptual-motor performance in critical events after automated driving.

    PubMed

    Louw, Tyron; Markkula, Gustav; Boer, Erwin; Madigan, Ruth; Carsten, Oliver; Merat, Natasha

    2017-11-01

    This driving simulator study, conducted as part of the EU AdaptIVe project, investigated drivers' performance in critical traffic events during the resumption of control from an automated driving system. Prior to the critical events, using a between-participant design, 75 drivers were exposed to various screen manipulations that varied the amount of available visual information from the road environment and automation state, aiming to take them progressively further 'out-of-the-loop' (OoTL). The current paper presents an analysis of the timing, type, and rate of drivers' collision avoidance responses, also investigating how these were influenced by the criticality of the unfolding situation. Results showed that the amount of visual information available to drivers during automation affected how quickly they resumed manual control, with less information associated with slower take-over times; however, this did not influence the timing of when drivers began a collision avoidance manoeuvre. Instead, the observed behaviour is in line with recent accounts emphasising the role of scenario kinematics in the timing of driver avoidance response. When considering collision incidents in particular, avoidance manoeuvres were initiated when the situation criticality exceeded an inverse time to collision value of ≈0.3 s⁻¹. Our results suggest that take-over time and the timing and quality of the avoidance response are largely independent, and while a long take-over time did not predict collision outcome, kinematically late initiation of avoidance did. Hence, system design should focus on achieving kinematically early avoidance initiation rather than short take-over times. Copyright © 2017 Elsevier Ltd. All rights reserved.
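
    For reference, inverse time to collision is the standard kinematic ratio TTC⁻¹ = v_rel / d, where v_rel is the closing speed and d the remaining gap (this definition is standard in the literature, not restated in the abstract). At a closing speed of 20 m/s, the ≈0.3 s⁻¹ threshold corresponds to a gap of roughly 67 m, i.e. about 3.3 s to collision.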

  17. Human Factor and Usability Testing of a Binocular Optical Coherence Tomography System

    PubMed Central

    Chopra, Reena; Mulholland, Pádraig J.; Dubis, Adam M.; Anderson, Roger S.; Keane, Pearse A.

    2017-01-01

    Purpose To perform usability testing of a binocular optical coherence tomography (OCT) prototype to predict its function in a clinical setting, and to identify any potential user errors, especially in an elderly and visually impaired population. Methods Forty-five participants with chronic eye disease (mean age 62.7 years) and 15 healthy controls (mean age 53 years) underwent automated eye examination using the prototype. Examination included 'whole-eye' OCT, ocular motility, visual acuity measurement, perimetry, and pupillometry. Interviews were conducted to assess the subjective appeal and ease of use for this cohort of first-time users. Results All participants completed the full suite of tests. Eighty-one percent of the chronic eye disease group, and 79% of healthy controls, found the prototype easier to use than common technologies, such as smartphones. Overall, 86% described the device to be appealing for use in a clinical setting. There was no statistically significant difference in the total time taken to complete the examination between participants with chronic eye disease (median 702 seconds) and healthy volunteers (median 637 seconds) (P = 0.81). Conclusion On their first use, elderly and visually impaired users completed the automated examination without assistance. Binocular OCT has the potential to perform a comprehensive eye examination in an automated manner, and thus improve the efficiency and quality of eye care. Translational Relevance A usable binocular OCT system has been developed that can be administered in an automated manner. We have identified areas that would benefit from further development to guide the translation of this technology into clinical practice. PMID:28824827

  18. Separation of pigment formulations by high-performance thin-layer chromatography with automated multiple development.

    PubMed

    Stiefel, Constanze; Dietzel, Sylvia; Endress, Marc; Morlock, Gertrud E

    2016-09-02

    Food packaging is designed to provide sufficient protection for the respective filling, legally binding information for consumers such as nutritional facts or filling information, and an attractive appearance to promote sale. For the quality and safety of the package, regular quality control of the printing materials used is necessary to obtain consistently good print results, to avoid migration of undesired ink components into the food, and to identify potentially faulty ink batches. Analytical approaches, however, have hardly been considered for quality assurance so far due to the lack of robust methods suitable for the analysis of poorly soluble pigment formulations. Thus, a simple and generic high-performance thin-layer chromatography (HPTLC) method for the separation of different colored pigment formulations was developed on HPTLC plates silica gel 60 by automated multiple development. The gradient system provided sharp resolution for differently soluble pigment constituents such as additives and coating materials. The results of multi-detection allowed a first assignment of the detectable bands to particular chemical substance classes (e.g., lipophilic components), enabled the comparison of different commercially available pigment batches, and revealed substantial variations in the composition of the batches. Hyphenation of HPTLC with high-resolution mass spectrometry and infrared spectroscopy allowed the characterization of single unknown pigment constituents, which may partly be responsible for known quality problems during printing. The newly developed, precise and selective HPTLC method can be used as part of routine quality control, both for incoming pigment batches and for monitoring internal pigment production processes, to secure a consistent pigment composition resulting in consistent ink quality, a faultless print image and safe products. Hyphenation of HPTLC with the A. fischeri bioassay gave first information on the bioactivity, or rather the toxicological potential, of different compounds of the pigment formulations. The results of the bioassay might be helpful in choosing pigment compositions that provide high printing quality and at the same time guarantee high consumer safety, especially with regard to smaller pigment components, which tend to migrate through the packaging. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Model-Based Design of Air Traffic Controller-Automation Interaction

    NASA Technical Reports Server (NTRS)

    Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.

  20. Vision-based real-time position control of a semi-automated system for robot-assisted joint fracture surgery.

    PubMed

    Dagnino, Giulio; Georgilas, Ioannis; Tarassoli, Payam; Atkins, Roger; Dogramadzi, Sanja

    2016-03-01

    Joint fracture surgery quality can be improved by a robotic system with high-accuracy and high-repeatability fracture fragment manipulation. A new real-time vision-based system for fragment manipulation during robot-assisted fracture surgery was developed and tested. The control strategy merges fast open-loop control with vision-based control. This two-phase process is designed to eliminate open-loop positioning errors by closing the control loop using visual feedback provided by an optical tracking system. The accuracy of the control system was evaluated in robot positioning trials, and fracture reduction accuracy was tested in trials on an ex vivo porcine model. The system achieved high fracture reduction reliability, with a reduction accuracy of 0.09 mm (translations) and of [Formula: see text] (rotations), maximum observed errors on the order of 0.12 mm (translations) and of [Formula: see text] (rotations), and a reduction repeatability of 0.02 mm and [Formula: see text]. The proposed vision-based system was shown to be effective and suitable for real joint fracture surgical procedures, offering a potential improvement in their quality.

  1. "First generation" automated DNA sequencing technology.

    PubMed

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  2. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, light-weight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
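
    The rule-based flagging at the core of such a toolbox is easy to sketch. The Python fragment below is illustrative only (the GCE Data Toolbox itself is MATLAB, and the column names and limits here are hypothetical): it applies range-check rules from a template to a logger file and stores qualifier flags alongside the data.

        import pandas as pd

        # Hypothetical QC template: column -> (min, max) plausible-value limits.
        QC_RULES = {"air_temp_C": (-40.0, 50.0), "salinity_psu": (0.0, 42.0)}

        def apply_range_checks(csv_path):
            data = pd.read_csv(csv_path, parse_dates=["timestamp"])
            for col, (lo, hi) in QC_RULES.items():
                flag = pd.Series("ok", index=data.index)
                flag[(data[col] < lo) | (data[col] > hi)] = "Q"  # questionable value
                flag[data[col].isna()] = "M"                     # missing value
                data[col + "_flag"] = flag
            return data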

  3. AN ULTRAVIOLET-VISIBLE SPECTROPHOTOMETER AUTOMATION SYSTEM. PART III: PROGRAM DOCUMENTATION

    EPA Science Inventory

    The Ultraviolet-Visible Spectrophotometer (UVVIS) automation system accomplishes 'on-line' spectrophotometric quality assurance determinations, report generations, plot generations and data reduction for chlorophyll or color analysis. This system also has the capability to proces...

  4. A compendium of controlled diffusion blades generated by an automated inverse design procedure

    NASA Technical Reports Server (NTRS)

    Sanz, Jose M.

    1989-01-01

    A set of sample cases was produced to test an automated design procedure developed at the NASA Lewis Research Center for the design of controlled diffusion blades. The range of application of the automated design procedure is documented. The results presented include characteristic compressor and turbine blade sections produced with the automated design code as well as various other airfoils produced with the base design method prior to the incorporation of the automated procedure.

  5. Evaluation of droplet digital PCR for quantification of residual leucocytes in red blood cell concentrates.

    PubMed

    Doescher, A; Loges, U; Petershofen, E K; Müller, T H

    2017-11-01

    Enumeration of residual white blood cells in leucoreduced blood components is an essential part of quality control. Digital PCR has substantially simplified quantitative PCR and was therefore evaluated for measurement of leucocytes. The target for quantification of leucocytes by droplet digital PCR was the blood group gene RHCE; the SPEF1 gene was added as an internal control for the entire assay, starting with automated DNA extraction. The sensitivity of the method was determined by serial dilutions of standard samples. Quality control samples were analysed within 24 h, 7 days and 6 months after collection. Routine samples from leucodepleted red blood cell concentrates (n = 150) were evaluated in parallel by flow cytometry (LeucoCount) and by digital PCR. Digital PCR reliably detected at least 0·4 leucocytes per assay. The mean difference between PCR and flow-cytometric results from the 150 units was -0·01 (±1·0). DNA samples were stable for at least six months. PCR measurement of leucocytes in samples from plasma and platelet concentrates also provided valid results in a pilot study. Droplet digital PCR to enumerate leucocytes offers an alternative for quality control of leucoreduced blood products. Sensitivity, specificity and reproducibility are comparable to flow cytometry. The option to collect samples over an extended period of time and the automation introduce attractive features for routine quality control. © 2017 International Society of Blood Transfusion.
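
    The quantification behind droplet digital PCR rests on Poisson statistics (standard ddPCR theory, not spelled out in the abstract): with k positive droplets out of n, the mean copy number per droplet is λ = -ln(1 - k/n), and the concentration is c = λ / V_droplet. For example, 12 positive droplets out of 15,000 at roughly 0.85 nL per droplet give λ ≈ 8.0 × 10⁻⁴ copies per droplet and c ≈ 0.94 copies/µL.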

  6. An approach to describing and analysing bulk biological annotation quality: a case study using UniProtKB.

    PubMed

    Bell, Michael J; Gillespie, Colin S; Swan, Daniel; Lord, Phillip

    2012-09-15

    Annotations are a key feature of many biological databases, used to convey our knowledge of a sequence to the reader. Ideally, annotations are curated manually; however, manual curation is costly, time consuming and requires expert knowledge and training. Given these issues and the exponential increase of data, many databases implement automated annotation pipelines in an attempt to avoid un-annotated entries. Both manual and automated annotations vary in quality between databases and annotators, making assessment of annotation reliability problematic for users. The community lacks a generic measure for determining annotation quality and correctness, which we address in this article. Specifically, we investigate word reuse within bulk textual annotations and relate this to Zipf's Principle of Least Effort. We use the UniProt Knowledgebase (UniProtKB) as a case study to demonstrate this approach, since it allows us to compare annotation change both over time and between automated and manually curated annotations. By applying power-law distributions to word reuse in annotation, we show clear trends in UniProtKB over time, which are consistent with existing studies of quality on free-text English. Further, we show a clear distinction between manual and automated annotation and investigate cohorts of protein records as they mature. These results suggest that this approach holds distinct promise as a mechanism for judging annotation quality. Source code is available at the authors' website: http://homepages.cs.ncl.ac.uk/m.j.bell1/annotation. phillip.lord@newcastle.ac.uk.
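
    A minimal version of the word-reuse analysis can be written in a few lines of Python (an illustrative sketch, not the authors' pipeline, which is available at the URL above): count word frequencies across a corpus of annotations, then fit the power-law exponent on the log-log rank-frequency relation.

        from collections import Counter
        import numpy as np

        def zipf_exponent(annotations):
            """Fit log(freq) = a - s*log(rank); s near 1 is the classic Zipf regime."""
            counts = Counter(w for text in annotations for w in text.lower().split())
            freqs = np.array(sorted(counts.values(), reverse=True), dtype=float)
            ranks = np.arange(1, len(freqs) + 1)
            s, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
            return -s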

  7. Preface to the special section on human factors and automation in vehicles: designing highly automated vehicles with the driver in mind.

    PubMed

    Merat, Natasha; Lee, John D

    2012-10-01

    This special section brings together diverse research regarding driver interaction with advanced automotive technology to guide design of increasingly automated vehicles. Rapidly evolving vehicle automation will likely change cars and trucks more in the next 5 years than the preceding 50, radically redefining what it means to drive. This special section includes 10 articles from European and North American researchers reporting simulator and naturalistic driving studies. Little research has considered the consequences of fully automated driving, with most focusing on lane-keeping and speed control systems individually. The studies reveal two underlying design philosophies: automate driving versus support driving. Results of several studies, consistent with previous research in other domains, suggest that the automate philosophy can delay driver responses to incidents in which the driver has to intervene and take control from the automation. Understanding how to orchestrate the transfer or sharing of control between the system and the driver, particularly in critical incidents, emerges as a central challenge. Designers should not assume that automation can substitute seamlessly for a human driver, nor can they assume that the driver can safely accommodate the limitations of automation. Designers, policy makers, and researchers must give careful consideration to what role the person should have in highly automated vehicles and how to support the driver if the driver is to be responsible for vehicle control. As in other domains, driving safety increasingly depends on the combined performance of the human and automation, and successful designs will depend on recognizing and supporting the new roles of the driver.

  8. Comparison of architect I 2000 for determination of cyclosporine with axsym.

    PubMed

    Serdarevic, Nafija; Zunic, Lejla

    2012-12-01

    Cyclosporine has been shown to be an effective drug for suppressing acute rejection in recipients of allograft organ transplants. The cyclosporine concentration of 96 blood samples was determined using CMIA (chemiluminescent microparticle immunoassay) on the Architect i2000 and FPIA (fluorescence polarization immunoassay) on the AxSYM (Abbott Diagnostics). All patients had received kidney transplants and were hospitalized at the Department of Nephrology at the Clinical Center of the University of Sarajevo. The reference serum range of cyclosporine for maintenance after kidney transplantation lies between 50 and 150 ng/mL. The quality control, precision and accuracy of the Architect i2000 were assessed using quality control sera at low (91 ng/mL), medium (328 ng/mL) and high (829 ng/mL) concentrations. We used commercial BIORAD controls and obtained a reproducibility (CV) of 5.83% to 13% for the Architect i2000. The difference between the Architect i2000 and the AxSYM was statistically significant at P < 0.05 according to Student's t-test; the correlation coefficient was r = 0.903. The CMIA Architect assay has significantly reduced cyclosporine metabolite interference relative to other immunoassays and is a convenient and sensitive automated method for measuring cyclosporine in whole blood.
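
    The reported statistics amount to a routine paired method comparison; a sketch of the same analysis in Python (concentration values are hypothetical) is:

        import numpy as np
        from scipy import stats

        architect = np.array([112.0, 95.0, 143.0, 78.0, 160.0])  # ng/mL, hypothetical
        axsym = np.array([118.0, 99.0, 151.0, 85.0, 171.0])

        t, p = stats.ttest_rel(architect, axsym)  # paired Student's t-test
        r, _ = stats.pearsonr(architect, axsym)   # between-method correlation
        print(f"t = {t:.2f}, p = {p:.3f}, r = {r:.3f}")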

  9. Looking ahead in systems engineering

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Donald S.

    1966-01-01

    Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.

  10. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping the software stack coherent and of sufficiently high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering-file-level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
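
    As a toy illustration of one such automated check, a Python wrapper can run cppcheck and fail the build on any finding (paths are hypothetical; the actual Belle II checks are orchestrated by the buildbot master):

        import subprocess
        import sys

        def run_cppcheck(src_dir="src"):
            """Return cppcheck's exit code: nonzero if any issue is reported."""
            result = subprocess.run(
                ["cppcheck", "--enable=warning,style", "--error-exitcode=1", src_dir],
                capture_output=True, text=True,
            )
            print(result.stderr)  # cppcheck writes its findings to stderr
            return result.returncode

        if __name__ == "__main__":
            sys.exit(run_cppcheck())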

  11. Automated acoustic matrix deposition for MALDI sample preparation.

    PubMed

    Aerni, Hans-Rudolf; Cornett, Dale S; Caprioli, Richard M

    2006-02-01

    Novel high-throughput sample preparation strategies for MALDI imaging mass spectrometry (IMS) and profiling are presented. An acoustic reagent multispotter was developed to provide improved reproducibility for depositing matrix onto a sample surface, such as a tissue section. The unique design of the acoustic droplet ejector and its optimization for depositing matrix solution are discussed. Since it does not contain a capillary or nozzle for fluid ejection, issues with clogging of these orifices are avoided. Automated matrix deposition provides better control of the conditions affecting protein extraction and matrix crystallization, with the ability to deposit matrix accurately onto small surface features. For tissue sections, matrix spots of 180-200 microm in diameter were obtained, and a procedure is described for generating coordinate files readable by a mass spectrometer to permit automated profile acquisition. Mass spectral quality and reproducibility were found to be better than those obtained with manual pipet spotting. The instrument can also deposit matrix spots in a dense array pattern so that, after analysis in a mass spectrometer, two-dimensional ion images may be constructed. Example ion images from a mouse brain are presented.
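
    Generating the coordinate file amounts to enumerating the spot array. The Python sketch below uses an entirely hypothetical 'name x y' text format, since the abstract does not specify the instrument-readable layout:

        def write_spot_coordinates(path, rows, cols, pitch_um, origin_um=(0.0, 0.0)):
            """Write one line per matrix spot of a rows x cols array."""
            x0, y0 = origin_um
            with open(path, "w") as f:
                for r in range(rows):
                    for c in range(cols):
                        f.write(f"spot_{r}_{c} {x0 + c * pitch_um:.1f} {y0 + r * pitch_um:.1f}\n")

        # 10 x 10 array at 200 um pitch, matching the reported spot diameter scale.
        write_spot_coordinates("spots.txt", rows=10, cols=10, pitch_um=200.0)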

  12. Towards Evolving Electronic Circuits for Autonomous Space Applications

    NASA Technical Reports Server (NTRS)

    Lohn, Jason D.; Haith, Gary L.; Colombano, Silvano P.; Stassinopoulos, Dimitris

    2000-01-01

    The relatively new field of Evolvable Hardware studies how simulated evolution can reconfigure, adapt, and design hardware structures in an automated manner. Space applications, especially those requiring autonomy, are potential beneficiaries of evolvable hardware. For example, robotic drilling from a mobile platform requires high-bandwidth controller circuits that are difficult to design. In this paper, we present automated design techniques based on evolutionary search that could potentially be used in such applications. First, we present a method of automatically generating analog circuit designs using evolutionary search and a circuit construction language. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm, we present experimental results for five design tasks. Second, we investigate the use of coevolution in automated circuit design. We examine fitness evaluation by comparing the effectiveness of four fitness schedules. The results indicate that solution quality is highest with static and co-evolving fitness schedules as compared to the other two dynamic schedules. We discuss these results and offer two possible explanations for the observed behavior: retention of useful information, and alignment of problem difficulty with circuit proficiency.
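
    The evolutionary loop underlying such circuit search is conventional; a compact Python sketch (with a stand-in bit-string genome and toy fitness function, not the authors' circuit-construction language) is:

        import random

        def evolve(fitness, genome_len=16, pop_size=50, generations=100, mut=0.05):
            """Minimal GA: tournament selection, uniform crossover, bit-flip mutation."""
            pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
            for _ in range(generations):
                def pick():  # tournament of two
                    a, b = random.sample(pop, 2)
                    return a if fitness(a) >= fitness(b) else b
                nxt = []
                for _ in range(pop_size):
                    p, q = pick(), pick()
                    child = [random.choice(g) for g in zip(p, q)]         # crossover
                    child = [g ^ (random.random() < mut) for g in child]  # mutation
                    nxt.append(child)
                pop = nxt
            return max(pop, key=fitness)

        best = evolve(sum)  # toy fitness: maximize the number of 1-bits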

  13. Automated Car Park Management System

    NASA Astrophysics Data System (ADS)

    Fabros, J. P.; Tabañag, D.; Espra, A.; Gerasta, O. J.

    2015-06-01

    This study aims to develop a prototype of an Automated Car Park Management System that will increase the quality of service of parking lots through the integration of a smart system that assists motorists in finding vacant parking slots. The research was based on implementing an operating system and a monitoring system for the parking facility without the use of manpower. It incorporates the Parking Guidance and Information System concept, which efficiently assists motorists and ensures the safety of the vehicles and the valuables inside them. For monitoring, optical character recognition was employed to record all the cars entering the parking area. All parking events in the system are visible via a MATLAB GUI, which shows time-in, time-out, and time-consumed information as well as the lot number where each car parks. The system also includes a payment method, implemented via a coin-slot mechanism that controls the exit gate. The Automated Car Park Management System was successfully built using one PIC18F4550, two PIC16F84, and one PIC16F628A microcontrollers.

  14. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna Helene; Ahmad Al Rashdan; Le Blanc, Katya Lee

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background to the need for AWP research is provided, then two human factors field evaluation studies are described. These studies focus on the user experience of conducting a task (in this case a preventive maintenance and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real-time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  15. The automation of remote vehicle control. [in Mars roving vehicles

    NASA Technical Reports Server (NTRS)

    Paine, G.

    1977-01-01

    The automation of remote vehicles is becoming necessary to overcome the requirement of having man present as a controller. By removing man, remote vehicles can be operated in areas where the environment is too hostile for man, his reaction times are too slow, time delays are too long, and where his presence is too costly, or where system performance can be improved. This paper addresses the development of automated remote vehicle control for nonspace and space tasks from warehouse vehicles to proposed Mars rovers. The state-of-the-art and the availability of new technology for implementing automated control are reviewed and the major problem areas are outlined. The control strategies are divided into those where the path is planned in advance or constrained, or where the system is a teleoperator, or where automation or robotics have been introduced.

  16. Automated control of linear constricted plasma source array

    DOEpatents

    Anders, Andre; Maschwitz, Peter A.

    2000-01-01

    An apparatus and method for controlling an array of constricted glow discharge chambers are disclosed; more particularly, a linear array of constricted glow plasma sources whose polarity and geometry are set so that the contamination and energy of the ions discharged from the sources are minimized. The several sources can be mounted in parallel and in series to provide a sustained, ultra-low source of ions in a plasma with contamination below practical detection limits. The quality of film along deposition "tracks" opposite the plasma sources can be measured and compared to desired absolute or relative values by optical and/or electrical sensors. Plasma quality can then be adjusted by adjusting the power and current values, gas feed pressure/flow, gas mixtures, or a combination of some or all of these to improve the match between the measured values and the desired values.
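
    The closed-loop adjustment described, measuring film quality along a track and trimming the source toward a setpoint, is in essence a simple feedback law. A proportional-control sketch in Python (all names and limits are hypothetical):

        def control_step(measured, desired, power_w, gain=0.1, p_min=50.0, p_max=500.0):
            """One proportional update of source power toward the quality setpoint."""
            error = desired - measured
            power_w += gain * error
            return min(max(power_w, p_min), p_max)  # clamp to a safe operating range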

  17. Development of the quality control system of the readout electronics for the large size telescope of the Cherenkov Telescope Array observatory

    NASA Astrophysics Data System (ADS)

    Konno, Y.; Kubo, H.; Masuda, S.; Paoletti, R.; Poulios, S.; Rugliancich, A.; Saito, T.

    2016-07-01

    The Cherenkov Telescope Array (CTA) is the next-generation very-high-energy (VHE) γ-ray observatory, which will improve the currently available sensitivity by a factor of 10 in the range 100 GeV to 10 TeV. The array consists of different types of telescopes, called the large size telescope (LST), the medium size telescope (MST) and the small size telescope (SST). An LST prototype is currently being built and will be installed at the Observatorio del Roque de los Muchachos on the island of La Palma, Canary Islands, Spain. The readout system for the LST prototype has been designed, and around 300 readout boards will be produced in the coming months. In this note we describe an automated quality control system able to measure basic performance parameters and quickly identify faulty boards.

  18. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently CSV format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user; for example, the software can be run once per day to analyze data collected on the previous day. HTML-formatted reports can be sent via email or hosted on a website. To compare the performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
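
    The report-generation step can be illustrated with pandas (input file is hypothetical; DPM's actual report layout is described only as summary statistics, tables, and graphics):

        import pandas as pd

        def daily_report(csv_path, out_html="report.html"):
            """Summarize a day's time-series data and save an HTML report."""
            data = pd.read_csv(csv_path, index_col=0, parse_dates=True)
            summary = data.describe().T            # count, mean, std, min/max, quartiles
            summary["fraction_missing"] = data.isna().mean()
            summary.to_html(out_html)
            return summary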

  19. Flight-deck automation - Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilising, and improving control and monitoring systems. Investigation into flight-deck automation systems is important as the knowledge gained can be applied to other systems such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.

  20. Designing effective human-automation-plant interfaces: a control-theoretic perspective.

    PubMed

    Jamieson, Greg A; Vicente, Kim J

    2005-01-01

    In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.

  1. Fatigue and voluntary utilization of automation in simulated driving.

    PubMed

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  2. Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology

    NASA Astrophysics Data System (ADS)

    Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.

    2015-03-01

    In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High-quality components, such as center-turned focusing units, as well as suitable assembly strategies are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are economically at a disadvantage, as operator experience is often expensive. However, the reproducibility and quality of automatically assembled systems can be superior. Therefore, automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper describes the equipment and software needed to enable hybrid assembly processes. Micromanipulator technology with high step-resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment in order to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g. Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures constant high quality, documents the assembly process and is a basis for further improvement. The hybrid assembly technology has been applied in several applications, achieving efficiencies above 80%, and will be discussed in this paper. High coupling efficiency has been achieved with minimized assembly effort as a result of semi-automated alignment. This paper will focus on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.

  3. Development of design principles for automated systems in transport control.

    PubMed

    Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa

    2012-01-01

    This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.

  4. Finding the 'RITE' Acquisition Environment for Navy C2 Software

    DTIC Science & Technology

    2015-05-01

    Boiler plate contract language - Gov purpose rights • Adding expectation of quality to contracting language • Template SOWs created Pr...
    • Debugger
    • MCCABE IQ - static analysis: cyclomatic complexity and KSLOC; all languages
    • HP Fortify - security scan: STIG and vulnerabilities; Security & IA
    • GSSAT (GOTS) - security scan: STIG and vulnerabilities
    • AutoIT - automated test scripting: engine for automating functional testing
    • TestComplete - automated...

  5. Comparison study of membrane filtration direct count and an automated coliform and Escherichia coli detection system for on-site water quality testing.

    PubMed

    Habash, Marc; Johns, Robert

    2009-10-01

    This study compared an automated Escherichia coli and coliform detection system with the membrane filtration direct count technique for water testing. The automated instrument performed as well as or better than the membrane filtration test in analyzing E. coli-spiked samples and blind samples with interference from Proteus vulgaris or Aeromonas hydrophila.

  6. Development of critical dimension measurement scanning electron microscope for ULSI (S-8000 series)

    NASA Astrophysics Data System (ADS)

    Ezumi, Makoto; Otaka, Tadashi; Mori, Hiroyoshi; Todokoro, Hideo; Ose, Yoichi

    1996-05-01

    The semiconductor industry is moving from half-micron to quarter-micron design rules. To support this evolution, Hitachi has developed a new critical dimension measurement scanning electron microscope (CD-SEM), the model S-8800 series, for quality control of quarter-micron process lines. The new CD-SEM provides detailed examination of process conditions with 5 nm resolution and 5 nm repeatability (3 sigma) at an accelerating voltage of 800 V using secondary electron imaging. In addition, a newly developed load-lock system is capable of achieving a high sample throughput of 20 wafers/hour (5 point measurements per wafer) under continuous operation. For user friendliness, the system incorporates a graphical user interface (GUI), an automated pattern recognition system which helps locate measurement points, both manual and semi-automated operation, and user-programmable operating parameters.

  7. Scale invariant feature transform in adaptive radiation therapy: a tool for deformable image registration assessment and re-planning indication

    NASA Astrophysics Data System (ADS)

    Paganelli, Chiara; Peroni, Marta; Riboldi, Marco; Sharp, Gregory C.; Ciardo, Delia; Alterio, Daniela; Orecchia, Roberto; Baroni, Guido

    2013-01-01

    Adaptive radiation therapy (ART) aims at compensating for anatomic and pathological changes to improve delivery along a treatment fraction sequence. Current ART protocols require time-consuming manual updating of all volumes of interest on the images acquired during treatment. Deformable image registration (DIR) and contour propagation stand as state-of-the-art methods to automate the process, but the lack of DIR quality control methods hinders their introduction into clinical practice. We investigated the scale invariant feature transform (SIFT) method as a quantitative automated tool (1) for DIR evaluation and (2) for re-planning decision-making in the framework of ART treatments. As a preliminary test, SIFT invariance properties at shape-preserving and deformable transformations were studied on a computational phantom, yielding residual matching errors below the voxel dimension. A clinical dataset composed of 19 head and neck ART patients was then used to quantify performance in ART treatments. For goal (1), results demonstrated SIFT's potential as an operator-independent DIR quality assessment metric: we measured DIR group systematic residual errors of up to 0.66 mm, against 1.35 mm provided by rigid registration. The group systematic errors of both bony and all other structures were also analyzed, attesting to the presence of anatomical deformations. The correct automated identification, using SIFT, of 18 of the total 22 cases of patients who might benefit from ART demonstrated its capability toward goal (2).
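
    The matching-residual idea is easy to demonstrate in two dimensions with OpenCV (a sketch only: the study applied SIFT to volumetric CT data with its own matching strategy, and the pixel spacing here is a placeholder):

        import cv2
        import numpy as np

        def mean_residual_mm(fixed, registered, pixel_mm=1.0):
            """Mean SIFT-landmark residual between a fixed and a registered image."""
            sift = cv2.SIFT_create()
            kp1, des1 = sift.detectAndCompute(fixed, None)
            kp2, des2 = sift.detectAndCompute(registered, None)
            matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
            good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # ratio test
            res = [np.hypot(kp1[m.queryIdx].pt[0] - kp2[m.trainIdx].pt[0],
                            kp1[m.queryIdx].pt[1] - kp2[m.trainIdx].pt[1]) for m in good]
            return pixel_mm * float(np.mean(res))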

  8. Effects of imperfect automation on decision making in a simulated command and control task.

    PubMed

    Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja

    2007-02-01

    Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to a greater cost in decision-making accuracy under the higher automation reliability condition for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.

  9. Automated Space Processing Payloads Study. Volume 1: Executive Summary

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An investigation is described which examined the extent to which the experiment hardware and operational requirements can be met by automatic control and material handling devices; payload and system concepts are defined which make extensive use of automation technology. Topics covered include experiment requirements and hardware data, capabilities and characteristics of industrial automation equipment and controls, payload grouping, automated payload conceptual design, space processing payload preliminary design, automated space processing payloads for early shuttle missions, and cost and scheduling.

  10. Computational assessment of mammography accreditation phantom images and correlation with human observer analysis

    NASA Astrophysics Data System (ADS)

    Barufaldi, Bruno; Lau, Kristen C.; Schiabel, Homero; Maidment, D. A.

    2015-03-01

    Routine performance of basic test procedures and dose measurements is essential for assuring the high quality of mammograms. International guidelines recommend that breast care providers ascertain that mammography systems produce consistently high image quality using as low a radiation dose as is reasonably achievable. The main purpose of this research is to develop a framework to monitor radiation dose and image quality in a mixed breast screening and diagnostic imaging environment using an automated tracking system. This study presents a module of this framework, consisting of a computerized system to measure the image quality of the American College of Radiology mammography accreditation phantom. The methods developed combine correlation approaches, matched filters, and data mining techniques, and have been used to analyze radiological images of the accreditation phantom. The classification of structures of interest is based upon reports produced by four trained readers. As previously reported, human observers demonstrate great variation in their analysis due to the subjectivity of human visual inspection. The software tool was trained with three sets of 60 phantom images in order to generate decision trees using the software WEKA (Waikato Environment for Knowledge Analysis). When tested with 240 images during the classification step, the tool correctly classified 88%, 99%, and 98% of fibers, speck groups and masses, respectively. The variation between the computer classification and human reading was comparable to the variation between human readers. This computerized system not only automates the quality control procedure in mammography but also decreases the subjectivity in expert evaluation of the phantom images.
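
    The decision-tree step translates directly to any tree learner. A scikit-learn sketch (feature names are hypothetical stand-ins for the correlation and matched-filter scores computed from the phantom images):

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Each row: [template_correlation, local_contrast] for one phantom object.
        X_train = np.array([[0.92, 0.30], [0.41, 0.05], [0.80, 0.22], [0.35, 0.02]])
        y_train = np.array([1, 0, 1, 0])  # 1 = object scored visible by the readers

        clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
        print(clf.predict([[0.75, 0.18]]))  # classify a new object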

  11. RAMAS: The RITL Automated Management System. Master Control and Periodicals Control Subsystems. Stockholm Papers in Library and Information Science.

    ERIC Educational Resources Information Center

    Ya-chun, Lian

    An automated minicomputer-based library management system is being developed at the Swedish Royal Institute of Technology Library (RITL). RAMAS (the RITL Automated Management System) currently deals with periodical check-in, claiming, index-handling, and binding control. A RAMAS bibliographic record can be accessed from eight different points…

  12. Software development infrastructure for the HYBRID modeling and simulation project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epiney, Aaron S.; Kinoshita, Robert A.; Kim, Jong Suk

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure tries to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety "research and development" software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation; the quality level could change as application development continues. Despite the low required quality level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python script called BuildingsPy from Lawrence Berkeley National Laboratory. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list, to which everybody can send emails that will be received by the collective of developers and managers involved in the project. Thirdly, to exchange documents quickly, a SharePoint directory has been set up; SharePoint allows teams and organizations to share and collaborate on content from anywhere.
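
    Invoking the BuildingsPy regression machinery takes only a few lines (a sketch, assuming the current BuildingsPy API and a local Dymola installation):

        from buildingspy.development.regressiontest import Tester

        tester = Tester()           # discovers the library's .mos test scripts
        tester.setLibraryRoot(".")  # path to the Modelica library under test
        retval = tester.run()       # simulate and compare against reference results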

  13. General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.

    2011-01-01

    The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.

  14. Keep the driver in control: Automating automobiles of the future.

    PubMed

    Banks, Victoria A; Stanton, Neville A

    2016-03-01

    Automated automobiles will be on our roads within the next decade, but the role of the driver has not yet been formally recognised or designed. Rather, the driver is often left in a passive monitoring role until they are required to reclaim control from the vehicle. This research aimed to test the idea of driver-initiated automation, in which the automation offers decision support that can be either accepted or ignored. The test case examined a combination of lateral and longitudinal control in addition to an auto-overtake system. Despite putting the driver in control of the automated systems by enabling them to accept or ignore behavioural suggestions (e.g. overtake), there were still issues associated with increased workload and decreased trust. These issues are likely to have arisen due to the way in which the automated system has been designed. Recommendations for improvements in systems design have been made which are likely to improve trust and make the role of the driver more transparent concerning their authority over the automated system. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  15. Transforming administrative data into real-time information in the Department of Surgery.

    PubMed

    Beaulieu, Peter A; Higgins, John H; Dacey, Lawrence J; Nugent, William C; DeFoe, Gordon R; Likosky, Donald S

    2010-10-01

    Cardiothoracic surgical programmes face increasingly complex procedures performed on ever more challenging patients. Public and private stakeholders are demanding that these programmes report process-level and clinical outcomes as a mechanism for enabling quality assurance and informed clinical decision-making. Increasingly, these measures are being tied to reimbursement and institutional accreditation. The authors developed a system for linking administrative and clinical registries, in real time, to track performance in satisfying the needs of patients and stakeholders, as well as helping to drive continuous quality improvement. A relational surgical database was developed to link prospectively collected clinical data to administrative data sources at Dartmouth-Hitchcock Medical Center. Institutional performance was displayed over time using process control charts, and compared with both internal and regional benchmarks. Quarterly reports have been generated and automated for five surgical cohorts. Data are displayed externally on our dedicated website, and internally in the cardiothoracic surgical office suites, operating room theatre and nursing units. Monthly discussions are held with the clinical staff and have resulted in the development of quality-improvement projects. The delivery of clinical care in isolation of data and information is no longer prudent or acceptable. The present study suggests that an automated and real-time computer system may provide rich sources of data that may be used to drive improvements in the quality of care. Current and future work will be focused on identifying opportunities to integrate these data into the fabric of the delivery of care to drive process improvement.
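
    For readers unfamiliar with the charting step, the control limits behind such displays follow the standard p-chart formulas (center line p-bar, limits p-bar ± 3·sqrt(p-bar(1 − p-bar)/n)). A sketch with synthetic quarterly data; none of these numbers come from the study:

      import numpy as np

      cases = np.array([120, 135, 128, 140])   # procedures per quarter (synthetic)
      events = np.array([6, 9, 5, 8])          # complications per quarter (synthetic)

      p_bar = events.sum() / cases.sum()             # center line
      sigma = np.sqrt(p_bar * (1 - p_bar) / cases)   # per-quarter standard error
      ucl = p_bar + 3 * sigma                        # upper control limit
      lcl = np.maximum(p_bar - 3 * sigma, 0.0)       # lower control limit, floored at 0
      rates = events / cases
      print("out-of-control quarters:", np.where((rates > ucl) | (rates < lcl))[0])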

  16. SAMOS - A Decade of High-Quality, Underway Meteorological and Oceanographic Data from Research Vessels

    NASA Astrophysics Data System (ADS)

    Smith, S. R.; Rolph, J.; Briggs, K.; Elya, J. L.; Bourassa, M. A.

    2016-02-01

    The authors will describe the successes and lessons learned from the Shipboard Automated Meteorological and Oceanographic System (SAMOS) initiative. Over the past decade, SAMOS has acquired, quality controlled, and distributed underway surface meteorological and oceanographic observations from nearly 40 oceanographic research vessels. Research vessels provide underway observations at high-temporal frequency (1-minute sampling interval) that include navigational (position, course, heading, and speed), meteorological (air temperature, humidity, wind, surface pressure, radiation, rainfall), and oceanographic (surface sea temperature and salinity) samples. Vessels recruited to the SAMOS initiative collect a high concentration of data within the U.S. continental shelf, around Hawaii and the islands of the tropical Pacific, and frequently operate well outside routine shipping lanes, capturing observations in extreme ocean environments (Southern, Arctic, South Atlantic, and South Pacific oceans) desired by the air-sea exchange, modeling, and satellite remote sensing communities. The presentation will highlight the data stewardship practices of the SAMOS initiative. Activities include routine automated and visual data quality evaluation, feedback to vessel technicians and operators regarding instrumentation errors, best practices for instrument siting and exposure on research vessels, and professional development activities for research vessel technicians. Best practices for data, metadata, and quality evaluation will be presented. We will discuss ongoing efforts to expand data services to enhance interoperability between marine data centers. Data access and archival protocols will also be presented, including how these data may be referenced and accessed via NCEI.
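
    Automated evaluation of this kind usually starts with simple physical-range checks applied to every 1-minute sample. An illustrative sketch; the limits below are generic placeholders, not SAMOS's operational thresholds:

      # Illustrative physical-range limits per variable.
      RANGE_LIMITS = {
          "air_temperature_c": (-40.0, 50.0),
          "surface_pressure_hpa": (870.0, 1085.0),
          "sea_temperature_c": (-2.0, 40.0),
      }

      def range_check(variable, value):
          """Return a quality flag: 'G' (good) or 'B' (outside physical range)."""
          lo, hi = RANGE_LIMITS[variable]
          return "G" if lo <= value <= hi else "B"

      print(range_check("air_temperature_c", 21.4))    # G
      print(range_check("surface_pressure_hpa", 400))  # B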

  17. Takeover Time in Highly Automated Vehicles: Noncritical Transitions to and From Manual Control.

    PubMed

    Eriksson, Alexander; Stanton, Neville A

    2017-06-01

    The aim of this study was to review existing research into driver control transitions and to determine the time it takes drivers to resume control from a highly automated vehicle in noncritical scenarios. Contemporary research has moved from an inclusive design approach to adhering only to mean/median values when designing control transitions in automated driving. Research into control transitions in highly automated driving has focused on urgent scenarios where drivers are given a relatively short time span to respond to a request to resume manual control. We found a paucity of research into more frequent scenarios for control transitions, such as planned exits from highway systems. Twenty-six drivers drove two scenarios with an automated driving feature activated. Drivers were asked to read a newspaper, or to monitor the system, and to relinquish, or resume, control from the automation when prompted by vehicle systems. Control transition times were significantly longer when drivers were engaged in a secondary task than when they were not, and substantially longer than those reported in the peer-reviewed literature. We found that drivers take longer to resume control when under no time pressure compared with that reported in the literature. Moreover, we found that drivers occupied by a secondary task exhibit larger variance and slower responses to requests to resume control. Workload scores implied optimal workload. Intra- and interindividual differences need to be accommodated by vehicle manufacturers and policy makers alike to ensure inclusive design of contemporary systems and safety during control transitions.

  18. A framework for automatic information quality ranking of diabetes websites.

    PubMed

    Belen Sağlam, Rahime; Taskaya Temizel, Tugba

    2015-01-01

    Objective: When searching for particular medical information on the internet, the challenge lies in distinguishing the websites that are relevant to the topic and contain accurate information. In this article, we propose a framework that automatically identifies and ranks diabetes websites according to their relevance and information quality based on the website content. Design: The proposed framework ranks diabetes websites according to their content quality, relevance, and evidence-based medicine. The framework combines information retrieval techniques with a lexical resource based on SentiWordNet, making it possible to work with biased and untrusted websites while, at the same time, ensuring content relevance. Measurement: The evaluation measures used were Pearson correlation, true positives, false positives, and accuracy. We tested the framework with a benchmark data set consisting of 55 websites with varying degrees of information quality problems. Results: The proposed framework gives good results that are comparable with the non-automated information quality measuring approaches in the literature. The correlation between the results of the proposed automated framework and the ground truth is 0.68 on average with p < 0.001, which is greater than that of other automated methods proposed in the literature (average r = 0.33).
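
    The ranking idea, blending a relevance score with a penalty for subjective (potentially biased) language, can be sketched as follows. The blending weight and the toy scores are assumptions, not the authors' formulas:

      def rank_score(relevance, subjectivity, w=0.5):
          """Blend relevance (0-1, higher is better) with a subjectivity penalty."""
          return w * relevance + (1.0 - w) * (1.0 - subjectivity)

      # Toy (relevance, subjectivity) scores for two hypothetical sites.
      sites = {"siteA": (0.9, 0.2), "siteB": (0.7, 0.6)}
      ranked = sorted(sites, key=lambda s: rank_score(*sites[s]), reverse=True)
      print(ranked)  # ['siteA', 'siteB']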

  19. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics

    PubMed Central

    Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty-four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582

  20. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics.

    PubMed

    Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty-four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.

  1. Industrial applications of automated X-ray inspection

    NASA Astrophysics Data System (ADS)

    Shashishekhar, N.

    2015-03-01

    Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.

  2. Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.

    PubMed

    Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A

    2016-04-01

    Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost. Copyright © 2016 Elsevier Ltd. All rights reserved.
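
    The classification step amounts to training a support vector machine on a small feature vector per image. A self-contained sketch with synthetic features and a toy labelling rule; the real features are derived from the segmented vessel map:

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.random((200, 3))                    # 3 features per retinal image
      y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # toy label: 1 = adequate quality

      clf = SVC(kernel="rbf").fit(X[:150], y[:150])   # train on 150 images
      print("held-out accuracy:", clf.score(X[150:], y[150:]))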

  3. Automatic detection of retina disease: robustness to image quality and localization of anatomy structure.

    PubMed

    Karnowski, T P; Aykac, D; Giancardo, L; Li, Y; Nichols, T; Tobin, K W; Chaum, E

    2011-01-01

    The automated detection of diabetic retinopathy and other eye diseases in images of the retina has great promise as a low-cost method for broad-based screening. Many systems in the literature which perform automated detection include a quality estimation step and physiological feature detection, including the vascular tree and the optic nerve / macula location. In this work, we study the robustness of an automated disease detection method with respect to the accuracy of the optic nerve location and the quality of the images obtained as judged by a quality estimation algorithm. The detection algorithm features microaneurysm and exudate detection followed by feature extraction on the detected population to describe the overall retina image. Labeled images of retinas ground-truthed to disease states are used to train a supervised learning algorithm to identify the disease state of the retina image and exam set. Under the restrictions of high confidence optic nerve detections and good quality imagery, the system achieves a sensitivity and specificity of 94.8% and 78.7% with area-under-curve of 95.3%. Analysis of the effect of constraining quality and the distinction between mild non-proliferative diabetic retinopathy, normal retina images, and more severe disease states is included.

  4. Near real time water quality monitoring of Chivero and Manyame lakes of Zimbabwe

    NASA Astrophysics Data System (ADS)

    Muchini, Ronald; Gumindoga, Webster; Togarepi, Sydney; Pinias Masarira, Tarirai; Dube, Timothy

    2018-05-01

    Zimbabwe's water resources are under pressure from both point and non-point sources of pollution, hence the need for regular and synoptic assessment. In-situ and laboratory based methods of water quality monitoring are point based and do not provide a synoptic coverage of the lakes. This paper presents novel methods for retrieving water quality parameters in Chivero and Manyame lakes, Zimbabwe, from remotely sensed imagery. Remotely sensed water quality parameters are validated against in-situ data. It also presents an application for automated retrieval of those parameters developed in VB6, as well as a web portal for disseminating the water quality information to relevant stakeholders. The web portal is developed using GeoServer, OpenLayers, and HTML. Results show the spatial variation of water quality and demonstrate an automated remote sensing and GIS system with a web front end for disseminating water quality information.
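
    Empirical retrieval of a water quality parameter from imagery often reduces to a regression against a band ratio, calibrated and validated with in-situ samples as this paper does. A sketch with invented numbers:

      import numpy as np

      # Paired observations (invented): band ratio from imagery vs in-situ turbidity.
      band_ratio = np.array([0.42, 0.55, 0.61, 0.48, 0.70])   # e.g. red/green reflectance
      turbidity = np.array([12.0, 19.5, 23.1, 15.2, 28.4])    # NTU

      slope, intercept = np.polyfit(band_ratio, turbidity, 1)  # calibrate a linear model
      print(f"predicted turbidity at ratio 0.58: {slope * 0.58 + intercept:.1f} NTU")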

  5. Differences in Field Effectiveness and Adoption between a Novel Automated Chlorination System and Household Manual Chlorination of Drinking Water in Dhaka, Bangladesh: A Randomized Controlled Trial

    PubMed Central

    Pickering, Amy J.; Crider, Yoshika; Amin, Nuhu; Bauza, Valerie; Unicomb, Leanne; Davis, Jennifer; Luby, Stephen P.

    2015-01-01

    The number of people served by networked systems that supply intermittent and contaminated drinking water is increasing. In these settings, centralized water treatment is ineffective, while household-level water treatment technologies have not been brought to scale. This study compares a novel low-cost technology designed to passively (automatically) dispense chlorine at shared handpumps with a household-level intervention providing water disinfection tablets (Aquatab), safe water storage containers, and behavior promotion. Twenty compounds were enrolled in Dhaka, Bangladesh, and randomly assigned to one of three groups: passive chlorinator, Aquatabs, or control. Over a 10-month intervention period, the mean percentage of households whose stored drinking water had detectable total chlorine was 75% in compounds with access to the passive chlorinator, 72% in compounds receiving Aquatabs, and 6% in control compounds. Both interventions also significantly improved microbial water quality. Aquatabs usage fell by 50% after behavioral promotion visits concluded, suggesting intensive promotion is necessary for sustained uptake. The study findings suggest high potential for an automated decentralized water treatment system to increase consistent access to clean water in low-income urban communities. PMID:25734448

  6. Differences in field effectiveness and adoption between a novel automated chlorination system and household manual chlorination of drinking water in Dhaka, Bangladesh: a randomized controlled trial.

    PubMed

    Pickering, Amy J; Crider, Yoshika; Amin, Nuhu; Bauza, Valerie; Unicomb, Leanne; Davis, Jennifer; Luby, Stephen P

    2015-01-01

    The number of people served by networked systems that supply intermittent and contaminated drinking water is increasing. In these settings, centralized water treatment is ineffective, while household-level water treatment technologies have not been brought to scale. This study compares a novel low-cost technology designed to passively (automatically) dispense chlorine at shared handpumps with a household-level intervention providing water disinfection tablets (Aquatab), safe water storage containers, and behavior promotion. Twenty compounds were enrolled in Dhaka, Bangladesh, and randomly assigned to one of three groups: passive chlorinator, Aquatabs, or control. Over a 10-month intervention period, the mean percentage of households whose stored drinking water had detectable total chlorine was 75% in compounds with access to the passive chlorinator, 72% in compounds receiving Aquatabs, and 6% in control compounds. Both interventions also significantly improved microbial water quality. Aquatabs usage fell by 50% after behavioral promotion visits concluded, suggesting intensive promotion is necessary for sustained uptake. The study findings suggest high potential for an automated decentralized water treatment system to increase consistent access to clean water in low-income urban communities.

  7. Quantitative structure-activity relationship models that stand the test of time.

    PubMed

    Davis, Andrew M; Wood, David J

    2013-04-01

    The pharmaceutical industry is in a period of intense change. While this has many drivers, attrition through the development process continues to be an important pressure. The emerging definitions of "compound quality" that are based on retrospective analyses of developmental attrition have highlighted a new direction for medicinal chemistry and the paradigm of "quality at the point of design". The time has come for retrospective analyses to catalyze prospective action. Quality at the point of design places pressure on the quality of our predictive models. Empirical QSAR models when built with care provide true predictive control, but their accuracy and precision can be improved. Here we describe AstraZeneca's experience of automation in QSAR model building and validation, and how an informatics system can provide a step-change in predictive power to project design teams, if they choose to use it.

  8. Indicators for the automated analysis of drug prescribing quality.

    PubMed

    Coste, J; Séné, B; Milstein, C; Bouée, S; Venot, A

    1998-01-01

    Irrational and inconsistent drug prescription has considerable impact on morbidity, mortality, health service utilization, and community burden. However, few studies have addressed how to process the information contained in drug orders in order to study the quality of drug prescriptions and prescriber behavior. We present a comprehensive set of quantitative indicators for the quality of drug prescriptions which can be derived from a drug order. These indicators were constructed using explicit a priori criteria which were previously validated on the basis of scientific data. Automatic computation is straightforward, using a relational database system, such that large sets of prescriptions can be processed with minimal human effort. We illustrate the feasibility and value of this approach using a large set of 23,000 prescriptions for several diseases, selected from a nationally representative prescriptions database. This approach may have direct and wide applications in the epidemiology of medical practice and in quality control procedures.
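
    The automatic computation described maps naturally onto queries over a relational database. A minimal sketch of one hypothetical indicator, the share of orders containing a duplicated drug class; the schema and the criterion are illustrative, not the authors' validated set:

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE orders (order_id INTEGER, drug_class TEXT);
      INSERT INTO orders VALUES (1,'ACE'), (1,'ACE'), (2,'BB'), (3,'ACE'), (3,'BB');
      """)
      dup = con.execute("""
          SELECT COUNT(DISTINCT order_id) FROM (
              SELECT order_id FROM orders
              GROUP BY order_id, drug_class HAVING COUNT(*) > 1)
      """).fetchone()[0]
      total = con.execute("SELECT COUNT(DISTINCT order_id) FROM orders").fetchone()[0]
      print(f"orders with a duplicated drug class: {dup}/{total}")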

  9. Automated Blazar Light Curves Using Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Spencer James

    2017-07-27

    This presentation describes a problem and methodology pertaining to automated blazar light curves. Namely, optical variability patterns for blazars require the construction of light curves and in order to generate the light curves, data must be filtered before processing to ensure quality.

  10. Principles of control automation of soil compacting machine operating mechanism

    NASA Astrophysics Data System (ADS)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The relevance of high-quality compaction of soil bases in the erection of embankments and foundations for buildings and structures is presented. The quality of compacted gravel and sandy soils provides the bearing capability and, accordingly, the strength and durability of the constructed buildings. It has been established that compaction quality depends on many external factors, such as surface roughness and soil moisture, and on the granulometry, chemical composition, and degree of elasticity of the original fill soil. An analysis of the technological processes of soil-base compaction in foreign and domestic sources showed that such an important problem as continuous monitoring of the actual degree of soil compaction during machine operation can be solved only with modern means of automation. An effective vibrodynamic method of compacting gravel and sand material for building-structure foundations of various applications is justified and proposed. A method of continuous monitoring of soil compaction by measuring the amplitudes and frequencies of harmonic oscillations on the compacted surface is defined; it allowed the basic elements of the monitoring system for the soil compacting machine's operating mechanisms to be determined: an accelerometer, a bandpass filter, a vibration harmonic analyzer, and an on-board microcontroller. Adjustable parameters have been established to improve the degree of soil compaction and the performance of the soil compacting machine, and the dependences of the adjustable parameters on the overall index, the degree of soil compaction, have been experimentally determined. A structural scheme of automatic control of the soil compacting machine's operating mechanism and its operation algorithm have been developed.
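
    The measurement principle, estimating the amplitudes of harmonics of the drum's excitation frequency from the accelerometer signal, can be sketched on a synthetic trace; the frequencies and amplitudes are invented:

      import numpy as np

      fs = 1000.0                                  # sampling rate, Hz
      t = np.arange(0.0, 1.0, 1.0 / fs)
      f0 = 30.0                                    # drum excitation frequency, Hz
      # Stiffer (better compacted) soil distorts the response, adding harmonics.
      accel = np.sin(2*np.pi*f0*t) + 0.3*np.sin(2*np.pi*2*f0*t)

      amp = 2.0 * np.abs(np.fft.rfft(accel)) / len(t)
      freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
      a1 = amp[np.argmin(np.abs(freqs - f0))]        # fundamental amplitude
      a2 = amp[np.argmin(np.abs(freqs - 2*f0))]      # second-harmonic amplitude
      print(f"harmonic ratio a2/a1 = {a2/a1:.2f}")   # rises as the base stiffens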

  11. An express method for optimally tuning an analog controller with respect to integral quality criteria

    NASA Astrophysics Data System (ADS)

    Golinko, I. M.; Kovrigo, Yu. M.; Kubrak, A. I.

    2014-03-01

    An express method for optimally tuning analog PI and PID controllers is considered. An integral quality criterion that minimizes the control output is proposed for optimizing control systems. The suggested criterion differs from existing ones in that the control output applied to the technological process is taken into account in a correct manner, making it possible to minimize the expenditure of material and/or energy resources when controlling industrial equipment. Organizing control in this manner also reduces wear and extends the service life of control devices. The unimodal nature of the proposed tuning criterion is numerically demonstrated using methods from optimization theory. A functional interrelation between the optimal controller parameters and the dynamic properties of the controlled plant is numerically determined for a single-loop control system. The results of simulating transients in a control system using the proposed and the existing functional dependences are compared. The proposed calculation formulas differ from existing ones in their simple structure and highly accurate search for the optimal controller tuning parameters, and are recommended for use by automation specialists in the design and optimization of control systems.
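
    The criterion can be written as J = ∫(e² + λu²)dt, penalizing the control error e together with the control output u. A numerical sketch that tunes a PI controller on a toy first-order plant by grid search; the plant, λ, and the gain grid are illustrative, and the paper's own closed-form tuning rules are not reproduced here:

      import numpy as np

      def criterion(kp, ki, lam=0.1, dt=0.01, t_end=50.0):
          """J = integral of (e^2 + lam*u^2) dt for a unit setpoint step."""
          y, integ, J = 0.0, 0.0, 0.0
          for _ in np.arange(0.0, t_end, dt):
              e = 1.0 - y                  # control error
              integ += e * dt
              u = kp * e + ki * integ      # PI control law
              y += dt * (u - y) / 5.0      # first-order plant, K = 1, T = 5 s
              J += (e**2 + lam * u**2) * dt
          return J

      grid = [(kp, ki) for kp in np.linspace(0.5, 5, 10)
                       for ki in np.linspace(0.05, 1, 10)]
      print("optimal (Kp, Ki) on grid:", min(grid, key=lambda g: criterion(*g)))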

  12. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.
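
    One representative metric that ChIP-seq QC reports include is FRiP, the fraction of reads falling within called peaks. A toy in-memory sketch of the computation; real pipelines derive it from BAM and peak files, and this is not ChiLin's own code:

      # Toy read midpoints and peak intervals on one chromosome.
      reads = [105, 220, 530, 710, 950]
      peaks = [(100, 250), (700, 800)]

      in_peaks = sum(any(lo <= r <= hi for lo, hi in peaks) for r in reads)
      print(f"FRiP = {in_peaks / len(reads):.2f}")   # 3 of 5 reads in peaks -> 0.60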

  13. Control and Automation of Fluid Flow, Mass Transfer and Chemical Reactions in Microscale Segmented Flow

    NASA Astrophysics Data System (ADS)

    Abolhasani, Milad

    Flowing trains of uniformly sized bubbles/droplets (i.e., segmented flows) and the associated mass transfer enhancement over their single-phase counterparts have been studied extensively during the past fifty years. Although the scaling behaviour of segmented flow formation is increasingly well understood, the predictive adjustment of the desired flow characteristics that influence the mixing and residence times remains a challenge. Currently, a time-consuming, slow and often inconsistent manual manipulation of experimental conditions is required to address this task. In my thesis, I have overcome the above-mentioned challenges and developed an experimental strategy that for the first time provided predictive control over segmented flows in a hands-off manner. A computer-controlled platform that consisted of a real-time image processing module within an integral controller, a silicon-based microreactor and automated fluid delivery technique was designed, implemented and validated. In the first part of my thesis I utilized this approach for the automated screening of physical mass transfer and solubility characteristics of carbon dioxide (CO2) in a physical solvent at a well-defined temperature and pressure and a throughput of 12 conditions per hour. Second, by applying the segmented flow approach to a recently discovered CO2 chemical absorbent, frustrated Lewis pairs (FLPs), I determined the thermodynamic characteristics of the CO2-FLP reaction. Finally, the segmented flow approach was employed for characterization and investigation of the CO2-governed liquid-liquid phase separation process. The second part of my thesis utilized the segmented flow platform for the preparation and shape control of high quality colloidal nanomaterials (e.g., CdSe/CdS) via the automated control of residence times up to approximately 5 minutes. By introducing a novel oscillatory segmented flow concept, I was able to further extend the residence time limitation to 24 hours. A case study of a slow candidate reaction, the etching of gold nanorods during up to five hours, served to illustrate the utility of oscillatory segmented flows in assessing the shape evolution of colloidal nanomaterials on-chip via continuous optical interrogation at only one sensing location. The developed cruise control strategy will enable plug'n play operation of segmented flows in applications that include flow chemistry, material synthesis and in-flow analysis and screening.
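
    The hands-off control idea is, at its core, a feedback loop closed around the image-processing measurement. A toy sketch of integral control of plug length through the pump flow rate; the plant model and gain are invented, not the thesis's calibration:

      # Toy plant: measured plug length (um) grows linearly with pump flow rate.
      setpoint_um = 500.0
      flow_ul_min = 10.0
      ki = 0.002                                   # integral gain (illustrative)

      for _ in range(200):
          measured_um = 40.0 * flow_ul_min + 50.0  # stand-in for the image measurement
          flow_ul_min += ki * (setpoint_um - measured_um)  # integral action

      print(f"converged flow: {flow_ul_min:.2f} uL/min "
            f"-> plug length {40.0 * flow_ul_min + 50.0:.0f} um")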

  14. New developments in automated biosensing from remote water quality stations and satellite data retrieval for resources management

    NASA Astrophysics Data System (ADS)

    Morgan, E. L.; Eagleson, K. W.; Hermann, R.; McCollough, N. D.

    1981-05-01

    Maintaining adequate water quality in a multipurpose drainage system becomes increasingly important as demands on resources become greater. Real-time water quality monitoring plays a crucial role in meeting this objective. In addition to remote automated physical monitoring, developments at the end of the 1970's allow simultaneous real-time measurements of fish breathing response to water quality changes. These advantages complement complex in-stream surveys typically carried out to evaluate the environmental quality of a system. Automated biosensing units having remote capabilities are designed to aid in the evaluation of subtle water quality changes contributing to undesirable conditions in a drainage basin. Using microprocessor-based monitors to measure fish breathing rates, the biosensing units are interfaced to a U.S. National Aeronautics and Space Administration (N.A.S.A.) remote data collection platform for National Oceanic and Atmospheric Administration (N.O.A.A.) GOES satellite retrieval and transmission of data. Simultaneously, multiparameter physical information is collected from site-specific locations and recovered in a similar manner. Real-time biological and physical data received at a data processing center are readily available for interpretation by resource managers. Management schemes incorporating real-time monitoring networks into on-going programs to simultaneously retrieve biological and physical data by satellite, radio and telephone cable give added advantages in maintaining water quality for multipurpose needs.

  15. Automation and Robotics for Space-Based Systems, 1991

    NASA Technical Reports Server (NTRS)

    Williams, Robert L., II (Editor)

    1992-01-01

    The purpose of this in-house workshop was to assess the state-of-the-art of automation and robotics for space operations from an LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.

  16. NASA JSC water monitor system: City of Houston field demonstration

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.; Jeffers, E. L.; Fricks, D. H.

    1979-01-01

    A water quality monitoring system with on-line and real time operation similar to the function in a spacecraft was investigated. A system with the capability to determine conformance to future high effluent quality standards and to increase the potential for reclamation and reuse of water was designed. Although all system capabilities were not verified in the initial field trial, fully automated operation over a sustained period with only routine manual adjustments was accomplished. Two major points were demonstrated: (1) the water monitor system has great potential in water monitoring and/or process control applications; and (2) the water monitor system represents a vast improvement over conventional (grab sample) water monitoring techniques.

  17. Costs and consequences of automated algorithms versus manual grading for the detection of referable diabetic retinopathy.

    PubMed

    Scotland, G S; McNamee, P; Fleming, A D; Goatman, K A; Philip, S; Prescott, G J; Sharp, P F; Williams, G J; Wykes, W; Leese, G P; Olson, J A

    2010-06-01

    To assess the cost-effectiveness of an improved automated grading algorithm for diabetic retinopathy against a previously described algorithm, and in comparison with manual grading. Efficacy of the alternative algorithms was assessed using a reference graded set of images from three screening centres in Scotland (1253 cases with observable/referable retinopathy and 6333 individuals with mild or no retinopathy). Screening outcomes and grading and diagnosis costs were modelled for a cohort of 180 000 people, with prevalence of referable retinopathy at 4%. Algorithm (b), which combines image quality assessment with detection algorithms for microaneurysms (MA), blot haemorrhages and exudates, was compared with a simpler algorithm (a) (using image quality assessment and MA/dot haemorrhage (DH) detection), and the current practice of manual grading. Compared with algorithm (a), algorithm (b) would identify an additional 113 cases of referable retinopathy for an incremental cost of £68 per additional case. Compared with manual grading, automated grading would be expected to identify between 54 and 123 fewer referable cases, for a grading cost saving between £3834 and £1727 per case missed. Extrapolation modelling over a 20-year time horizon suggests manual grading would cost between £25,676 and £267,115 per additional quality adjusted life year gained. Algorithm (b) is more cost-effective than the algorithm based on quality assessment and MA/DH detection. With respect to the value of introducing automated detection systems into screening programmes, automated grading operates within the recommended national standards in Scotland and is likely to be considered a cost-effective alternative to manual disease/no disease grading.
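
    The headline figures follow the standard incremental cost-effectiveness arithmetic: extra cost divided by extra effect. A one-function sketch; the £7,684 total below is back-calculated from the abstract's numbers rather than reported by the study:

      def icer(delta_cost, delta_effect):
          """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
          return delta_cost / delta_effect

      # 113 extra referable cases at 68 GBP each -> 7,684 GBP incremental cost.
      print(icer(delta_cost=113 * 68, delta_effect=113))   # 68.0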

  18. The ANU WellBeing study: a protocol for a quasi-factorial randomised controlled trial of the effectiveness of an Internet support group and an automated Internet intervention for depression

    PubMed Central

    2010-01-01

    Background Recent projections suggest that by the year 2030 depression will be the primary cause of disease burden among developed countries. Delivery of accessible consumer-focused evidenced-based services may be an important element in reducing this burden. Many consumers report a preference for self-help modes of delivery. The Internet offers a promising modality for delivering such services and there is now evidence that automated professionally developed self-help psychological interventions can be effective. By contrast, despite their popularity, there is little evidence as to the effectiveness of Internet support groups which provide peer-to-peer mutual support. Methods/Design Members of the community with elevated psychological distress were randomised to receive one of the following: (1) Internet Support Group (ISG) intervention, (2) a multi-module automated psychoeducational and skills Internet Training Program (ITP), (3) a combination of the ISG and ITP, or (4) an Internet Attention Control website (IAC) comprising health and wellbeing information and question and answer modules. Each intervention was 12 weeks long. Assessments were conducted at baseline, post-intervention, 6 and 12 months to examine depressive symptoms, social support, self-esteem, quality of life, depression literacy, stigma and help-seeking for depression. Participants were recruited through a screening postal survey sent to 70,000 Australians aged 18 to 65 years randomly selected from four rural and four metropolitan regions in Australia. Discussion To our knowledge this study is the first randomised controlled trial of the effectiveness of a depression ISG. Trial registration Current Controlled Trials ISRCTN65657330. PMID:20211025

  19. Automation Applications in an Advanced Air Traffic Management System : Volume 5A. DELTA Simulation Model - User's Guide

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  20. Complacency and Automation Bias in the Use of Imperfect Automation.

    PubMed

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  1. Defining the drivers for accepting decision making automation in air traffic management.

    PubMed

    Bekier, Marek; Molesworth, Brett R C; Williamson, Ann

    2011-04-01

    Air Traffic Management (ATM) operators are under increasing pressure to improve the efficiency of their operation to cater for forecasted increases in air traffic movements. One solution involves increasing the utilisation of automation within the ATM system. The success of this approach is contingent on Air Traffic Control Operators' (ATCOs) willingness to accept increased levels of automation. The main aim of the present research was to examine the drivers underpinning ATCOs' willingness to accept increased utilisation of automation within their role. Two fictitious scenarios involving the application of two new automated decision-making tools were created. The results of an online survey revealed traditional predictors of automation acceptance such as age, trust and job satisfaction explain between 4 and 7% of the variance. Furthermore, these predictors varied depending on the purpose in which the automation was to be employed. These results are discussed from an applied and theoretical perspective. STATEMENT OF RELEVANCE: Efficiency improvements in ATM are required to cater for forecasted increases in air traffic movements. One solution is to increase the utilisation of automation within Air Traffic Control. The present research examines the drivers underpinning air traffic controllers' willingness to accept increased levels of automation in their role.

  2. Human Factors Assessment: The Passive Final Approach Spacing Tool (pFAST) Operational Evaluation

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Sanford, Beverly D.

    1998-01-01

    Automation to assist air traffic controllers in the current terminal and en route air traffic environments is being developed at Ames Research Center in conjunction with the Federal Aviation Administration. This automation, known collectively as the Center-TRACON Automation System (CTAS), provides decision-making assistance to air traffic controllers through computer-generated advisories. One of the CTAS tools developed specifically to assist terminal area air traffic controllers is the Passive Final Approach Spacing Tool (pFAST). An operational evaluation of pFAST was conducted at the Dallas/Ft. Worth, Texas, Terminal Radar Approach Control (TRACON) facility. Human factors data collected during the test describe the impact of the automation upon the air traffic controller in terms of perceived workload and acceptance. Results showed that controller self-reported workload was not significantly increased or reduced by the pFAST automation; rather, controllers reported that the levels of workload remained primarily the same. Controller coordination and communication data were analyzed, and significant differences in the nature of controller coordination were found. Controller acceptance ratings indicated that pFAST was acceptable. This report describes the human factors data and results from the 1996 Operational Field Evaluation of Passive FAST.

  3. Comparison of pressure-controlled provocation discography using automated versus manual syringe pump manometry in patients with chronic low back pain.

    PubMed

    Derby, Richard; Lee, Sang Hoon; Lee, Jeong-Eun; Lee, Sang-Heon

    2011-01-01

    The study compares the rate of positive discograms using an automated versus a manual pressure-controlled injection device, and compares pressure and volume values at initial evoked pain and at 6/10 or greater evoked pain. This retrospective study used patient data prospectively collected in a prior prospective study, together with data routinely collected under our institutional standardized audit protocol. Two custom-built disc manometers (automated injection speed control; manual injection speed control) were sequentially employed during provocation discography in 510 discs of 151 consecutive patients. Two hundred thirty-seven discs of 67 patients with chronic low back pain were evaluated using the automated manometer (automated group) and 273 discs of 84 patients were evaluated with a manual manometer (manual group). No significant differences in positive discogram rates were found between the automated and manual groups (32.1% vs 32.6% per disc, respectively, P>0.05). No significant differences in low-pressure positive discogram rates were found (16.0% vs 15.0% per disc, automated group versus manual group, respectively, P>0.05). However, there were significantly increased volumes and lower pressures at initial and "bad" pain provocation. The study results found equivalent positive discogram rates following a series of pressure-controlled discography using either an automated or a manual pressure device. There were, however, significant increases in volume at both initial onset of evoked pain and at 6/10 pain when using the automated injection device, which may have caused the observed lower opening pressure and lower pressure values at initial evoked pain. Assuming increased volumes are innocuous, automated injection is inherently more controlled and may better reduce unintended and often unrecorded spurious high dynamic pressure peaks, thereby reducing conscious and unconscious operator bias. Wiley Periodicals, Inc.

  4. Automating Quality Measures for Heart Failure Using Natural Language Processing: A Descriptive Study in the Department of Veterans Affairs

    PubMed Central

    Kim, Youngjun; Gobbel, Glenn Temple; Matheny, Michael E; Redd, Andrew; Bray, Bruce E; Heidenreich, Paul; Bolton, Dan; Heavirland, Julia; Kelly, Natalie; Reeves, Ruth; Kalsy, Megha; Goldstein, Mary Kane; Meystre, Stephane M

    2018-01-01

    Background We developed an accurate, stakeholder-informed, automated, natural language processing (NLP) system to measure the quality of heart failure (HF) inpatient care, and explored the potential for adoption of this system within an integrated health care system. Objective To accurately automate a United States Department of Veterans Affairs (VA) quality measure for inpatients with HF. Methods We automated the HF quality measure Congestive Heart Failure Inpatient Measure 19 (CHI19) that identifies whether a given patient has left ventricular ejection fraction (LVEF) <40%, and if so, whether an angiotensin-converting enzyme inhibitor or angiotensin-receptor blocker was prescribed at discharge if there were no contraindications. We used documents from 1083 unique inpatients from eight VA medical centers to develop a reference standard (RS) to train (n=314) and test (n=769) the Congestive Heart Failure Information Extraction Framework (CHIEF). We also conducted semi-structured interviews (n=15) for stakeholder feedback on implementation of the CHIEF. Results The CHIEF classified each hospitalization in the test set with a sensitivity (SN) of 98.9% and positive predictive value of 98.7%, compared with an RS and SN of 98.5% for available External Peer Review Program assessments. Of the 1083 patients available for the NLP system, the CHIEF evaluated and classified 100% of cases. Stakeholders identified potential implementation facilitators and clinical uses of the CHIEF. Conclusions The CHIEF provided complete data for all patients in the cohort and could potentially improve the efficiency, timeliness, and utility of HF quality measurements. PMID:29335238
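
    The reported metrics come from the standard confusion-matrix formulas. A sketch with synthetic counts chosen only to be consistent with the abstract's 98.9% sensitivity and 98.7% positive predictive value:

      # Synthetic confusion counts (not the study's raw data).
      tp, fp, fn = 890, 12, 10

      sensitivity = tp / (tp + fn)   # TP / (TP + FN) = 98.9%
      ppv = tp / (tp + fp)           # TP / (TP + FP) = 98.7%
      print(f"sensitivity = {sensitivity:.1%}, PPV = {ppv:.1%}")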

  5. The Use of AMET & Automated Scripts for Model Evaluation

    EPA Science Inventory

    Brief overview of EPA’s new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  6. Automated Bus Diagnostic System Demonstration in New York City

    DOT National Transportation Integrated Search

    1983-12-01

    In response to a growing problem with the quality and efficiency of nationwide bus maintenance practices, an award was granted to the Tri-State Regional Planning Commission for the testing of an automated bus diagnostic system (ABDS). The ABDS was de...

  7. Automated measurement of cell motility and proliferation

    PubMed Central

    Bahnson, Alfred; Athanassiou, Charalambos; Koebler, Douglas; Qian, Lei; Shun, Tongying; Shields, Donna; Yu, Hui; Wang, Hong; Goff, Julie; Cheng, Tao; Houck, Raymond; Cowsert, Lex

    2005-01-01

    Background Time-lapse microscopic imaging provides a powerful approach for following changes in cell phenotype over time. Visible responses of whole cells can yield insight into functional changes that underlie physiological processes in health and disease. For example, features of cell motility accompany molecular changes that are central to the immune response, to carcinogenesis and metastasis, to wound healing and tissue regeneration, and to the myriad developmental processes that generate an organism. Previously reported image processing methods for motility analysis required custom viewing devices and manual interactions that may introduce bias, that slow throughput, and that constrain the scope of experiments in terms of the number of treatment variables, time period of observation, replication and statistical options. Here we describe a fully automated system in which images are acquired 24/7 from 384 well plates and are automatically processed to yield high-content motility and morphological data. Results We have applied this technology to study the effects of different extracellular matrix compounds on human osteoblast-like cell lines to explore functional changes that may underlie processes involved in bone formation and maintenance. We show dose-response and kinetic data for induction of increased motility by laminin and collagen type I without significant effects on growth rate. Differential motility response was evident within 4 hours of plating cells; long-term responses differed depending upon cell type and surface coating. Average velocities were increased approximately 0.1 um/min by ten-fold increases in laminin coating concentration in some cases. Comparison with manual tracking demonstrated the accuracy of the automated method and highlighted the comparative imprecision of human tracking for analysis of cell motility data. Quality statistics are reported that associate with stage noise, interference by non-cell objects, and uncertainty in the outlining and positioning of cells by automated image analysis. Exponential growth, as monitored by total cell area, did not linearly correlate with absolute cell number, but proved valuable for selection of reliable tracking data and for disclosing between-experiment variations in cell growth. Conclusion These results demonstrate the applicability of a system that uses fully automated image acquisition and analysis to study cell motility and growth. Cellular motility response is determined in an unbiased and comparatively high throughput manner. Abundant ancillary data provide opportunities for uniform filtering according to criteria that select for biological relevance and for providing insight into features of system performance. Data quality measures have been developed that can serve as a basis for the design and quality control of experiments that are facilitated by automation and the 384 well plate format. This system is applicable to large-scale studies such as drug screening and research into effects of complex combinations of factors and matrices on cell phenotype. PMID:15831094
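
    The core motility measurement reduces to the average speed of a tracked centroid across the time-lapse sequence. A sketch with a synthetic four-frame track; the pixel size and frame interval are assumed calibration values:

      import numpy as np

      um_per_px, min_per_frame = 0.65, 10.0           # assumed calibration
      track = np.array([[10.0, 12.0], [11.2, 12.8],
                        [12.5, 13.1], [13.9, 14.0]])  # centroid per frame, px

      steps_um = np.linalg.norm(np.diff(track, axis=0), axis=1) * um_per_px
      speed = steps_um.sum() / (min_per_frame * len(steps_um))
      print(f"average speed: {speed:.3f} um/min")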

  8. Sensors control gas metal arc welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siewert, T.A.; Madigan, R.B.; Quinn, T.P.

    1997-04-01

    The response time of a trained welder from the time a weld problem is identified to the time action is taken is about one second--especially after a long, uneventful period of welding. This is acceptable for manual welding because it is close to the time it takes for the weld pool to solidify. If human response time were any slower, manual welding would not be possible. However, human response time is too slow to respond to some weld events, such as melting of the contact tube in gas metal arc welding (GMAW), and only automated intelligent control systems can react fast enough to correct or avoid these problems. Control systems incorporate welding knowledge that enables intelligent decisions to be made about weld quality and, ultimately, to keep welding parameters in the range where only high-quality welds are produced. This article discusses the correlation of electrical signals with contact-tube wear, changes in shielding gas, changes in arc length, and other weld process data.
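
    A control system of this kind can flag a weld event from the electrical signal far faster than a human's roughly one-second response, for example by thresholding a short rolling variance of the arc voltage. A sketch on a synthetic signal; the sampling rate, disturbance, and threshold are invented:

      import numpy as np

      fs = 5000                                        # samples per second
      rng = np.random.default_rng(1)
      volts = rng.normal(24.0, 0.2, fs)                # 1 s of arc-voltage samples
      volts[3000:3100] += np.linspace(0.0, 4.0, 100)   # simulated weld disturbance

      win = 50                                         # 10 ms rolling window
      var = np.array([volts[i:i + win].var() for i in range(len(volts) - win)])
      alarm = int(np.argmax(var > 0.2))                # first window above threshold
      print(f"event flagged {alarm / fs * 1000:.0f} ms into the weld")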

  9. Automated Update, Revision, and Quality Control of the Maize Genome Annotations Using MAKER-P Improves the B73 RefGen_v3 Gene Models and Identifies New Genes

    PubMed Central

    Law, MeiYee; Childs, Kevin L.; Campbell, Michael S.; Stein, Joshua C.; Olson, Andrew J.; Holt, Carson; Panchy, Nicholas; Lei, Jikai; Jiao, Dian; Andorf, Carson M.; Lawrence, Carolyn J.; Ware, Doreen; Shiu, Shin-Han; Sun, Yanni; Jiang, Ning; Yandell, Mark

    2015-01-01

    The large size and relative complexity of many plant genomes make creation, quality control, and dissemination of high-quality gene structure annotations challenging. In response, we have developed MAKER-P, a fast and easy-to-use genome annotation engine for plants. Here, we report the use of MAKER-P to update and revise the maize (Zea mays) B73 RefGen_v3 annotation build (5b+) in less than 3 h using the iPlant Cyberinfrastructure. MAKER-P identified and annotated 4,466 additional, well-supported protein-coding genes not present in the 5b+ annotation build, added additional untranslated regions to 1,393 5b+ gene models, identified 2,647 5b+ gene models that lack any supporting evidence (despite the use of large and diverse evidence data sets), identified 104,215 pseudogene fragments, and created an additional 2,522 noncoding gene annotations. We also describe a method for de novo training of MAKER-P for the annotation of newly sequenced grass genomes. Collectively, these results lead to the 6a maize genome annotation and demonstrate the utility of MAKER-P for rapid annotation, management, and quality control of grasses and other difficult-to-annotate plant genomes. PMID:25384563

  10. Performance Monitoring of Chilled-Water Distribution Systems Using HVAC-Cx

    PubMed Central

    Ferretti, Natascha Milesi; Galler, Michael A.; Bushby, Steven T.

    2017-01-01

    In this research we develop, test, and demonstrate the newest extension of the software HVAC-Cx (NIST and CSTB 2014), an automated commissioning tool for detecting common mechanical faults and control errors in chilled-water distribution systems (loops). The commissioning process can improve occupant comfort, ensure the persistence of correct system operation, and reduce energy consumption. Automated tools support the process by decreasing the time and the skill level required to carry out necessary quality assurance measures, and as a result they enable more thorough testing of building heating, ventilating, and air-conditioning (HVAC) systems. This paper describes the algorithm, developed by the National Institute of Standards and Technology (NIST), to analyze chilled-water loops and presents the results of a passive monitoring investigation using field data obtained from BACnet® (ASHRAE 2016) controllers, together with field validation of the findings. The tool was successful in detecting faults in system operation in its first field implementation supporting the investigation phase through performance monitoring. Its findings led to a full energy retrocommissioning of the field site. PMID:29167584
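
    The paper's detection logic is not reproduced in the abstract; as a generic illustration of the rule-based checks such commissioning tools automate, the Python sketch below flags one plausible chilled-water fault (the rule, threshold, and signal names are assumptions, not the HVAC-Cx algorithm).

        def check_valve_leakage(valve_cmd_pct, t_supply_c, t_return_c, dt_max=1.5):
            """Flag possible flow through a chilled-water valve commanded closed.

            A significant water-side temperature rise across the coil while the
            valve command is 0% suggests a leaking or stuck valve.
            """
            if valve_cmd_pct == 0 and (t_return_c - t_supply_c) > dt_max:
                return "FAULT: temperature rise with valve commanded closed"
            return "OK"

        print(check_valve_leakage(0, t_supply_c=7.0, t_return_c=11.2))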

  11. A modern depleted uranium manufacturing facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zagula, T.A.

    1995-07-01

    The Specific Manufacturing Capabilities (SMC) Project located at the Idaho National Engineering Laboratory (INEL) and operated by Lockheed Martin Idaho Technologies Co. (LMIT) for the Department of Energy (DOE) manufactures depleted uranium for use in the U.S. Army M1A2 Abrams Heavy Tank Armor Program. Since 1986, SMC has fabricated more than 12 million pounds of depleted uranium (DU) products in a multitude of shapes and sizes with varying metallurgical properties while maintaining security, environmental, health and safety requirements. During initial facility design in the early 1980s, emphasis on employee safety, radiation control and environmental consciousness was gaining momentum throughout the DOE complex. This fact coupled with security and production requirements forced design efforts to focus on incorporating automation, local containment and computerized material accountability at all work stations. The result was a fully automated production facility engineered to manufacture DU armor packages with virtually no human contact while maintaining security, traceability and quality requirements. This hands-off approach to handling depleted uranium resulted in minimal radiation exposures and employee injuries. Construction of the manufacturing facility was complete in early 1986 with the first armor package certified in October 1986. Rolling facility construction was completed in 1987 with the first certified plate produced in the fall of 1988. Since 1988 the rolling and manufacturing facilities have delivered more than 2600 armor packages on schedule with 100% final product quality acceptance. During this period there was an annual average of only 2.2 lost time incidents and a single individual maximum radiation exposure of 150 mrem. SMC is an example of designing and operating a facility that meets regulatory requirements with respect to national security, radiation control and personnel safety while achieving production schedules and product quality.

  12. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Barabash, Oleg M

    2011-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fuelled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are being driven by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of the economy often generates great needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately material property management systems have kept pace with the growing user demands and evolved into versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data pedigree traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance are discussed.
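
    As an illustration of facilities (i) and (iii) above, a stripped-down record type carrying version, quality, and pedigree fields might look like the following Python sketch; the field names are invented and not drawn from any particular system.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class MaterialPropertyRecord:
            material: str          # e.g. "Alloy 617"
            property_name: str     # e.g. "yield strength"
            value: float
            unit: str
            version: int           # incremented on each re-release
            quality_flag: str      # e.g. "raw", "reviewed", "approved"
            pedigree: tuple = ()   # chain of source-record IDs for traceability

        rec = MaterialPropertyRecord("Alloy 617", "yield strength", 310.0, "MPa",
                                     version=2, quality_flag="approved",
                                     pedigree=("test-0042", "report-0007"))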

  13. MilxXplore: a web-based system to explore large imaging datasets.

    PubMed

    Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J

    2013-01-01

    As large-scale medical imaging studies become more common, there is an increasing reliance on automated software to extract quantitative information from these images. As cohort sizes keep increasing in large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user friendly, collaborative and efficient way. Compared to existing software solutions that often provide an overview of the results at the subject level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparison of the results against the rest of the population. MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis.

  14. Automated visual inspection for polished stone manufacture

    NASA Astrophysics Data System (ADS)

    Smith, Melvyn L.; Smith, Lyndon N.

    2003-05-01

    Increased globalisation of the ornamental stone market has led to increased competition and more rigorous product quality requirements. As such, there are strong motivations to introduce new, more effective inspection technologies that will help stone processors to reduce costs, improve quality and improve productivity. Natural stone surfaces may contain a mixture of complex two-dimensional (2D) patterns and three-dimensional (3D) features. The challenge in terms of automated inspection is to develop systems able to reliably identify 3D topographic defects, either naturally occurring or resulting from polishing, in the presence of concomitant complex 2D stochastic colour patterns. The resulting real-time analysis of the defects may be used in adaptive process control, in order to avoid the wasteful production of defective product. An innovative approach, using structured light and based upon an adaptation of the photometric stereo method, has been pioneered and developed at UWE to isolate and characterize mixed 2D and 3D surface features. The method is able to undertake tasks considered beyond the capabilities of existing surface inspection techniques. The approach has been successfully applied to real stone samples, and a selection of experimental results is presented.
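
    The adapted photometric stereo method rests on a compact linear model: under Lambertian reflectance, pixel intensity is I = L·(albedo·n) for known light directions L, so three or more differently lit images determine both the surface normal n (the 3D topography) and the albedo (the 2D colour pattern). A minimal numpy sketch of that recovery, with placeholder inputs, follows.

        import numpy as np

        def photometric_stereo(images, lights):
            """Recover per-pixel normals and albedo from k >= 3 lit images.

            images: (k, h, w) intensity stack; lights: (k, 3) unit light directions.
            Solves I = L @ g per pixel (least squares), with g = albedo * normal.
            """
            k, h, w = images.shape
            I = images.reshape(k, -1)                        # (k, h*w)
            g, *_ = np.linalg.lstsq(lights, I, rcond=None)   # (3, h*w)
            albedo = np.linalg.norm(g, axis=0) + 1e-12
            normals = (g / albedo).T.reshape(h, w, 3)
            return normals, albedo.reshape(h, w)

    Separating the two outputs is what allows topographic (3D) defects to be detected independently of the stochastic (2D) colour pattern, which remains in the albedo map.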

  15. Xenon International Automated Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-08-05

    The Xenon International Automated Control software monitors, displays status, and allows for manual operator control as well as fully automatic control of multiple commercial and PNNL designed hardware components to generate and transmit atmospheric radioxenon concentration measurements every six hours.

  16. USSR Report: Cybernetics, Computers and Automation Technology. No. 69.

    DTIC Science & Technology

    1983-05-06

    computers in multiprocessor and multistation design, control and scientific research automation systems. The results of comparing the efficiency of... [Podvizhnaya, Scientific Research Institute of Control Computers, Severodonetsk] [Text] The most significant change in the design of the SM-2M compared to... UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, Nov-Dec 82, p 95. APPLICATIONS: Kiev Automated Control System, Design Features and Prospects for Development (V. A...

  17. Rapid evaluation and quality control of next generation sequencing data with FaQCs.

    PubMed

    Lo, Chien-Chi; Chain, Patrick S G

    2014-11-19

    Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
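
    The abstract does not give FaQCs' internals, but the core of any read-trimming step is easy to state. The Python sketch below trims a read's 3' end once a sliding-window mean Phred score falls below a threshold; the window length and threshold are arbitrary illustrative choices.

        def trim_3prime(seq, quals, min_q=20, window=5):
            """Trim the 3' end where mean Phred quality drops below min_q.

            seq: base string; quals: per-base Phred scores. Returns (seq, quals).
            """
            end = len(seq)
            while end >= window:
                if sum(quals[end - window:end]) / window >= min_q:
                    break
                end -= 1
            return seq[:end], quals[:end]

        print(trim_3prime("ACGTACGTAC", [38, 37, 36, 35, 30, 22, 12, 8, 5, 2]))
        # -> ('ACGTACGT', [38, 37, 36, 35, 30, 22, 12, 8])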

  18. Automated Cryocooler Monitor and Control System Software

    NASA Technical Reports Server (NTRS)

    Britchcliffe, Michael J.; Conroy, Bruce L.; Anderson, Paul E.; Wilson, Ahmad

    2011-01-01

    This software is used in an automated cryogenic control system developed to monitor and control the operation of small-scale cryocoolers. The system was designed to automate the cryogenically cooled low-noise amplifier system described in "Automated Cryocooler Monitor and Control System" (NPO-47246), NASA Tech Briefs, Vol. 35, No. 5 (May 2011), page 7a. The software contains algorithms necessary to convert non-linear output voltages from the cryogenic diode-type thermometers and vacuum pressure and helium pressure sensors, to temperature and pressure units. The control function algorithms use the monitor data to control the cooler power, vacuum solenoid, vacuum pump, and electrical warm-up heaters. The control algorithms are based on a rule-based system that activates the required device based on the operating mode. The external interface is Web-based. It acts as a Web server, providing pages for monitor, control, and configuration. No client software from the external user is required.
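
    Two of the described pieces, table-based linearization of a non-linear sensor and mode-keyed rules, can be sketched briefly in Python; the calibration points, modes, and rules below are invented for illustration and are not the NPO-47246 software.

        import bisect

        # Hypothetical diode calibration: (voltage V, temperature K); diode
        # voltage rises as temperature falls, so the voltage column ascends.
        CAL = [(0.50, 300.0), (1.00, 77.0), (1.40, 20.0), (1.63, 4.2)]

        def diode_temp(v):
            """Piecewise-linear interpolation of temperature from diode voltage."""
            volts = [p[0] for p in CAL]
            i = max(1, min(bisect.bisect_left(volts, v), len(CAL) - 1))
            (v0, t0), (v1, t1) = CAL[i - 1], CAL[i]
            return t0 + (t1 - t0) * (v - v0) / (v1 - v0)

        def control_step(mode, temp_k, vac_torr):
            """One pass of a rule-based controller keyed to the operating mode."""
            if mode == "cooldown" and vac_torr > 1e-4:
                return "run vacuum pump"       # pump down before cooling
            if mode == "cooldown" and temp_k > 15.0:
                return "cooler power on"
            if mode == "warmup":
                return "heaters on, cooler off"
            return "hold"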

  19. An evolutionary view of chromatography data systems used in bioanalysis.

    PubMed

    McDowall, R D

    2010-02-01

    This is a personal view of how chromatographic peak measurement and analyte quantification for bioanalysis have evolved from the manual methods of 1970 to the electronic working possible in 2010. In four decades there have been major changes from a simple chart recorder output (that was interpreted and quantified manually) through simple automation of peak measurement, calculation of standard curves and quality control values and instrument control to the networked chromatography data systems of today that are capable of interfacing with Laboratory Information Management Systems and other IT applications. The incorporation of electronic signatures to meet regulatory requirements offers a great opportunity for business improvement and electronic working.

  1. Task Analysis and Descriptions of Required Job Competencies of Robotics/Automated Systems Technicians. Outlines for New Courses and Modules.

    ERIC Educational Resources Information Center

    Hull, Daniel M.; Lovett, James E.

    The six new robotics and automated systems specialty courses developed by the Robotics/Automated Systems Technician (RAST) project are described in this publication. Course titles are Fundamentals of Robotics and Automated Systems, Automated Systems and Support Components, Controllers for Robots and Automated Systems, Robotics and Automated…

  2. Sigma metrics used to assess analytical quality of clinical chemistry assays: importance of the allowable total error (TEa) target.

    PubMed

    Hens, Koen; Berth, Mario; Armbruster, Dave; Westgard, Sten

    2014-07-01

    Six Sigma metrics were used to assess the analytical quality of automated clinical chemistry and immunoassay tests in a large Belgian clinical laboratory and to explore the importance of the source used for estimation of the allowable total error. Clinical laboratories are continually challenged to maintain analytical quality. However, it is difficult to measure assay quality objectively and quantitatively. The Sigma metric is a single number that estimates quality based on the traditional parameters used in the clinical laboratory: allowable total error (TEa), precision and bias. In this study, Sigma metrics were calculated for 41 clinical chemistry assays for serum and urine on five ARCHITECT c16000 chemistry analyzers. Controls at two analyte concentrations were tested and Sigma metrics were calculated using three different TEa targets (Ricos biological variability, CLIA, and RiliBÄK). Sigma metrics varied with analyte concentration, the TEa target, and between/among analyzers. Sigma values identified those assays that are analytically robust and require minimal quality control rules and those that exhibit more variability and require more complex rules. The analyzer to analyzer variability was assessed on the basis of Sigma metrics. Six Sigma is a more efficient way to control quality, but the lack of TEa targets for many analytes and the sometimes inconsistent TEa targets from different sources are important variables for the interpretation and the application of Sigma metrics in a routine clinical laboratory. Sigma metrics are a valuable means of comparing the analytical quality of two or more analyzers to ensure the comparability of patient test results.
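
    The metric itself is a one-line calculation, conventionally Sigma = (TEa - |bias|) / CV, with all three terms expressed as percentages; the Python sketch below applies it to made-up values (the study's own data are not reproduced).

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            """Sigma = (TEa - |bias|) / CV, all inputs in percent."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        # e.g. TEa 10%, bias 1.5%, CV 2%  ->  (10 - 1.5) / 2 = 4.25 sigma
        print(sigma_metric(10.0, 1.5, 2.0))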

  3. Setup, Validation and Quality Control of a Centralized WGS laboratory - Lessons Learned.

    PubMed

    Arnold, Cath; Edwards, Kirstin; Desai, Meeta; Platt, Steve; Green, Jonathan; Conway, David

    2018-04-25

    Routine use of whole-genome analysis for infectious diseases can enlighten various scenarios pertaining to public health, including identification of microbial pathogens; relating individual cases to an outbreak of infectious disease; establishing an association between an outbreak of food poisoning and a specific food vehicle; inferring drug susceptibility; source tracing of contaminants; and studying how variations in the genome affect pathogenicity/virulence. We describe the setup, validation and ongoing verification of a centralised WGS laboratory to carry out the sequencing for these public health functions for the National Infection Service, Public Health England, in the UK. The performance characteristics and quality control metrics measured during validation and verification of the entire end-to-end process (accuracy, precision, reproducibility and repeatability) are described and include information regarding the automated pass and release of data to service users without intervention. © Crown copyright 2018.

  4. The design of the automated control system for warehouse equipment under radio-electronic manufacturing

    NASA Astrophysics Data System (ADS)

    Kapulin, D. V.; Chemidov, I. V.; Kazantsev, M. A.

    2017-01-01

    In this paper, the design, development and implementation of an automated control system for warehousing within the manufacturing process of the radio-electronic enterprise JSC «Radiosvyaz» are discussed. The architecture proposed in the paper consists of a server connected to two physically separated information networks: one containing the database server, which stores information about picking orders, and the other containing the automated storage and retrieval system. This separation implements the requirements for differentiation of access and satisfies information safety and security requirements. The efficiency of the developed automated solutions in optimizing the warehouse's logistic characteristics is also evaluated.

  5. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in procedures, both in the air and on the ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  6. Problems of Automation and Management Principles Information Flow in Manufacturing

    NASA Astrophysics Data System (ADS)

    Grigoryuk, E. N.; Bulkin, V. V.

    2017-07-01

    Automated process control systems are complex systems characterized by elements with a common purpose, by the systemic nature of the algorithms they implement for exchanging and processing information, and by a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing parallels between them by identifying their strengths and weaknesses, and proposes a non-standard process control system.

  7. The Magic Ear: Another Approach to Automated Classroom Control.

    ERIC Educational Resources Information Center

    George, James R., III; And Others

    "Excessive" noise outburst behavior of 24 second graders was effectively controlled under automated stimulus conditions. A voice operated relay transmitted signals to an automated combination light display and outburst time/total running time meters; under 2 conditions, the light display functioned first as a primary, then as a secondary…

  8. Validation of the Sysmex SP-1000i automated slide preparer-stainer in a clinical laboratory

    PubMed Central

    de Bitencourt, Eberson Damião dos Santos; Voegeli, Carlos Franco; Onzi, Gabriela dos Santos; Boscato, Sara Cardoso; Ghem, Carine; Munhoz, Terezinha

    2013-01-01

    Background The speed and quality of information have become essential items in the release of laboratory reports. The Sysmex® SP-1000i device has been developed to prepare and stain smear slides. However, for a device to be cleared for use in the laboratory routine it must pass through a validation process. Objective To evaluate the performance and reliability of the Sysmex® SP-1000i slide preparer-stainer incorporated into the routine of a hospital laboratory in Porto Alegre. Methods Peripheral blood samples of patients attending the laboratory for ambulatory exams with leukocyte counts between 7000/µL and 12,000/µL were evaluated, independent of gender and age. Two slides were prepared for each sample using the Sysmex® SP-1000i equipment; one of the slides was used to perform quality control tests using the CellaVision® DM96 device, and the other slide was used to compare the pre-classification by the same device with the classification performed by a pharmacist-biochemist. Results The results of all the slides used as controls were acceptable according to the quality control test as established by the manufacturer of the device. In the comparison between the automated pre-classification and the classification made by the professional, there was an acceptable variation in the differential counts of leukocytes for 90% of the analyzed slides. Pearson correlation coefficients showed a strong correlation for band neutrophils (r = 0.802; p-value < 0.001), segmented neutrophils (r = 0.963; p-value < 0.001), eosinophils (r = 0.958; p-value < 0.001), lymphocytes (r = 0.985; p-value < 0.001) and atypical lymphocytes (r = 0.866; p-value < 0.001) between the two methods. The red blood cell analysis was adequate for all slides analyzed by the equipment and by the professional. Conclusion The new Sysmex® SP-1000i methodology was found to be reliable, fast and safe for the routines of medium and large laboratories, improving the quality of microscopic analysis in complete blood counts. PMID:24478606
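
    The method comparison reduces to correlating paired differential counts; a minimal Python sketch with made-up data (the study's data are not reproduced) shows the calculation behind the reported r values.

        from statistics import correlation  # Python 3.10+, Pearson by default

        automated = [62, 58, 65, 60, 55, 70]   # hypothetical segmented-neutrophil %
        manual = [60, 59, 66, 58, 56, 69]
        print(round(correlation(automated, manual), 3))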

  9. Ethics, finance, and automation: a preliminary survey of problems in high frequency trading.

    PubMed

    Davis, Michael; Kumiega, Andrew; Van Vliet, Ben

    2013-09-01

    All of finance is now automated, most notably high frequency trading. This paper examines the ethical implications of this fact. As automation is an interdisciplinary endeavor, we argue that the interfaces between the respective disciplines can lead to conflicting ethical perspectives; we also argue that existing disciplinary standards do not pay enough attention to the ethical problems automation generates. Conflicting perspectives undermine the protection those who rely on trading should have. Ethics in finance can be expanded to include organizational and industry-wide responsibilities to external market participants and society. As a starting point, quality management techniques can provide a foundation for a new cross-disciplinary ethical standard in the age of automation.

  10. Automatic control of pressure support for ventilator weaning in surgical intensive care patients.

    PubMed

    Schädler, Dirk; Engel, Christoph; Elke, Gunnar; Pulletz, Sven; Haake, Nils; Frerichs, Inéz; Zick, Günther; Scholz, Jens; Weiler, Norbert

    2012-03-15

    Despite its ability to reduce overall ventilation time, protocol-guided weaning from mechanical ventilation is not routinely used in daily clinical practice. Clinical implementation of weaning protocols could be facilitated by integration of knowledge-based, closed-loop controlled protocols into respirators. To determine whether automated weaning decreases overall ventilation time compared with weaning based on a standardized written protocol in an unselected surgical patient population. In this prospective controlled trial patients ventilated for longer than 9 hours were randomly allocated to receive either weaning with automatic control of pressure support ventilation (automated-weaning group) or weaning based on a standardized written protocol (control group) using the same ventilation mode. The primary end point of the study was overall ventilation time. Overall ventilation time (median [25th and 75th percentile]) did not significantly differ between the automated-weaning (31 [19-101] h; n = 150) and control groups (39 [20-118] h; n = 150; P = 0.178). Patients who underwent cardiac surgery (n = 132) exhibited significantly shorter overall ventilation times in the automated-weaning (24 [18-57] h) than in the control group (35 [20-93] h; P = 0.035). The automated-weaning group exhibited shorter ventilation times until the first spontaneous breathing trial (1 [0-15] vs. 9 [1-51] h; P = 0.001) and a trend toward fewer tracheostomies (17 vs. 28; P = 0.075). Overall ventilation times did not significantly differ between weaning using automatic control of pressure support ventilation and weaning based on a standardized written protocol. Patients after cardiac surgery may benefit from automated weaning. Implementation of additional control variables besides the level of pressure support may further improve automated-weaning systems. Clinical trial registered with www.clinicaltrials.gov (NCT 00445289).

  11. A LEAN approach toward automated analysis and data processing of polymers using proton NMR spectroscopy.

    PubMed

    de Brouwer, Hans; Stegeman, Gerrit

    2011-02-01

    To maximize utilization of expensive laboratory instruments and to make most effective use of skilled human resources, the entire chain of data processing, calculation, and reporting that is needed to transform raw NMR data into meaningful results was automated. The LEAN process improvement tools were used to identify non-value-added steps in the existing process. These steps were eliminated using an in-house developed software package, which allowed us to meet the key requirement of improving quality and reliability compared with the existing process while freeing up valuable human resources and increasing productivity. Reliability and quality were improved by the consistent data treatment performed by the software and the uniform administration of results. Automating a single NMR spectrometer led to a reduction in operator time of 35%, a doubling of the annual sample throughput from 1400 to 2800, and a reduction in turnaround time from 6 days to less than 2. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.

  12. AHCODA-DB: a data repository with web-based mining tools for the analysis of automated high-content mouse phenomics data.

    PubMed

    Koopmans, Bastijn; Smit, August B; Verhage, Matthijs; Loos, Maarten

    2017-04-04

    Systematic, standardized and in-depth phenotyping and data analyses of rodent behaviour empower gene-function studies, drug testing and therapy design. However, no data repositories are currently available for standardized quality control, data analysis and mining at the resolution of individual mice. Here, we present AHCODA-DB, a public data repository with standardized quality control and exclusion criteria aimed at enhancing robustness of data, equipped with web-based mining tools for the analysis of individually and group-wise collected mouse phenotypic data. AHCODA-DB allows monitoring in vivo effects of compounds collected from conventional behavioural tests and from automated home-cage experiments assessing spontaneous behaviour, anxiety and cognition without human interference. AHCODA-DB includes such data from mutant mice (transgenics, knock-out, knock-in), (recombinant) inbred strains, and compound effects in wildtype mice and disease models. AHCODA-DB provides real-time statistical analyses with single-mouse resolution and a versatile suite of data presentation tools. On March 9th, 2017 AHCODA-DB contained 650k data points on 2419 parameters from 1563 mice. AHCODA-DB provides users with tools to systematically explore mouse behavioural data, with both positive and negative outcomes, published and unpublished, across time and experiments with single-mouse resolution. The standardized (automated) experimental settings and the large current dataset (1563 mice) in AHCODA-DB provide a unique framework for the interpretation of behavioural data and drug effects. The use of common ontologies allows data export to other databases such as the Mouse Phenome Database. Unbiased presentation of positive and negative data obtained under these highly standardized screening conditions increases the cost efficiency of publicly funded mouse screening projects and helps to reach consensus conclusions on drug responses and mouse behavioural phenotypes. The website is publicly accessible through https://public.sylics.com and can be viewed in every recent version of all commonly used browsers.

  13. Trends and developments in industrial machine vision: 2013

    NASA Astrophysics Data System (ADS)

    Niel, Kurt; Heinzl, Christoph

    2014-03-01

    When following current advancements and implementations in the field of machine vision, there seem to be no borders for future developments: computing power constantly increases, new ideas are spreading, and previously challenging approaches are being introduced to the mass market. Within the past decades these advances have had dramatic impacts on our lives. Consumer electronics, e.g. computers or telephones, which once occupied large volumes, now fit in the palm of a hand. To note just a few examples: face recognition has been adopted by the consumer market, 3D capturing has become cheap, and software development has become easier thanks to a huge community and sophisticated development platforms. However, a gap remains between consumer and industrial applications: the former have to be entertaining, the latter have to be reliable. Recent studies (e.g. VDMA [1], Germany) show a moderately increasing market for machine vision in industry. When industry is asked about its needs, the main challenges for industrial machine vision are simple usage, reliability for the process, quick support, full automation, self/easy adjustment to changing process parameters, "forget it in the line". A further challenge is supporting quality control: nowadays the operator has to define accurately the features to be tested when checking samples. There is also an emerging trend to let automated machine vision applications determine essential parameters at a more abstract level (top-down). In this work we focus on three current and future topics for industrial machine vision: metrology supporting automation, quality control (inline/atline/offline), and visualization and analysis of datasets of steadily growing size. Finally, the general trend from pixel-oriented towards object-oriented evaluation is addressed. We do not directly address the field of robotics, which takes advantage of advances in machine vision; this is a fast-changing area that merits its own contribution.

  14. The automation of an inlet mass flow control system

    NASA Technical Reports Server (NTRS)

    Supplee, Frank; Tcheng, Ping; Weisenborn, Michael

    1989-01-01

    The automation of a closed-loop computer controlled system for the inlet mass flow system (IMFS) developed for a wind tunnel facility at Langley Research Center is presented. This new PC-based control system is intended to replace the manual control system presently in use in order to fully automate the plug positioning of the IMFS during wind tunnel testing. Provision is also made for communication between the PC and a host computer in order to allow total automation of the plug positioning and data acquisition during the complete sequence of predetermined plug locations. As extensive running time is programmed for the IMFS, this new automated system will save both manpower and tunnel running time.
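
    A proportional positioning loop of the kind such a PC-based controller might run is sketched below in Python; the gain, tolerance, and the two I/O hooks are hypothetical, not the IMFS interface.

        def move_plug_to(target_mm, read_position, command_rate,
                         kp=0.8, tol_mm=0.05):
            """Drive the plug toward target_mm with proportional control.

            read_position(): returns the current plug position in mm.
            command_rate(v): commands the actuator rate (0 stops the plug).
            """
            while True:
                error = target_mm - read_position()
                if abs(error) <= tol_mm:
                    command_rate(0.0)
                    return
                command_rate(kp * error)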

  15. Human factors phase III : effects of train control technology on operator performance

    DOT National Transportation Integrated Search

    2005-01-01

    This report describes a study evaluating the effects of train control technology on locomotive engineer performance. Several types : of train control systems were evaluated: partial automation (cruise control and programmed stop) and full automation ...

  16. [Standardization of operation monitoring and control of the clinical laboratory automation system].

    PubMed

    Tao, R

    2000-10-01

    Laboratory automation systems emerged in the 1980s and have been introduced into many clinical laboratories since the early 1990s. Meanwhile, it was found that differences in specimen tube dimensions, specimen identification formats, specimen carrier transportation equipment architecture, and the electromechanical interfaces between the analyzers and the automation systems were preventing the systems from being introduced more widely. To standardize these interfaces and reduce the cost of laboratory automation, NCCLS and JCCLS started establishing standards for laboratory automation in 1996 and 1997, respectively. Operation monitoring and control of the laboratory automation system have been included in their activities, resulting in the publication of an NCCLS proposed standard in 1999.

  17. Effect of Pulse Parameters on Weld Quality in Pulsed Gas Metal Arc Welding: A Review

    NASA Astrophysics Data System (ADS)

    Pal, Kamal; Pal, Surjya K.

    2011-08-01

    The weld quality comprises bead geometry and its microstructure, which influence the mechanical properties of the weld. The coarse-grained weld microstructure, wider heat-affected zone, and lower penetration together with higher reinforcement reduce the weld service life in continuous-mode gas metal arc welding (GMAW). Pulsed GMAW (P-GMAW) is an alternative method that provides a better way of overcoming these aforementioned problems. It uses a higher peak current to allow one molten droplet per pulse, and a lower background current to maintain arc stability. Current pulsing refines the grains in the weld fusion zone and increases depth of penetration due to arc oscillations. Optimum weld joint characteristics can be achieved by controlling the pulse parameters. The process is versatile and easily automated. This brief review illustrates the effect of pulse parameters on weld quality.
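
    The pulse parameters trade off through the mean current, I_m = (I_p*t_p + I_b*t_b) / (t_p + t_b), which links droplet-per-pulse operation to overall heat input; a quick check of that arithmetic in Python:

        def mean_current(i_peak, t_peak_ms, i_base, t_base_ms):
            """Mean current (A) of an idealized rectangular P-GMAW waveform."""
            return (i_peak * t_peak_ms + i_base * t_base_ms) / (t_peak_ms + t_base_ms)

        # e.g. 350 A peak for 2 ms over a 60 A background for 8 ms -> 118 A mean
        print(mean_current(350, 2, 60, 8))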

  18. The Quality and Validation of Structures from Structural Genomics

    PubMed Central

    Domagalski, Marcin J.; Zheng, Heping; Zimmerman, Matthew D.; Dauter, Zbigniew; Wlodawer, Alexander; Minor, Wladek

    2014-01-01

    Quality control of three-dimensional structures of macromolecules is a critical step to ensure the integrity of structural biology data, especially those produced by structural genomics centers. Whereas the Protein Data Bank (PDB) has proven to be a remarkable success overall, the inconsistent quality of structures reveals a lack of universal standards for structure/deposit validation. Here, we review the state-of-the-art methods used in macromolecular structure validation, focusing on validation of structures determined by X-ray crystallography. We describe some general protocols used in the rebuilding and re-refinement of problematic structural models. We also briefly discuss some frontier areas of structure validation, including refinement of protein–ligand complexes, automation of structure redetermination, and the use of NMR structures and computational models to solve X-ray crystal structures by molecular replacement. PMID:24203341

  19. Efficient high-throughput biological process characterization: Definitive screening design with the ambr250 bioreactor system.

    PubMed

    Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam

    2015-01-01

    The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
