Sample records for automated process monitoring

  1. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  2. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  3. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  4. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  5. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  6. 9 CFR 318.307 - Record review and maintenance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...

  7. 9 CFR 318.307 - Record review and maintenance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...

  8. 9 CFR 318.307 - Record review and maintenance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...

  9. 9 CFR 318.307 - Record review and maintenance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...

  10. 9 CFR 318.307 - Record review and maintenance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...

  11. Automated Monitoring with a BSP Fault-Detection Test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L.; Herzog, James P.

    2003-01-01

    The figure schematically illustrates a method and procedure for automated monitoring of an asset, as well as a hardware-and-software system that implements the method and procedure. As used here, asset could signify an industrial process, power plant, medical instrument, aircraft, or any of a variety of other systems that generate electronic signals (e.g., sensor outputs). In automated monitoring, the signals are digitized and then processed in order to detect faults and otherwise monitor the operational status and integrity of the monitored asset. The major distinguishing feature of the present method is that the fault-detection function is implemented by use of a Bayesian sequential probability (BSP) technique. This technique is superior to other techniques for automated monitoring because it affords sensitivity not only to disturbances in the mean values but also to very subtle changes in the statistical characteristics (variance, skewness, and bias) of the monitored signals.
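
The record above attributes its sensitivity to a Bayesian sequential probability technique. As a rough illustration of the sequential idea only, here is a minimal Wald-style sequential probability ratio test (SPRT) for a mean shift in a Gaussian signal; the means, noise level, error rates, and sample values are invented for this sketch and are not details of the NASA system.

```python
import math

def sprt_mean_shift(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Wald-style sequential test: H0 (mean mu0, normal) vs. H1 (mean mu1, fault).

    Returns ("H0" | "H1" | "undecided", number of samples consumed).
    """
    upper = math.log((1 - beta) / alpha)  # cross above: accept H1 (fault)
    lower = math.log(beta / (1 - alpha))  # cross below: accept H0 (normal)
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood-ratio increment for one Gaussian observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

# A signal drifting toward 1.0 triggers the fault branch after a few samples.
decision, n_used = sprt_mean_shift([0.9, 1.1, 1.2, 0.8, 1.0, 1.1])
```

Because the test accumulates evidence sample by sample, it reacts to sustained small shifts that a fixed single-sample threshold would miss, which is the property the abstract highlights.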

  12. 42 CFR 137.88 - May the Secretary reduce the amount of funds required under Title V to pay for Federal functions...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... retirement benefits, automated data processing, technical assistance, and monitoring of activities under the..., technical assistance, and monitoring of activities under the Act? No, the Secretary may not reduce the... employee retirement benefits, automated data processing, technical assistance, and monitoring of activities...

  13. 42 CFR 137.88 - May the Secretary reduce the amount of funds required under Title V to pay for Federal functions...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... retirement benefits, automated data processing, technical assistance, and monitoring of activities under the..., technical assistance, and monitoring of activities under the Act? No, the Secretary may not reduce the... employee retirement benefits, automated data processing, technical assistance, and monitoring of activities...

  14. 42 CFR 137.88 - May the Secretary reduce the amount of funds required under Title V to pay for Federal functions...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... retirement benefits, automated data processing, technical assistance, and monitoring of activities under the..., technical assistance, and monitoring of activities under the Act? No, the Secretary may not reduce the... employee retirement benefits, automated data processing, technical assistance, and monitoring of activities...

  15. 42 CFR 137.88 - May the Secretary reduce the amount of funds required under Title V to pay for Federal functions...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... retirement benefits, automated data processing, technical assistance, and monitoring of activities under the..., technical assistance, and monitoring of activities under the Act? No, the Secretary may not reduce the... employee retirement benefits, automated data processing, technical assistance, and monitoring of activities...

  16. DOE Program on Seismic Characterization for Regions of Interest to CTBT Monitoring,

    DTIC Science & Technology

    1995-08-14

    processing of the monitoring network data). While developing and testing the corrections and other parameters needed by the automated processing systems...the secondary network. Parameters tabulated in the knowledge base must be appropriate for routine automated processing of network data, and must also...operation of the PNDC, as well as to results of investigations of "special events" (i.e., those events that fail to locate or discriminate during automated

  17. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
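
The ASNS abstract describes continuous monitoring built on NAWK pattern scanning of communications activity. A minimal sketch of that pattern-scan-and-collect loop, written in Python rather than NAWK, with hypothetical log lines and alert patterns (the real ASNS rules are not given in the source):

```python
import re

# Hypothetical alert patterns; the actual ASNS rules are not published here.
ALERT_PATTERNS = [
    re.compile(r"FAILED LOGIN from (\S+)"),
    re.compile(r"TRUNK (\d+) DOWN"),
]

def scan_lines(lines):
    """Scan a stream of log lines and collect pattern hits, mimicking the
    pattern-scan-and-act loop at the heart of the monitor."""
    alerts = []
    for lineno, line in enumerate(lines, start=1):
        for pattern in ALERT_PATTERNS:
            match = pattern.search(line)
            if match:
                alerts.append((lineno, match.group(0)))
    return alerts

log = [
    "12:00:01 call setup ok",
    "12:00:05 FAILED LOGIN from 10.0.0.7",
    "12:00:09 TRUNK 3 DOWN",
]
alerts = scan_lines(log)
```

In a 24/7 deployment the same loop would read from a live stream and hand each hit to a notification step instead of returning a list.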

  18. Technology Transfer Opportunities: Automated Ground-Water Monitoring

    USGS Publications Warehouse

    Smith, Kirk P.; Granato, Gregory E.

    1997-01-01

    Introduction A new automated ground-water monitoring system developed by the U.S. Geological Survey (USGS) measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automation of water-quality monitoring systems in the field, in laboratories, and in industry has increased data density and utility while reducing operating costs. Uses for an automated ground-water monitoring system include (but are not limited to) monitoring ground-water quality for research; monitoring known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored; and serving as an early warning system monitoring ground-water quality near public water-supply wells.

  19. Automation trust and attention allocation in multitasking workspace.

    PubMed

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators with high workload can distrust and then poorly monitor automation, which has been generally inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitor automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. Results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems.

  20. Means of storage and automated monitoring of versions of text technical documentation

    NASA Astrophysics Data System (ADS)

    Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.

    2018-03-01

    The paper considers automating, with specialized software, the preparation, storage, and version monitoring of text design and program documentation. Automated preparation of documentation is based on processing the engineering data contained in the specifications and technical documentation. Data handling assumes strictly structured electronic documents, prepared in widespread formats from templates based on industry standards, from which the program or design text document is generated automatically. The subsequent life cycle of the document, and of the engineering data it contains, is then controlled, with archival data storage at each stage. Performance studies of several widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrumentation equipment are described.

  1. Pilots' monitoring strategies and performance on automated flight decks: an empirical study combining behavioral and eye-tracking data.

    PubMed

    Sarter, Nadine B; Mumaw, Randall J; Wickens, Christopher D

    2007-06-01

    The objective of the study was to examine pilots' automation monitoring strategies and performance on highly automated commercial flight decks. A considerable body of research and operational experience has documented breakdowns in pilot-automation coordination on modern flight decks. These breakdowns are often considered symptoms of monitoring failures even though, to date, only limited and mostly anecdotal data exist concerning pilots' monitoring strategies and performance. Twenty experienced B-747-400 airline pilots flew a 1-hr scenario involving challenging automation-related events on a full-mission simulator. Behavioral, mental model, and eye-tracking data were collected. The findings from this study confirm that pilots monitor basic flight parameters to a much greater extent than visual indications of the automation configuration. More specifically, they frequently fail to verify manual mode selections or notice automatic mode changes. In other cases, they do not process mode annunciations in sufficient depth to understand their implications for aircraft behavior. Low system observability and gaps in pilots' understanding of complex automation modes were shown to contribute to these problems. Our findings describe and explain shortcomings in pilots' automation monitoring strategies and performance based on converging behavioral, eye-tracking, and mental model data. They confirm that monitoring failures are one major contributor to breakdowns in pilot-automation interaction. The findings from this research can inform the design of improved training programs and automation interfaces that support more effective system monitoring.

  2. Technology Transfer Opportunities: Automated Ground-Water Monitoring, A Proven Technology

    USGS Publications Warehouse

    Smith, Kirk P.; Granato, Gregory E.

    1998-01-01

    Introduction The U.S. Geological Survey (USGS) has developed and tested an automated ground-water monitoring system that measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automated ground-water monitoring systems can be used to monitor known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored, to serve as early warning systems monitoring ground-water quality near public water-supply wells, and for ground-water quality research.

  3. Classification Trees for Quality Control Processes in Automated Constructed Response Scoring.

    ERIC Educational Resources Information Center

    Williamson, David M.; Hone, Anne S.; Miller, Susan; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, the issue of monitoring the scoring process becomes a primary concern, particularly when the goal is to have automated scoring operate completely unassisted by humans. Using a vignette from the Architectural Registration Examination and data for 326 cases with both human…

  4. In-situ Frequency Dependent Dielectric Sensing of Cure

    NASA Technical Reports Server (NTRS)

    Kranbuehl, David E.

    1996-01-01

    With the expanding use of polymeric materials as composite matrices, adhesives, coatings, and films, the need to develop low-cost, automated fabrication processes that produce consistently high-quality parts is critical. Essential to the development of reliable, automated, intelligent processing is the ability to continuously monitor the changing state of the polymeric resin in-situ in the fabrication tool. This final report discusses work on developing dielectric sensing to monitor polymeric material cure, and provides a fundamental understanding of the underlying science for the use of frequency-dependent dielectric sensors to monitor the cure process.

  5. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musyurka, A. V., E-mail: musyurkaav@burges.rushydro.ru

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) of the hydraulic engineering installations at the Bureya HPP, which assures a reliable process for monitoring those installations. The project's implementation represents a timely solution to the problems addressed by the hydraulic engineering installation diagnostics section.

  6. An Aspect-Oriented Framework for Business Process Improvement

    NASA Astrophysics Data System (ADS)

    Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael

    Recently, many organizations invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most of the existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes also in the other views of the business process. A health care case study presented as a proof of concept suggests that this novel approach is feasible.

  7. Robotic platform for parallelized cultivation and monitoring of microbial growth parameters in microwell plates.

    PubMed

    Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter

    2014-12-01

    The enormous space of possible bioprocess variations makes it challenging for process development to fix a commercial process within cost and time constraints. Although some cultivation systems and some devices for unit operations combine the latest technology on miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to face this challenge by an interdisciplinary approach to significantly shorten development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and monitoring the optical density (OD), pH, concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and concentrations of acetate and glucose were combined into one method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we managed to increase the information content of cultivations in 96-microwell plates, thus turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach.

  8. Unattended reaction monitoring using an automated microfluidic sampler and on-line liquid chromatography.

    PubMed

    Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve

    2018-04-03

    In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data.

  9. Kohonen Self-Organizing Maps in Validity Maintenance for Automated Scoring of Constructed Response.

    ERIC Educational Resources Information Center

    Williamson, David M.; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, monitoring the scoring process becomes a primary concern, particularly if automated scoring is intended to operate completely unassisted by humans. Using actual candidate selections from the Architectural Registration Examination (n=326), this study uses Kohonen…

  10. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    PubMed Central

    Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248
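
The realignment strategy above recognizes fabricated alignment marks in each captured image and computes the shift back to a preset position. A toy sketch of that idea using brute-force template matching (sum of squared differences) on a small pixel grid; the image, mark, and positions are invented, and the paper's actual image-processing algorithms are not reproduced here.

```python
def locate_mark(image, mark):
    """Find the (row, col) offset of `mark` in `image` by minimizing the
    sum of squared differences (SSD) over all placements."""
    H, W = len(image), len(image[0])
    h, w = len(mark), len(mark[0])
    best_ssd, best_pos = None, None
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = sum(
                (image[r + i][c + j] - mark[i][j]) ** 2
                for i in range(h) for j in range(w)
            )
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos

def realignment_shift(image, mark, preset_pos):
    """Stage shift that moves the detected mark back to its preset position."""
    r, c = locate_mark(image, mark)
    return (preset_pos[0] - r, preset_pos[1] - c)

# Toy 6x6 grayscale "image" with a bright 2x2 alignment mark at (3, 2).
image = [[0] * 6 for _ in range(6)]
for i in range(2):
    for j in range(2):
        image[3 + i][2 + j] = 9
mark = [[9, 9], [9, 9]]
shift = realignment_shift(image, mark, preset_pos=(1, 1))
```

Applying the returned shift to the motorized stage before each recording keeps every chamber at the same pixel coordinates across time points.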

  11. Improving Information Exchange and Coordination amongst Homeland Security Organizations (Briefing Charts)

    DTIC Science & Technology

    2005-06-01

    • need for user-defined dashboard
    • automated monitoring of web data sources
    • task driven data aggregation and display
    Working toward automated processing of task, resource, and intelligence updates

  12. Non-Traditional Displays for Mission Monitoring

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Schutte, Paul C.

    1999-01-01

    Advances in automation capability and reliability have changed the role of humans from operating and controlling processes to simply monitoring them for anomalies. However, humans are traditionally bad monitors of highly reliable systems over time. Thus, the human is assigned a task for which he is ill equipped. We believe that this has led to the dominance of human error in process control activities such as operating transportation systems (aircraft and trains), monitoring patient health in the medical industry, and controlling plant operations. Research has shown, though, that an automated monitor can assist humans in recognizing and dealing with failures. One possible solution to this predicament is to use a polar-star display that will show deviations from normal states based on parameters that are most indicative of mission health.
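
The polar-star idea above plots each parameter's deviation from its normal state. A minimal sketch, assuming each monitored parameter has a nominal value and a tolerance, of how readings might be normalized for such a display; the parameter names and numbers are hypothetical, not from the NASA work.

```python
def star_deviations(readings, nominal, tolerance):
    """Normalized deviation of each parameter from its nominal value:
    0.0 is exactly nominal, 1.0 is at the tolerance limit."""
    return {
        name: abs(readings[name] - nominal[name]) / tolerance[name]
        for name in readings
    }

# Hypothetical mission parameters, nominal values, and tolerances.
readings = {"pressure": 101.0, "temp": 25.5, "flow": 4.0}
nominal = {"pressure": 100.0, "temp": 25.0, "flow": 4.0}
tolerance = {"pressure": 5.0, "temp": 2.0, "flow": 1.0}
deviations = star_deviations(readings, nominal, tolerance)
```

Plotting these values as radii on a polar chart yields a regular polygon when all parameters are nominal, so any distortion of the shape is immediately visible to the monitor.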

  13. Performance Monitoring Of A Computer Numerically Controlled (CNC) Lathe Using Pattern Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Daneshmend, L. K.; Pak, H. A.

    1984-02-01

    On-line monitoring of the cutting process in a CNC lathe is desirable to ensure unattended fault-free operation in an automated environment. The state of the cutting tool is one of the most important parameters which characterises the cutting process. Direct monitoring of the cutting tool or workpiece is not feasible during machining. However, several variables related to the state of the tool can be measured on-line. A novel monitoring technique is presented which uses cutting torque as the variable for on-line monitoring. A classifier is designed on the basis of the empirical relationship between cutting torque and flank wear. The empirical model required by the on-line classifier is established during an automated training cycle using machine vision for off-line direct inspection of the tool.
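
The two-stage scheme above (fit an empirical torque-wear model during a training cycle, then classify tool state on-line from torque alone) can be sketched as follows; the linear model form, training data, and wear limit are invented for illustration and are not taken from the paper.

```python
def fit_line(xs, ys):
    """Least-squares fit ys = a * xs + b (the empirical torque-wear model)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    a = num / den
    return a, my - a * mx

def classify_tool(torque, model, wear_limit=0.3):
    """Invert the torque-wear model to estimate flank wear (mm) from the
    measured cutting torque, then flag the tool against a wear limit."""
    a, b = model
    wear = (torque - b) / a
    return "worn" if wear >= wear_limit else "ok"

# Hypothetical training-cycle data: inspected flank wear (mm) vs. torque (N·m).
wears = [0.10, 0.18, 0.26, 0.34]
torques = [10.0, 12.0, 14.0, 16.0]
model = fit_line(wears, torques)  # torque as a linear function of wear
state = classify_tool(15.5, model)
```

The training cycle corresponds to the machine-vision inspection step: wear is measured directly off-line, paired with the torques observed while cutting, and the fitted model then lets torque alone stand in for wear during unattended operation.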

  14. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocols and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also help optimize operational time and the use of consumables.
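
The three conductivity conditions described in this record suggest a simple trigger-driven step sequence, sketched below; the thresholds and readings are invented placeholders, since the actual values depend on the probes and chemistry and are not given in the source.

```python
# Hypothetical conductivity thresholds (arbitrary units).
LOW, HIGH = 50.0, 500.0

def next_step(step, conductivity):
    """Advance the desalting sequence when the probe reading confirms the
    condition expected for the current step; otherwise keep waiting."""
    if step == "neutralize" and conductivity < LOW:
        return "acidify"   # flushed with de-ionized water: low conductivity
    if step == "acidify" and conductivity > HIGH:
        return "basify"    # strong acid present: high conductivity
    if step == "basify" and conductivity > HIGH:
        return "done"      # high-pH basic condition: high conductivity
    return step

step = "neutralize"
for reading in [800.0, 30.0, 650.0, 700.0]:
    step = next_step(step, reading)
```

Driving the protocol from the probe readings, rather than from fixed wash times, is what lets the system both run unattended and stop each flush as soon as its condition is met, minimizing consumable use.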

  15. Intelligent monitoring and diagnosis systems for the Space Station Freedom ECLSS

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.

    1991-01-01

    Specific activities in NASA's environmental control and life support system (ECLSS) advanced automation project that is designed to minimize the crew and ground manpower needed for operations are discussed. Various analyses and the development of intelligent software for the initial and evolutionary Space Station Freedom (SSF) ECLSS are described. The following are also discussed: (1) intelligent monitoring and diagnostics applications under development for the ECLSS domain; (2) integration into the MSFC ECLSS hardware testbed; and (3) an evolutionary path from the baseline ECLSS automation to the more advanced ECLSS automation processes.

  16. Advanced automation for in-space vehicle processing

    NASA Technical Reports Server (NTRS)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks, such as manipulation, assembly, and actuation, and cognitive tasks, such as visual inspection, monitoring and diagnosis, and task planning, are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive, or standard, tasks have been developed both for manual or crew processing and for automated machine processing.

  17. [The use of automated processing of information obtained during space flights for the monitoring and evaluation of airborne pollution].

    PubMed

    Bagmanov, B Kh; Mikhaĭlova, A Iu; Pavlov, S V

    1997-01-01

    The article describes experience with automated processing of information obtained during spaceflights for the analysis of urban air pollution. The authors present a method for processing such information and show how to identify foci of industrial release and the areas of their spread within and beyond cities.

  18. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation for making valuable, intracellular target products accessible to further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, such as measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate the disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and investigated this unit operation in more detail by a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool to monitor and evaluate cell disruption processes: it can be implemented at-line in both upstream (USP) and downstream processing (DSP), gives results within minutes of sampling, and needs no manual intervention.
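
    As a toy illustration of the kind of automated evaluation described above, the sketch below computes cumulative disruption efficiency from the HPLC peak area attributed to intact cells after each homogenizer pass. The metric and the function are illustrative assumptions, not the calibrated method of the paper.

```python
def disruption_efficiencies(peak_areas):
    """Cumulative disruption efficiency after each homogenizer pass,
    inferred from the HPLC peak area attributed to intact cells.
    The first entry is the undisrupted baseline.  Illustrative metric
    only; the published tool derives cell integrity from a calibrated
    HPLC method, not from this exact formula."""
    baseline = peak_areas[0]
    if baseline <= 0:
        raise ValueError("baseline peak area must be positive")
    # efficiency = fraction of the intact-cell signal lost so far
    return [1.0 - area / baseline for area in peak_areas[1:]]
```

For example, peak areas of 100, 40 and 10 across two passes would correspond to 60% and 90% cumulative disruption.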

  19. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
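
    A minimal sketch of a standard-deviation homogeneity criterion of the kind described above, assuming spectra arrive as rows of a NumPy array; the window size and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

def blend_endpoint(spectra, window=5, threshold=1e-3):
    """Return the index of the first acquisition at which the blend is
    deemed homogeneous: the standard deviation of the last `window`
    NIR spectra, averaged across wavelength channels, has fallen below
    `threshold`.  Returns None if the criterion is never met.

    spectra : 2-D array, one row per acquisition, one column per
              wavelength channel.
    """
    spectra = np.asarray(spectra, dtype=float)
    for i in range(window, len(spectra) + 1):
        block = spectra[i - window:i]
        # spectral variability within the moving block
        if block.std(axis=0).mean() < threshold:
            return i - 1  # index of the acquisition closing the block
    return None
```

In real-time use the same check would run after each new spectrum rather than over a stored array, with the blender stopped once the criterion is met.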

  20. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    USGS Publications Warehouse

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data and explored the design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and uses information about both the occurrence and the abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models that ignore the abundance of detections; the same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved by having a human observer post-validate a subset of the detections produced by the classification algorithm. We used simulations to investigate the relationship between accuracy and effort spent on post-validation and found that very accurate occupancy estimates can be obtained with as little as 1% of the data validated. Automated monitoring of wildlife provides both opportunity and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
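
    The false-positive bias the authors address can be illustrated with a simple simulation. The moment-based correction below assumes the detection probabilities are known and is far simpler than the paper's hierarchical model; all parameter values are illustrative.

```python
import random

def simulate_detections(n_sites, k_surveys, psi, p11, p10, seed=1):
    """Simulate automated-detector data: each occupied site (prob psi)
    yields a detection with probability p11 per survey; unoccupied
    sites yield false positives with probability p10."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_sites):
        p = p11 if rng.random() < psi else p10
        data.append([rng.random() < p for _ in range(k_surveys)])
    return data

def occupancy_estimates(data, k_surveys, p11, p10):
    """Naive occupancy (fraction of sites with >= 1 detection) versus a
    moment-based correction that assumes p11 and p10 are known."""
    obs = sum(any(site) for site in data) / len(data)
    a = 1 - (1 - p11) ** k_surveys  # P(>= 1 detection | occupied)
    b = 1 - (1 - p10) ** k_surveys  # P(>= 1 detection | unoccupied)
    return obs, (obs - b) / (a - b)
```

With true occupancy 0.4, per-survey detection probability 0.5 and a 0.1 false-positive probability over 5 surveys, the naive estimate inflates toward 0.6 while the correction recovers roughly 0.4.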

  1. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  2. Automated terrestrial laser scanning with near-real-time change detection - monitoring of the Séchilienne landslide

    NASA Astrophysics Data System (ADS)

    Kromer, Ryan A.; Abellán, Antonio; Hutchinson, D. Jean; Lato, Matt; Chanut, Marie-Aurelie; Dubois, Laurent; Jaboyedoff, Michel

    2017-05-01

    We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap in high-temporal-resolution TLS monitoring of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool for monitoring landslide and rockfall processes despite points missed due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.

  3. Comparison of modelling accuracy with and without exploiting automated optical monitoring information in predicting the treated wastewater quality.

    PubMed

    Tomperi, Jani; Leiviskä, Kauko

    2018-06-01

    Traditionally, modelling of the activated sludge process has been based solely on process measurements, but as interest in optically monitoring wastewater samples to characterize floc morphology has grown, image-analysis results have in recent years been used more frequently to predict the characteristics of wastewater. This study shows that neither the traditional process measurements nor the automated optical monitoring variables by themselves yield the best predictive models for treated wastewater quality in a full-scale wastewater treatment plant; the optimal models, which capture the level of and changes in treated wastewater quality, are achieved by using these variables together. With this early warning, process operation can be optimized to avoid environmental damage and economic losses. The study also shows that specific optical monitoring variables are important for modelling a given quality parameter, regardless of the other input variables available.

  4. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    NASA Technical Reports Server (NTRS)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.

  5. Mass spectrometry-based monitoring of millisecond protein–ligand binding dynamics using an automated microfluidic platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cong, Yongzheng; Katipamula, Shanta; Trader, Cameron D.

    2016-01-01

    Characterizing protein-ligand binding dynamics is crucial for understanding protein function and developing new therapeutic agents. We have developed a novel microfluidic platform that features rapid mixing of protein and ligand solutions, variable incubation times, and on-chip electrospray ionization to perform label-free, solution-based monitoring of protein-ligand binding dynamics. This platform offers many advantages including automated processing, rapid mixing, and low sample consumption.

  6. Silicon Carbide Temperature Monitor Processing Improvements. Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unruh, Troy Casey; Daw, Joshua Earl; Ahamad Al Rashdan

    2016-01-29

    Silicon carbide (SiC) temperature monitors are used as temperature sensors in Advanced Test Reactor (ATR) irradiations at the Idaho National Laboratory (INL). Although thermocouples are typically used to provide real-time temperature indication in instrumented lead tests, other indicators, such as melt wires, are also often included in such tests as an independent technique of detecting peak temperatures incurred during irradiation. In addition, less expensive static capsule tests, which have no leads attached for real-time data transmission, often rely on melt wires as a post-irradiation technique for peak temperature indication. Melt wires are limited in that they can only detect whether a single temperature is or is not exceeded. SiC monitors are advantageous because a single monitor can be used to detect a range of temperatures that occurred during irradiation. As part of the process initiated to make SiC temperature monitors available at the ATR, post-irradiation evaluations of these monitors have previously been completed at the High Temperature Test Laboratory (HTTL). INL selected the resistance measurement approach for determining irradiation temperature from SiC temperature monitors because it is considered to be the most accurate measurement. The current process involves the repeated annealing of the SiC monitors at incrementally increasing temperatures, with resistivity measurements made between annealing steps. The process is time-consuming and requires the nearly constant attention of a trained staff member. In addition to the expensive and lengthy post analysis required, the current process adds many potential sources of error in the measurement, as the sensor must be repeatedly moved from furnace to test fixture. This time-consuming post-irradiation analysis is a significant portion of the total cost of using these otherwise inexpensive sensors.
An additional consideration of this research is that, if the SiC post-processing can be automated, it could be performed in an MFC hot cell, further reducing the time and expense of lengthy decontaminations prior to processing. Sections of this report provide a general description of the resistivity techniques currently used to infer peak irradiation temperature from silicon carbide temperature monitors along with some representative data, the proposed concepts to improve the process of analyzing irradiated SiC temperature monitors, the completed efforts to prove the proposed concepts, and future activities. The efforts detailed here succeeded in designing and developing a real-time automated SiC resistivity measurement system and in performing two initial test runs. Activities carried out include the assembly and integration of the system hardware; the design and development of a preliminary monitor fixture; the design of a technique to automate the data analysis and processing; the development of the communication, coordination, and user software; and the execution and troubleshooting of test run experiments using the box furnace. Although the automation system performed as required, the designed fixture did not succeed in establishing the needed electrical contacts with the SiC monitor.
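
    The repeated anneal-and-measure procedure lends itself to the kind of automation loop the report pursues. The sketch below is a hypothetical control loop: the `furnace` and `meter` objects stand in for real instrument drivers, and the temperature schedule and recovery criterion are assumptions, not INL's.

```python
import time

def recover_irradiation_temperature(furnace, meter,
                                    t_start=200, t_stop=800, t_step=25,
                                    dwell_s=1800):
    """Anneal a SiC monitor at incrementally increasing temperatures,
    measuring resistivity after each step, and infer the peak
    irradiation temperature as the anneal temperature producing the
    largest fractional resistivity drop (an assumed criterion)."""
    readings = []
    for temp_c in range(t_start, t_stop + 1, t_step):
        furnace.ramp_to(temp_c)   # anneal step
        time.sleep(dwell_s)       # hold at temperature
        furnace.ramp_to(25)       # resistivity measured near ambient
        readings.append((temp_c, meter.resistivity_ohm_cm()))
    # largest fractional drop between consecutive anneal steps
    drops = [(r0 - r1) / r0 for (_, r0), (_, r1)
             in zip(readings, readings[1:])]
    return readings[drops.index(max(drops)) + 1][0]
```

Running unattended, such a loop removes the furnace-to-fixture handling the report identifies as a source of error, provided the fixture maintains electrical contact throughout.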

  7. Automated gas chromatography

    DOEpatents

    Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.

    1999-07-13

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.

  8. Automated gas chromatography

    DOEpatents

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  9. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives.

    PubMed

    Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini

    2018-08-01

    Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery and to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation, to aid and motivate development.

  10. A summary of the history of the development of automated remote sensing for agricultural applications

    NASA Technical Reports Server (NTRS)

    Macdonald, R. B.

    1983-01-01

    The research conducted in the United States for the past 20 years with the objective of developing automated satellite remote sensing for monitoring the earth's major food crops is reviewed. The highlights of this research include a National Academy of Science study on the applicability of remote sensing monitoring given impetus by the introduction in the mid-1960's of the first airborne multispectral scanner (MSS); design simulations for the first earth resource satellite in 1969; and the use of the airborne MSS in the Corn Blight Watch, the first large application of remote sensing in agriculture, in 1970. Other programs discussed include the CITAR research project in 1972 which established the feasibility of automating digital classification to process high volumes of Landsat MSS data; the Large Area Crop Inventory Experiment (LACIE) in 1974-78, which demonstrated automated processing of Landsat MSS data in estimating wheat crop production on a global basis; and AgRISTARS, a program designed to address the technical issues defined by LACIE.

  11. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  12. The Deep Space Network information system in the year 2000

    NASA Technical Reports Server (NTRS)

    Markley, R. W.; Beswick, C. A.

    1992-01-01

    The Deep Space Network (DSN), the largest, most sensitive scientific communications and radio navigation network in the world, is considered. Focus is made on the telemetry processing, monitor and control, and ground data transport architectures of the DSN ground information system envisioned for the year 2000. The telemetry architecture will be unified from the front-end area to the end user. It will provide highly automated monitor and control of the DSN, automated configuration of support activities, and a vastly improved human interface. Automated decision support systems will be in place for DSN resource management, performance analysis, fault diagnosis, and contingency management.

  13. Novel online monitoring and alert system for anaerobic digestion reactors.

    PubMed

    Dong, Fang; Zhao, Quan-Bao; Li, Wen-Wei; Sheng, Guo-Ping; Zhao, Jin-Bao; Tang, Yong; Yu, Han-Qing; Kubota, Kengo; Li, Yu-You; Harada, Hideki

    2011-10-15

    Effective monitoring and diagnosis of the anaerobic digestion process remains a great challenge, and its absence limits the stable operation of anaerobic digestion reactors. In this work, an online monitoring and alert system for upflow anaerobic sludge blanket (UASB) reactors is developed on the basis of a set of novel evaluating indexes. The two indexes, i.e., stability index S and auxiliary index a, which incorporate both gas- and liquid-phase parameters of the UASB, enable a quantitative and comprehensive evaluation of reactor status. A series of shock tests is conducted to evaluate the response of the monitoring and alert system to organic overloading, hydraulic, temperature, and toxicant shocks. The results show that the system enables accurate and rapid monitoring and diagnosis of reactor status and offers reliable early warnings of potential risks. As the core of the system, the evaluating indexes are demonstrated to be of high accuracy and sensitivity in process evaluation and to adapt well to artificial intelligence and automated control apparatus. This online monitoring and alert system represents a valuable step toward the automated monitoring and control of the anaerobic digestion process and holds high promise for application.

  14. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures

    NASA Astrophysics Data System (ADS)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin

    2014-04-01

    Increasing economic, political and ecological pressure leads to a steadily rising share of fibre-reinforced polymers in modern industrial batch production. Component weights below the level achievable with classic construction materials, which improve the energy and cost balance over the product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down the substitution of composite materials. High-resolution fibre-optic sensors (FOS), owing to their small diameter, high measuring-point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for the online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor manufacturing processes and part tests as well as the component structure during the product life cycle, which allows quality control during production and the optimization of individual manufacturing processes [1;2]. Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes that appear in composite materials. This leads to lower reject rates and to products of higher value and longer life cycle, whereby costs, material and energy are saved. This work presents an automation approach for FOS integration in the braiding process. For this purpose, a braiding wheel was supplemented with an appliance for automatic sensor application, which was used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks.
Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic-fibre sensors, Rayleigh, Luna Technologies

  15. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  16. Robust, Multi-layered Plan Execution and Revision for Operation of a Network of Communication Antennas

    NASA Technical Reports Server (NTRS)

    Chien, S. A.; Hill, R. W., Jr.; Govindjee, A.; Wang, X.; Estlin, T.; Griesel, M. A.; Lam, R.; Fayyad, K. V.

    1996-01-01

    This paper describes a hierarchical scheduling, planning, control, and execution monitoring architecture for automating operations of a worldwide network of communications antennas. The purpose of this paper is to describe an architecture for automating the process of capturing spacecraft data.

  17. Automated monitoring of medical protocols: a secure and distributed architecture.

    PubMed

    Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F

    2003-03-01

    The control of the right application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, and so specialized domain agents are independent of negotiation processes and autonomous system agents perform monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.

  18. Automated Guidance from Physiological Sensing to Reduce Thermal-Work Strain Levels on a Novel Task

    USDA-ARS?s Scientific Manuscript database

    This experiment demonstrated that automated pace guidance generated from real-time physiological monitoring allowed least stressful completion of a timed (60 minute limit) 5 mile treadmill exercise. An optimal pacing policy was estimated from a Markov decision process that balanced the goals of the...

  19. 76 FR 46269 - Notice of Petitions by Firms for Determination of Eligibility To Apply for Trade Adjustment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-02

    ... novel, simple to use automation and process monitoring products for industrial customers. Methods.... Nursery Supplies, Inc........ 1415 Orchard Drive, 26-Jul-11 The firm manufactures plastic containers... containers. Technautic International, 141 Robert E. Lee 22-Jul-11 The firm manufactures automated dissolved...

  20. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  1. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    NASA Astrophysics Data System (ADS)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-04-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier to industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defect avoidance is fundamental. Because of this, there is a need to develop novel in situ monitoring tools able to keep the stability of the process under control on a layer-by-layer basis and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.
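
    As a minimal example of an automated layerwise detection rule of the kind the review calls for, the sketch below applies a Shewhart-style limit to a scalar process signature per layer; the baseline size and 3-sigma limit are illustrative assumptions, not prescriptions from the review.

```python
import statistics

def layer_alarms(signature, baseline_layers=20, k=3.0):
    """Indices of layers whose scalar in situ signature (e.g. mean
    melt-pool intensity) departs from the baseline mean by more than
    k baseline standard deviations -- a simple Shewhart-style rule
    applied on a layer-by-layer basis."""
    base = signature[:baseline_layers]
    mu = statistics.fmean(base)
    sigma = statistics.stdev(base)
    return [i for i in range(baseline_layers, len(signature))
            if abs(signature[i] - mu) > k * sigma]
```

In practice the signature would be extracted from in situ sensor data per layer, and a flagged layer would trigger inspection or a process-control action rather than just an alarm index.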

  2. Analysis And Control System For Automated Welding

    NASA Technical Reports Server (NTRS)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  3. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    PubMed

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

    Automation systems are well established in industries and life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and medical and pharmaceutical fields, as well as research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be checked regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  4. Automation-induced monitoring inefficiency: role of display location.

    PubMed

    Singh, I L; Molloy, R; Parasuraman, R

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six subjects. Each subject completed four 30 min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0% reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of a resource or speed-accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  5. Automation-induced monitoring inefficiency: role of display location

    NASA Technical Reports Server (NTRS)

    Singh, I. L.; Molloy, R.; Parasuraman, R.

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six subjects. Each subject completed four 30 min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0% reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of a resource or speed-accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  6. AUTOMATED MONITORING OF WASTEWATER TREATMENT EFFICIENCY - PHASE I

    EPA Science Inventory

    Wastewater treatment minimizes the transmission of pathogens and is required by the EPA, which has established treatment and monitoring requirements. The efficiency of treatment processes is determined by measuring the inactivation of indicator organisms (e.g., fecal coliform...

  7. Effects of Selected Task Performance Criteria at Initiating Adaptive Task Reallocations

    NASA Technical Reports Server (NTRS)

    Montgomery, Demaris A.

    2001-01-01

    In the current report, various performance assessment methods for initiating mode transfers between manual control and automation for adaptive task reallocation were tested. Participants monitored two secondary tasks for critical events while actively controlling a process in a fictional system. One of the secondary monitoring tasks could be automated whenever an operator's performance fell below acceptable levels. Automation of the secondary task, and transfer of the secondary task back to manual control, were either human- or machine-initiated. Human-initiated transfers were based on the operator's assessment of the current task demands, while machine-initiated transfers were based on the operator's performance. Different performance assessment methods were tested in two separate experiments.

  8. Intelligent sensor-model automated control of PMR-15 autoclave processing

    NASA Technical Reports Server (NTRS)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    1992-01-01

    An intelligent sensor model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent FM sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time-temperature and pressure.

  9. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    NASA Astrophysics Data System (ADS)

    Singh, Preetpal

    Canada is home to thousands of freshwater lakes and rivers. Apart from being sources of infinite natural beauty, rivers and lakes are an important source of water, food and transportation. The northern hemisphere of Canada experiences extreme cold temperatures in the winter resulting in a freeze up of regional lakes and rivers. Frozen lakes and rivers tend to offer unique opportunities in terms of wildlife harvesting and winter transportation. Ice roads built on frozen rivers and lakes are vital supply lines for industrial operations in the remote north. Monitoring the ice freeze-up and break-up dates annually can help predict regional climatic changes. Lake ice impacts a variety of physical, ecological and economic processes. The construction and maintenance of a winter road can cost millions of dollars annually. A good understanding of ice mechanics is required to build and deem an ice road safe. A crucial factor in calculating load bearing capacity of ice sheets is the thickness of ice. Construction costs are mainly attributed to producing and maintaining a specific thickness and density of ice that can support different loads. Climate change is leading to warmer temperatures causing the ice to thin faster. At a certain point, a winter road may not be thick enough to support travel and transportation. There is considerable interest in monitoring winter road conditions given the high construction and maintenance costs involved. Remote sensing technologies such as Synthetic Aperture Radar have been successfully utilized to study the extent of ice covers and record freeze-up and break-up dates of ice on lakes and rivers across the north. Ice road builders often use ultrasound equipment to measure ice thickness. However, an automated monitoring system based on machine vision and image processing technology that can measure lake ice thickness has not yet been developed.
Machine vision and image processing techniques have successfully been used in manufacturing to detect equipment failure and identify defective products at the assembly line. The research work in this thesis combines machine vision and image processing technology to build a digital imaging and processing system for monitoring and measuring lake ice thickness in real time. An ultra-compact USB camera is programmed to acquire and transmit high resolution imagery for processing with MATLAB Image Processing toolbox. The image acquisition and transmission process is fully automated; image analysis is semi-automated and requires limited user input. Potential design changes to the prototype and ideas on fully automating the imaging and processing procedure are presented to conclude this research work.
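    As a toy illustration of the pixel-measurement idea in this thesis, thickness can be read off a grayscale column profile once a pixel-to-length calibration is known. The threshold, calibration factor, and function name below are assumptions for this sketch, not the actual MATLAB implementation:

```python
def ice_thickness_cm(column, threshold=128, cm_per_pixel=0.05):
    """Count contiguous bright (ice) pixels from the top of a vertical
    grayscale column, stopping at the first dark (water) pixel, and
    convert the pixel count to centimetres via a calibration factor."""
    pixels = 0
    for value in column:
        if value < threshold:  # first dark pixel marks the ice/water boundary
            break
        pixels += 1
    return pixels * cm_per_pixel

# 400 bright ice pixels over dark water at 0.05 cm/pixel -> 20.0 cm.
profile = [200] * 400 + [30] * 100
print(ice_thickness_cm(profile))  # 20.0
```

    In a real system the calibration factor would come from imaging a reference target of known size, and the column profile from a segmented region of the camera image.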

  10. SHARP: Automated monitoring of spacecraft health and status

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.; James, Mark L.; Martin, R. Gaius

    1991-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  11. SHARP - Automated monitoring of spacecraft health and status

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.; James, Mark L.; Martin, R. G.

    1990-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  12. An Automated Approach to Peanut Drying with Real-Time Monitoring of In-Shell Kernel Moisture Content with a Microwave Sensor

    USDA-ARS?s Scientific Manuscript database

    Today’s peanut drying processes utilize decision support software based on modeling and require substantial human interaction for moisture sampling. These conditions increase the likelihood of peanuts being overdried or underdried. This research addresses the need for an automated controller with re...

  13. Real-time bioacoustics monitoring and automated species identification.

    PubMed

    Aide, T Mitchell; Corrada-Bravo, Carlos; Campos-Cerqueira, Marconi; Milan, Carlos; Vega, Giovany; Alvarez, Rafael

    2013-01-01

    Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and, most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON), a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real-time to the project server, where the recordings are processed and uploaded to the project website (arbimon.net). Along with a module for viewing, listening, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.
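    A hedged sketch of the simplest kind of detector such a species-identification interface might support: sliding normalized cross-correlation of a known call template against a recording, flagging positions above a similarity threshold. The template, signal, and threshold are toy values; ARBIMON's actual machine-learning models are more sophisticated.

```python
import math

def normalized_xcorr(template, frame):
    """Pearson-style similarity between a template and an equal-length frame."""
    t_mean = sum(template) / len(template)
    f_mean = sum(frame) / len(frame)
    num = sum((t - t_mean) * (f - f_mean) for t, f in zip(template, frame))
    den = math.sqrt(sum((t - t_mean) ** 2 for t in template)
                    * sum((f - f_mean) ** 2 for f in frame))
    return num / den if den else 0.0

def detect(template, recording, threshold=0.9):
    """Return start indices where the template matches the recording."""
    n = len(template)
    return [i for i in range(len(recording) - n + 1)
            if normalized_xcorr(template, recording[i:i + n]) >= threshold]

call = [0.0, 1.0, 0.0, -1.0, 0.0]        # toy call shape
audio = [0.0] * 10 + call + [0.0] * 10   # silence around one call
print(detect(call, audio))  # [10]
```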

  14. The automated counting of beating rates in individual cultured heart cells.

    PubMed

    Collins, G A; Dower, R; Walker, M J

    1981-12-01

    The effect of drugs on the beating rate of cultured heart cells can be monitored in a number of ways. The simultaneous automated measurement of beating rates of a number of cells allows drug effects to be rapidly quantified. A photoresistive detector placed on a television image of a cell, when coupled to operational amplifiers, gives binary signals that can be processed by a microprocessor. On this basis, we have devised a system that is capable of simultaneously monitoring the individual beating of six single cultured heart cells. A microprocessor automatically processes data obtained under different experimental conditions and records it in suitable descriptive formats such as dose-response curves and double reciprocal plots.
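    The edge-counting step described above lends itself to a compact illustration. This is a minimal sketch under assumed conditions (the sampling rate, signal shape, and function name are invented here, not the authors' microprocessor implementation): count rising edges in the binarized detector output and convert to beats per minute.

```python
def beating_rate(binary_signal, sample_rate_hz):
    """Count rising edges (0 -> 1 transitions) and convert to beats/min."""
    edges = sum(
        1 for prev, cur in zip(binary_signal, binary_signal[1:])
        if prev == 0 and cur == 1
    )
    duration_s = len(binary_signal) / sample_rate_hz
    return edges * 60.0 / duration_s

# A 10 s trace sampled at 10 Hz with one beat every 2 s -> 30 beats/min.
signal = ([0] * 19 + [1]) * 5
print(beating_rate(signal, 10))  # 30.0
```

    Running one such counter per cell, as the abstract describes for six cells, is then just six independent instances of this loop.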

  15. Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving.

    PubMed

    Hergeth, Sebastian; Lorenz, Lutz; Vilimek, Roman; Krems, Josef F

    2016-05-01

    The feasibility of measuring drivers' automation trust via gaze behavior during highly automated driving was assessed with eye tracking and validated with self-reported automation trust in a driving simulator study. Earlier research from other domains indicates that drivers' automation trust might be inferred from gaze behavior, such as monitoring frequency. The gaze behavior and self-reported automation trust of 35 participants attending to a visually demanding non-driving-related task (NDRT) during highly automated driving was evaluated. The relationship between dispositional, situational, and learned automation trust with gaze behavior was compared. Overall, there was a consistent relationship between drivers' automation trust and gaze behavior. Participants reporting higher automation trust tended to monitor the automation less frequently. Further analyses revealed that higher automation trust was associated with lower monitoring frequency of the automation during NDRTs, and an increase in trust over the experimental session was connected with a decrease in monitoring frequency. We suggest that (a) the current results indicate a negative relationship between drivers' self-reported automation trust and monitoring frequency, (b) gaze behavior provides a more direct measure of automation trust than other behavioral measures, and (c) with further refinement, drivers' automation trust during highly automated driving might be inferred from gaze behavior. Potential applications of this research include the estimation of drivers' automation trust and reliance during highly automated driving. © 2016, Human Factors and Ergonomics Society.

  16. Augmenting team cognition in human-automation teams performing in complex operational environments.

    PubMed

    Cuevas, Haydee M; Fiore, Stephen M; Caldwell, Barrett S; Strater, Laura

    2007-05-01

    There is a growing reliance on automation (e.g., intelligent agents, semi-autonomous robotic systems) to effectively execute increasingly cognitively complex tasks. Successful team performance for such tasks has become even more dependent on team cognition, addressing both human-human and human-automation teams. Team cognition can be viewed as the binding mechanism that produces coordinated behavior within experienced teams, emerging from the interplay between each team member's individual cognition and team process behaviors (e.g., coordination, communication). In order to better understand team cognition in human-automation teams, team performance models need to address issues surrounding the effect of human-agent and human-robot interaction on critical team processes such as coordination and communication. Toward this end, we present a preliminary theoretical framework illustrating how the design and implementation of automation technology may influence team cognition and team coordination in complex operational environments. Integrating constructs from organizational and cognitive science, our proposed framework outlines how information exchange and updating between humans and automation technology may affect lower-level (e.g., working memory) and higher-level (e.g., sense making) cognitive processes as well as teams' higher-order "metacognitive" processes (e.g., performance monitoring). Issues surrounding human-automation interaction are discussed and implications are presented within the context of designing automation technology to improve task performance in human-automation teams.

  17. An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)

    NASA Astrophysics Data System (ADS)

    van den Heever, Lize; Marais, Neilen; Slabber, Martin

    2016-08-01

    This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal Qualification Testing of the Control And Monitoring (CAM) subsystem of the 64 dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT System Engineers are extremely happy with the AQF results, particularly with the approach and process it enforces.

  18. Using artificial intelligence to automate remittance processing.

    PubMed

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  19. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    PubMed Central

    Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D

    2017-01-01

    Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis. PMID:28228852

  20. Providing security for automated process control systems at hydropower engineering facilities

    NASA Astrophysics Data System (ADS)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.

    2016-12-01

    This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy of improving cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is consideration of interrelations and cyber threats arising when SCADA is integrated with the unified enterprise information system.

  1. Applications of optical sensing for laser cutting and drilling.

    PubMed

    Fox, Mahlen D T; French, Paul; Peters, Chris; Hand, Duncan P; Jones, Julian D C

    2002-08-20

    Any reliable automated production system must include process control and monitoring techniques. Two laser processing techniques potentially lending themselves to automation are percussion drilling and cutting. For drilling we investigate the performance of a modification of a nonintrusive optical focus control system we previously developed for laser welding, which exploits the chromatic aberrations of the processing optics to determine focal error. We further developed this focus control system for closed-loop control of laser cutting. We show that an extension of the technique can detect deterioration in cut quality, and we describe practical trials carried out on different materials using both oxygen and nitrogen assist gas. We base our techniques on monitoring the light generated by the process, captured nonintrusively by the effector optics and processed remotely from the workpiece. We describe the relationship between the temporal and the chromatic modulation of the detected light and process quality and show how the information can be used as the basis of a process control system.

  2. Problems of collaborative work of the automated process control system (APCS) and its information security, and solutions.

    NASA Astrophysics Data System (ADS)

    Arakelyan, E. K.; Andryushin, A. V.; Mezin, S. V.; Kosoy, A. A.; Kalinina, Ya V.; Khokhlov, I. S.

    2017-11-01

    A principle of interaction between the technological protections of the automated process control system (APCS) and the information security system is proposed for the case of incorrect execution of a technological protection algorithm: the correctness of the protection's operation is checked in each specific situation using the functional relationship between the monitored parameters. A methodology for assessing the economic feasibility of developing and implementing an information security system is also presented.

  3. System for monitoring non-coincident, nonstationary process signals

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.

    2005-01-04

    An improved system for monitoring non-coincident, non-stationary process signals. The mean, variance, and length of a reference signal are defined by an automated system, followed by the identification of the leading and falling edges of a monitored signal and the length of the monitored signal. The monitored signal is compared to the reference signal, and the monitored signal is resampled in accordance with the reference signal. The reference signal is then correlated with the resampled monitored signal such that the reference signal and the resampled monitored signal are coincident in time with each other. The resampled monitored signal is then compared to the reference signal to determine whether the resampled monitored signal is within a set of predesignated operating conditions.
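    The resample-and-compare core of this claim can be sketched as follows. This is a simplified illustration with invented signal values and tolerance; the patent's edge detection and correlation-based alignment steps are omitted.

```python
def resample(signal, new_len):
    """Linear interpolation of `signal` onto `new_len` evenly spaced points."""
    if new_len == 1:
        return [signal[0]]
    step = (len(signal) - 1) / (new_len - 1)
    out = []
    for i in range(new_len):
        x = i * step
        lo = min(int(x), len(signal) - 2)
        frac = x - lo
        out.append(signal[lo] * (1 - frac) + signal[lo + 1] * frac)
    return out

def within_limits(reference, monitored, tolerance):
    """Resample the monitored signal to the reference length, then check
    it sample-by-sample against the reference within a tolerance band."""
    resampled = resample(monitored, len(reference))
    return all(abs(r - m) <= tolerance for r, m in zip(reference, resampled))

ref = [0.0, 1.0, 2.0, 3.0]
mon = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]  # same ramp, sampled faster
print(within_limits(ref, mon, 0.01))  # True
```

    Resampling to a common length is what lets signals of different durations (the "non-coincident" case) be compared at all; the tolerance band plays the role of the predesignated operating conditions.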

  4. A Review of Diagnostic Techniques for ISHM Applications

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriam; Pattipati, Krishna

    2005-01-01

    System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They also may involve very complex analysis routines, such as signal processing, learning or classification methods to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams. 
While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.

  5. Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor

    NASA Astrophysics Data System (ADS)

    Duc Nguyen, Minh

    2017-10-01

    This work describes Live Monitor, the monitoring subsystem of SDDS, an automated system for space experiment data processing, storage, and distribution created at SINP MSU. Live Monitor allows operators and developers of satellite data centers to quickly identify errors that occur in data processing and to prevent further consequences caused by those errors. All activities of the whole data processing cycle are illustrated via a web interface in real time. Notification messages are delivered to responsible people via email and the Telegram messenger service. The flexible monitoring mechanism implemented in Live Monitor allows the events shown on the web interface to be changed and controlled dynamically on demand. Physicists whose space weather analysis models run on satellite data provided by SDDS can use the developed RESTful API to monitor their own events and deliver customized notification messages to suit their needs.

  6. Method of evaluating, expanding, and collapsing connectivity regions within dynamic systems

    DOEpatents

    Bailey, David A [Schenectady, NY

    2004-11-16

    An automated process defines and maintains connectivity regions within a dynamic network. The automated process requires an initial input of a network component around which a connectivity region will be defined. The process automatically and autonomously generates a region around the initial input, stores the region's definition, and monitors the network for a change. Upon detecting a change in the network, the effect is evaluated, and if necessary the regions are adjusted and redefined to accommodate the change. Only those regions of the network affected by the change will be updated. This process eliminates the need for an operator to manually evaluate connectivity regions within a network. Since the automated process maintains the network, the reliance on an operator is minimized; thus, reducing the potential for operator error. This combination of region maintenance and reduced operator reliance results in a reduction of overall error.
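    The region-maintenance idea can be illustrated with a small sketch: grow a connectivity region around a seed component by breadth-first search, and recompute only the regions a changed edge touches. The graph representation, names, and change-event format are assumptions for this sketch, not the patent's implementation.

```python
from collections import deque

def region(graph, seed):
    """All nodes reachable from `seed` (the seed's connectivity region)."""
    seen = {seed}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

def update_on_change(graph, regions, changed_edge):
    """Recompute only those stored regions touched by the changed edge."""
    a, b = changed_edge
    for seed, members in regions.items():
        if a in members or b in members:
            regions[seed] = region(graph, seed)

graph = {"A": {"B"}, "B": {"A"}, "C": {"D"}, "D": {"C"}}
regions = {"A": region(graph, "A"), "C": region(graph, "C")}
graph["B"].add("C"); graph["C"].add("B")   # network change: new link B-C
update_on_change(graph, regions, ("B", "C"))
print(sorted(regions["A"]))  # ['A', 'B', 'C', 'D']
```

    Skipping regions the change does not touch is what makes the maintenance incremental rather than a full re-evaluation of the network.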

  7. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    PubMed Central

    Shukla, Chinmay A

    2017-01-01

    The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its application in synthesis, viz. autosampling and inline monitoring, optimization, and process control. Subsequently, we critically review a few multistep flow syntheses and suggest possible control strategies whose implementation would help to reliably transfer a laboratory-scale synthesis strategy to pilot scale at its optimum conditions. Owing to the vast literature on multistep synthesis, we have classified the case studies according to a few criteria, viz. type of reaction, heating method, processes involving in-line separation units, telescoped synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification covers the broader range of the multistep synthesis literature. PMID:28684977

  8. An automated metrics system to measure and improve the success of laboratory automation implementation.

    PubMed

    Benn, Neil; Turlais, Fabrice; Clark, Victoria; Jones, Mike; Clulow, Stephen

    2007-03-01

    The authors describe a system for collecting usage metrics from widely distributed automation systems. An application that records and stores usage data centrally, calculates run times, and charts the data was developed. Data were collected over 20 months from at least 28 workstations. The application was used to plot bar charts of date versus run time for individual workstations, the automation in a specific laboratory, or automation of a specified type. The authors show that revised user training, redeployment of equipment, and running complementary processes on one workstation can increase the average number of runs by up to 20-fold and run times by up to 450%. Active monitoring of usage leads to more effective use of automation. Usage data could be used to determine whether purchasing particular automation was a good investment.
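    A toy sketch of the central usage-log idea: record run events per workstation, then derive run counts and total run time for charting. The event format and field names here are assumptions, not the authors' schema.

```python
from collections import defaultdict

def summarize(events):
    """events: (workstation, start_ts, end_ts) tuples -> per-station stats."""
    stats = defaultdict(lambda: {"runs": 0, "run_time": 0.0})
    for station, start, end in events:
        stats[station]["runs"] += 1
        stats[station]["run_time"] += end - start
    return dict(stats)

log = [("WS-1", 0, 120), ("WS-1", 200, 260), ("WS-2", 50, 80)]
print(summarize(log))
# {'WS-1': {'runs': 2, 'run_time': 180.0}, 'WS-2': {'runs': 1, 'run_time': 30.0}}
```

    Grouping the same events by laboratory or by automation type, as the abstract describes, would only change the key used in the aggregation.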

  9. Monitoring real-time navigation processes using the automated reasoning tool (ART)

    NASA Technical Reports Server (NTRS)

    Maletz, M. C.; Culbert, C. J.

    1985-01-01

    An expert system is described for monitoring and controlling navigation processes in real-time. The ART-based system features data-driven computation, accommodation of synchronous and asynchronous data, temporal modeling for individual time intervals and chains of time intervals, and hypothetical reasoning capabilities that consider alternative interpretations of the state of navigation processes. The concept is illustrated in terms of the NAVEX system for monitoring and controlling the high speed ground navigation console for Mission Control at Johnson Space Center. The reasoning processes are outlined, including techniques used to consider alternative data interpretations. Installation of the system has permitted using a single operator, instead of three, to monitor the ascent and entry phases of a Shuttle mission.

  10. 45 CFR 310.40 - What requirements apply for accessing systems and records for monitoring Computerized Tribal IV-D...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... records for monitoring Computerized Tribal IV-D Systems and Office Automation? 310.40 Section 310.40... COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION Accountability and Monitoring Procedures for... monitoring Computerized Tribal IV-D Systems and Office Automation? In accordance with Part 95 of this title...

  11. Inhibiting the Physiological Stress Effects of a Sustained Attention Task on Shoulder Muscle Activity.

    PubMed

    Wixted, Fiona; O'Riordan, Cliona; O'Sullivan, Leonard

    2018-01-11

    The objective of this study was to investigate if a breathing technique could counteract the effects of hyperventilation due to a sustained attention task on shoulder muscle activity. The trend towards higher levels of automation in industry is increasing. Consequently, manufacturing operators often monitor automated processes for long periods of their work shift. Prolonged monitoring work requires sustained attention, which is a cognitive process that humans are typically poor at and find stressful. As sustained attention becomes an increasing requirement of manufacturing operators' job content, the resulting stress experienced could contribute to the onset of many health problems, including work-related musculoskeletal disorders (WRMSDs). The SART attention test was completed by a group of participants before and after a breathing intervention exercise. The effects of the abdominal breathing intervention on breathing rate, upper trapezius muscle activity and end-tidal CO₂ were evaluated. The breathing intervention reduced the moderation effect of end-tidal CO₂ on upper trapezius muscle activity. Abdominal breathing could be a useful technique in reducing the effects of sustained attention work on muscular activity. This research can be applied to highly automated manufacturing industries, where prolonged monitoring work is widespread and could, in its role as a stressor, be a potential contributor to WRMSDs.

  12. A Procedural Electroencephalogram Simulator for Evaluation of Anesthesia Monitors.

    PubMed

    Petersen, Christian Leth; Görges, Matthias; Massey, Roslyn; Dumont, Guy Albert; Ansermino, J Mark

    2016-11-01

    Recent research and advances in the automation of anesthesia are driving the need to better understand electroencephalogram (EEG)-based anesthesia end points and to test the performance of anesthesia monitors. This effort is currently limited by the need to collect raw EEG data directly from patients. A procedural method to synthesize EEG signals was implemented in a mobile software application. The application is capable of sending the simulated signal to an anesthesia depth of hypnosis monitor. Systematic sweeps of the simulator generate functional monitor response profiles reminiscent of how network analyzers are used to test electronic components. Three commercial anesthesia monitors (Entropy, NeuroSENSE, and BIS) were compared with this new technology, and significant response and feature variations between the monitor models were observed; this includes reproducible, nonmonotonic apparent multistate behavior and significant hysteresis at light levels of anesthesia. Anesthesia monitor response to a procedural simulator can reveal significant differences in internal signal processing algorithms. The ability to synthesize EEG signals at different anesthetic depths potentially provides a new method for systematically testing EEG-based monitors and automated anesthesia systems with all sensor hardware fully operational before human trials.

  13. Reconfigurable Very Long Instruction Word (VLIW) Processor

    NASA Technical Reports Server (NTRS)

    Velev, Miroslav N.

    2015-01-01

    Future NASA missions will depend on radiation-hardened, power-efficient processing systems-on-a-chip (SOCs) that consist of a range of processor cores custom tailored for space applications. Aries Design Automation, LLC, has developed a processing SOC that is optimized for software-defined radio (SDR) uses. The innovation implements the Institute of Electrical and Electronics Engineers (IEEE) RazorII voltage management technique, a microarchitectural mechanism that allows processor cores to self-monitor, self-analyze, and self-heal after timing errors, regardless of their cause (e.g., radiation; chip aging; variations in the voltage, frequency, temperature, or manufacturing process). This highly automated SOC can also execute the legacy PowerPC 750 binary code instruction set architecture (ISA), which is used in the flight-control computers of many previous NASA space missions. In developing this innovation, Aries Design Automation has made significant contributions to the fields of formal verification of complex pipelined microprocessors and Boolean satisfiability (SAT) and has developed highly efficient electronic design automation tools that hold promise for future developments.

  14. Temporal and Location Based RFID Event Data Management and Processing

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history-oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
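The declarative event-and-rule idea can be illustrated with a minimal event-condition-action sketch: each rule pairs a predicate over an RFID reading with an action. The event fields and the example rule are hypothetical, not the paper's actual model:

```python
# Minimal event-condition-action rule registry for RFID readings.
rules = []

def rule(predicate):
    """Decorator that registers (predicate, action) pairs."""
    def register(action):
        rules.append((predicate, action))
        return action
    return register

alerts = []

# Hypothetical rule: a tag seen at the exit without checkout raises an alert.
@rule(lambda ev: ev["location"] == "exit" and not ev.get("checked_out"))
def flag_unauthorized_removal(ev):
    alerts.append(("unauthorized-removal", ev["tag"]))

def process(event):
    """Run every registered rule against one incoming RFID event."""
    for predicate, action in rules:
        if predicate(event):
            action(event)

process({"tag": "EPC-42", "location": "exit", "checked_out": False})
```

Keeping the rules declarative, as here, is what lets the same engine be reused across heterogeneous RFID applications by swapping the rule set.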

  15. Sentinel-2 ArcGIS Tool for Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Plesoianu, Alin; Cosmin Sandric, Ionut; Anca, Paula; Vasile, Alexandru; Calugaru, Andreea; Vasile, Cristian; Zavate, Lucian

    2017-04-01

    This paper addresses one of the biggest challenges regarding Sentinel-2 data, related to the need of an efficient tool to access and process the large collection of images that are available. Consequently, developing a tool for the automation of Sentinel-2 data analysis is the most immediate need. We developed a series of tools for the automation of Sentinel-2 data download and processing for vegetation health monitoring. The tools automatically perform the following operations: downloading image tiles from ESA's Scientific Hub or other vendors (Amazon), pre-processing of the images to extract the 10-m bands, creating image composites, applying a series of vegetation indices (NDVI, OSAVI, etc.) and performing change detection analyses on different temporal data sets. All of these tools run in a dynamic way in the ArcGIS Platform, without the need of creating intermediate datasets (rasters, layers), as the images are processed on-the-fly in order to avoid data duplication. Finally, they allow complete integration with the ArcGIS environment and workflows.
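Of the vegetation indices named above, NDVI has a standard per-pixel formula, (NIR − Red) / (NIR + Red); for Sentinel-2 the 10-m NIR and red bands are B8 and B4. A minimal sketch (plain lists standing in for raster bands, not the paper's ArcGIS tooling):

```python
def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red).
    For Sentinel-2, NIR is band 8 and red is band 4 (both 10 m).
    Zero-sum pixels (no signal) are mapped to 0.0."""
    out = []
    for n, r in zip(nir, red):
        total = n + r
        out.append((n - r) / total if total != 0 else 0.0)
    return out
```

Values range from −1 to 1, with healthy vegetation typically well above 0, which is what makes NDVI time series usable for the change detection analyses described above.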

  16. The Virtual Mission Operations Center

    NASA Technical Reports Server (NTRS)

    Moore, Mike; Fox, Jeffrey

    1994-01-01

    Spacecraft management is becoming more human intensive as spacecraft become more complex, and operations costs are growing accordingly. Several automation approaches have been proposed to lower these costs. However, most of these approaches are not flexible enough in the operations processes and levels of automation that they support. This paper presents a concept called the Virtual Mission Operations Center (VMOC) that provides highly flexible support for dynamic spacecraft management processes and automation. In a VMOC, operations personnel can be shared among missions, the operations team can change personnel and their locations, and automation can be added and removed as appropriate. The VMOC employs a form of on-demand supervisory control called management by exception to free operators from having to actively monitor their system. The VMOC extends management by exception, however, so that distributed, dynamic teams can work together. The VMOC uses work-group computing concepts and groupware tools to provide a team infrastructure, and it employs user agents to allow operators to define and control system automation.

  17. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. At the heart of the system is establishing a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This is achieved by building a process control system (PCS) that monitors the performance of the process by obtaining and analyzing data on the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern represented by either regression or scoring. The pattern is passed to the rule-based module; when the rule-based system recognizes a pattern, it starts the diagnostic process using that pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
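A seven-point trend check of the kind the SPC module performs can be sketched with classic run rules (all points on one side of the center line, or a strictly monotonic run). The paper's actual regression/scoring patterns may differ; this is only an illustration of the idea:

```python
def run_rule(points, center):
    """Flag a trend in the last seven control points: all seven on one
    side of the center line, or strictly rising/falling.
    (Classic Western-Electric-style run rules; PLETCHSY's actual
    regression/scoring patterns are not specified here.)"""
    assert len(points) == 7, "rule is defined over the last seven points"
    above = all(p > center for p in points)
    below = all(p < center for p in points)
    rising = all(b > a for a, b in zip(points, points[1:]))
    falling = all(b < a for a, b in zip(points, points[1:]))
    return above or below or rising or falling
```

When `run_rule` fires, the seven-point pattern would be handed to the rule-based module for diagnosis, matching the SPC-to-diagnosis handoff described above.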

  18. Experiments with Test Case Generation and Runtime Analysis

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Drusinsky, Doron; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Rosu, Grigore; Visser, Willem; Koga, Dennis (Technical Monitor)

    2003-01-01

    Software testing is typically an ad hoc process where human testers manually write many test inputs and expected test results, perhaps automating their execution in a regression suite. This process is cumbersome and costly. This paper reports preliminary results on an approach to further automate this process. The approach consists of combining automated test case generation based on systematically exploring the program's input domain, with runtime analysis, where execution traces are monitored and verified against temporal logic specifications, or analyzed using advanced algorithms for detecting concurrency errors such as data races and deadlocks. The approach suggests generating specifications dynamically per input instance rather than statically once-and-for-all. The paper describes experiments with variants of this approach in the context of two examples, a planetary rover controller and a spacecraft fault protection system.

  19. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  20. Annual Forest Monitoring as part of Indonesia's National Carbon Accounting System

    NASA Astrophysics Data System (ADS)

    Kustiyo, K.; Roswintiarti, O.; Tjahjaningsih, A.; Dewanti, R.; Furby, S.; Wallace, J.

    2015-04-01

    Land use and forest change, in particular deforestation, have contributed the largest proportion of Indonesia's estimated greenhouse gas emissions. Indonesia's remaining forests store globally significant carbon stocks, as well as biodiversity values. In 2010, the Government of Indonesia entered into a REDD+ partnership. A spatially detailed monitoring and reporting system for forest change which is national and operating in Indonesia is required for participation in such programs, as well as for national policy reasons including Monitoring, Reporting, and Verification (MRV), carbon accounting, and land-use and policy information. Indonesia's National Carbon Accounting System (INCAS) has been designed to meet national and international policy requirements. The INCAS remote sensing program is producing spatially-detailed annual wall-to-wall monitoring of forest cover changes from time-series Landsat imagery for the whole of Indonesia from 2000 to the present day. Work on the program commenced in 2009, under the Indonesia-Australia Forest Carbon Partnership. A principal objective was to build an operational system in Indonesia through transfer of knowledge and experience from Australia's National Carbon Accounting System, and adaptation of this experience to Indonesia's requirements and conditions. A semi-automated system of image pre-processing (ortho-rectification, calibration, cloud masking and mosaicking) and forest extent and change mapping (supervised classification of a 'base' year, semi-automated single-year classifications and classification within a multi-temporal probabilistic framework) was developed for Landsat 5 TM and Landsat 7 ETM+. Particular attention is paid to the accuracy of each step in the processing. With the advent of Landsat 8 data and parallel development of processing capability, capacity and international collaborations within the LAPAN Data Centre, this processing is being increasingly automated. Research is continuing into improved processing methodology and integration of information from other data sources. This paper presents technical elements of the INCAS remote sensing program and some results of the 2000-2012 mapping.

  1. 49 CFR 238.237 - Automated monitoring.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Automated monitoring. 238.237 Section 238.237... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after... train speed and capabilities of the signal system. The railroad shall document the basis for setting...

  2. Near-Earth Asteroids: Destinations for Human Exploration

    NASA Technical Reports Server (NTRS)

    Barbee, Brent W.

    2014-01-01

    The Near-Earth Object Human Space Flight Accessible Targets Study (NHATS) is a system that monitors the near-Earth asteroid (NEA) population to identify NEAs whose orbital characteristics may make them potential destinations for future round-trip human space flight missions. To accomplish this monitoring, Brent Barbee (GSFC) developed and automated a system that applies specialized trajectory processing to the orbits of newly discovered NEAs, and those for which we have updated orbit knowledge, obtained from the JPL Small Bodies Database (SBDB). This automated process executes daily and the results are distributed to the general public and the astronomy community. This aids in prioritizing telescope and radar time allocations for obtaining crucial follow-up observations of highly accessible NEAs during the critical, and often fleeting, time period surrounding initial discovery.

  3. FDEMS Sensing for Automated Intelligent Processing of PMR-15

    NASA Technical Reports Server (NTRS)

    Kranbuehl, David E.; Hood, D. K.; Rogozinski, J.; Barksdale, R.; Loos, Alfred C.; McRae, Doug

    1993-01-01

    The purpose of this grant was to develop frequency dependent dielectric measurements, often called FDEMS (frequency dependent electromagnetic sensing), to monitor and intelligently control the cure process in PMR-15, a stoichiometric mixture of a nadic ester, dimethyl ester, and methylenedianiline in a monomer ratio.

  4. Online low-field NMR spectroscopy for process control of an industrial lithiation reaction-automated data analysis.

    PubMed

    Kern, Simon; Meyer, Klas; Guhl, Svetlana; Gräßer, Patrick; Paul, Andrea; King, Rudibert; Maiwald, Michael

    2018-05-01

    Monitoring specific chemical properties is the key to chemical process control. Today, mainly optical online methods are applied, which require time- and cost-intensive calibration effort. NMR spectroscopy, which as a direct comparison method requires no calibration, has a high potential for enabling closed-loop process control while exhibiting short set-up times. Compact NMR instruments make NMR spectroscopy accessible in industrial and rough environments for process monitoring and advanced process control strategies. We present a fully automated data analysis approach which is completely based on physically motivated spectral models as first principles information (indirect hard modeling-IHM) and applied it to a given pharmaceutical lithiation reaction in the framework of the European Union's Horizon 2020 project CONSENS. Online low-field NMR (LF NMR) data was analyzed by IHM with low calibration effort, compared to a multivariate PLS-R (partial least squares regression) approach, and both were validated using online high-field NMR (HF NMR) spectroscopy. Graphical abstract: NMR sensor module for monitoring of the aromatic coupling of 1-fluoro-2-nitrobenzene (FNB) with aniline to 2-nitrodiphenylamine (NDPA) using lithium-bis(trimethylsilyl) amide (Li-HMDS) in continuous operation. Online 43.5 MHz low-field NMR (LF) was compared to 500 MHz high-field NMR spectroscopy (HF) as reference method.

  5. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    NASA Technical Reports Server (NTRS)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables as well as RMS error were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  6. Automated tracking of lava lake level using thermal images at Kīlauea Volcano, Hawai’i

    USGS Publications Warehouse

    Patrick, Matthew R.; Swanson, Don; Orr, Tim R.

    2016-01-01

    Tracking the level of the lava lake in Halema‘uma‘u Crater, at the summit of Kīlauea Volcano, Hawai’i, is an essential part of monitoring the ongoing eruption and forecasting potentially hazardous changes in activity. We describe a simple automated image processing routine that analyzes continuously-acquired thermal images of the lava lake and measures lava level. The method uses three image segmentation approaches, based on edge detection, short-term change analysis, and composite temperature thresholding, to identify and track the lake margin in the images. These relative measurements from the images are periodically calibrated with laser rangefinder measurements to produce real-time estimates of lake elevation. Continuous, automated tracking of the lava level has been an important tool used by the U.S. Geological Survey’s Hawaiian Volcano Observatory since 2012 in real-time operational monitoring of the volcano and its hazard potential.
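Of the three segmentation cues named above, the composite temperature thresholding step is the simplest to sketch. Plain nested lists stand in for a thermal frame; the threshold value and row-to-level mapping are illustrative assumptions, not the observatory's calibration:

```python
def lake_mask(image, threshold):
    """Binary mask of pixels at or above `threshold` degrees C --
    the temperature-thresholding cue for locating the lava lake."""
    return [[1 if t >= threshold else 0 for t in row] for row in image]

def lake_top_row(mask):
    """Row index of the highest (topmost) lake pixel in the frame.
    With a calibrated camera geometry, this row maps to a relative
    lava level, later tied to absolute elevation by laser rangefinder."""
    for i, row in enumerate(mask):
        if any(row):
            return i
    return None
```

Tracking `lake_top_row` frame-to-frame gives the relative level series that the periodic rangefinder measurements then calibrate to elevation, as described above.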

  7. Eavesdropping on the Arctic: Automated bioacoustics reveal dynamics in songbird breeding phenology.

    PubMed

    Oliver, Ruth Y; Ellis, Daniel P W; Chmura, Helen E; Krause, Jesse S; Pérez, Jonathan H; Sweet, Shannan K; Gough, Laura; Wingfield, John C; Boelman, Natalie T

    2018-06-01

    Bioacoustic networks could vastly expand the coverage of wildlife monitoring to complement satellite observations of climate and vegetation. This approach would enable global-scale understanding of how climate change influences phenomena such as migratory timing of avian species. The enormous data sets that autonomous recorders typically generate demand automated analyses that remain largely undeveloped. We devised automated signal processing and machine learning approaches to estimate dates on which songbird communities arrived at arctic breeding grounds. Acoustically estimated dates agreed well with those determined via traditional surveys and were strongly related to the landscape's snow-free dates. We found that environmental conditions heavily influenced daily variation in songbird vocal activity, especially before egg laying. Our novel approaches demonstrate that variation in avian migratory arrival can be detected autonomously. Large-scale deployment of this innovation in wildlife monitoring would enable the coverage necessary to assess and forecast changes in bird migration in the face of climate change.
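The arrival-date estimation can be caricatured as finding the first sustained rise in daily vocal activity. The study used signal processing and machine-learned classifiers, not a fixed threshold, so the sketch below is only a simplified stand-in with hypothetical parameters:

```python
def arrival_day(daily_activity, threshold, run=3):
    """Return the first day index at which vocal activity meets
    `threshold` for `run` consecutive days, or None if it never does.
    (Illustrative only; the actual pipeline used automated signal
    processing plus machine learning rather than a fixed threshold.)"""
    streak = 0
    for day, activity in enumerate(daily_activity):
        streak = streak + 1 if activity >= threshold else 0
        if streak == run:
            return day - run + 1
    return None
```

Comparing such acoustically estimated dates against ground surveys and snow-free dates is the validation the abstract reports.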

  8. Automated space processing payloads study. Volume 2, book 2: Technical report, appendices A through E. [instrument packages and space shuttles]

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Experiment hardware and operational requirements for space shuttle experiments are discussed along with payload and system concepts. Appendixes are included in which experiment data sheets, chamber environmental control and monitoring, method for collection and storage of electrophoretically-separated samples, preliminary thermal evaluation of electromagnetic levitation facilities L1, L2, and L3, and applicable industrial automation equipment are discussed.

  9. [Organization of monitoring of electromagnetic radiation in the urban environment].

    PubMed

    Savel'ev, S I; Dvoeglazova, S V; Koz'min, V A; Kochkin, D E; Begishev, M R

    2008-01-01

    The authors describe current approaches to monitoring the environment, including sources of electromagnetic radiation and noise. Electronic maps of the area under study are shown to be made by constructing isolines or distributing the actual levels of controlled factors. These approaches to electromagnetic and acoustic monitoring make it possible to automate the measurement process, to analyze the established situation, and to simplify the risk control methodology.

  10. Measuring, managing and maximizing refinery performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bascur, O.A.; Kennedy, J.P.

    1996-01-01

    Implementing continuous quality improvement is a confluence of total quality management, people empowerment, performance indicators and information engineering. Supporting information technologies allow a refiner to narrow the gap between management objectives and the process control level. Dynamic performance monitoring benefits come from production cost savings, improved communications and enhanced decision making. A refinery workgroup information flow model helps automate continuous improvement of processes, performance and the organization. The paper discusses the rethinking of refinery operations, dynamic performance monitoring, continuous process improvement, the knowledge coordinator and repository manager, an integrated plant operations workflow, and successful implementation.

  11. Instrumentation Automation for Concrete Structures: Report 2, Automation Hardware and Retrofitting Techniques, and Report 3, Available Data Collection and Reduction Software

    DTIC Science & Technology

    1987-06-01

    This report provides a description of commercially available sensors, instruments, and ADP equipment that may be selected to fully automate instrumentation for concrete structures, including a typical cutout at a plumbline location where an automated monitoring system has been installed. The automated plumbline monitoring system includes up to twelve sensors, repeaters, a system controller, and a printer.

  12. Sensor-model prediction, monitoring and in-situ control of liquid RTM advanced fiber architecture composite processing

    NASA Technical Reports Server (NTRS)

    Kranbuehl, D.; Kingsley, P.; Hart, S.; Loos, A.; Hasko, G.; Dexter, B.

    1992-01-01

    In-situ frequency dependent electromagnetic sensors (FDEMS) and the Loos resin transfer model have been used to select and control the processing properties of an epoxy resin during liquid pressure RTM impregnation and cure. Once correlated with viscosity and degree of cure, the FDEMS sensor monitors, and the RTM processing model predicts, the reaction advancement of the resin, the viscosity, and the impregnation of the fabric. This provides a direct means for predicting, monitoring, and controlling the liquid RTM process in-situ in the mold throughout the fabrication process, including the effects of time, temperature, vacuum and pressure. Most importantly, the FDEMS-sensor model system has been developed to make intelligent decisions, thereby automating the liquid RTM process and removing the need for operator direction.

  13. Vascular Glucose Sensor Symposium: Continuous Glucose Monitoring Systems (CGMS) for Hospitalized and Ambulatory Patients at Risk for Hyperglycemia, Hypoglycemia, and Glycemic Variability.

    PubMed

    Joseph, Jeffrey I; Torjman, Marc C; Strasma, Paul J

    2015-07-01

    Hyperglycemia, hypoglycemia, and glycemic variability have been associated with increased morbidity, mortality, length of stay, and cost in a variety of critical care and non-critical care patient populations in the hospital. The results from prospective randomized clinical trials designed to determine the risks and benefits of intensive insulin therapy and tight glycemic control have been confusing and, at times, conflicting. The limitations of point-of-care blood glucose (BG) monitoring in the hospital highlight the great clinical need for an automated real-time continuous glucose monitoring system (CGMS) that can accurately measure the concentration of glucose every few minutes. Automation and standardization of the glucose measurement process have the potential to significantly improve BG control, clinical outcome, safety and cost. © 2015 Diabetes Technology Society.

  14. The Automation and Exoplanet Orbital Characterization from the Gemini Planet Imager Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Jinfei Wang, Jason; Graham, James; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry; Kalas, Paul; arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Ruffio, Jean-Baptiste; Sivaramakrishnan, Anand; Gemini Planet Imager Exoplanet Survey Collaboration

    2018-01-01

    The Gemini Planet Imager (GPI) Exoplanet Survey (GPIES) is a multi-year 600-star survey to discover and characterize young Jovian exoplanets and their planet forming environments. For large surveys like GPIES, it is critical to have a uniform dataset processed with the latest techniques and calibrations. I will describe the GPI Data Cruncher, an automated data processing framework that is able to generate fully reduced data minutes after the data are taken and can also reprocess the entire campaign in a single day on a supercomputer. The Data Cruncher integrates into a larger automated data processing infrastructure which syncs, logs, and displays the data. I will discuss the benefits of the GPIES data infrastructure, including optimizing observing strategies, finding planets, characterizing instrument performance, and constraining giant planet occurrence. I will also discuss my work in characterizing the exoplanets we have imaged in GPIES through monitoring their orbits. Using advanced data processing algorithms and GPI's precise astrometric calibration, I will show that GPI can achieve one milliarcsecond astrometry on the extensively-studied planet Beta Pic b. With GPI, we can confidently rule out a possible transit of Beta Pic b, but have precise timings on a Hill sphere transit, and I will discuss efforts to search for transiting circumplanetary material this year. I will also discuss the orbital monitoring of other exoplanets as part of GPIES.

  15. Safety Evaluation of an Automated Remote Monitoring System for Heart Failure in an Urban, Indigent Population.

    PubMed

    Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Hertz, Crystal Coyazo; Guterman, Jeffrey J

    2017-12-01

    Heart Failure (HF) is the most expensive preventable condition, regardless of patient ethnicity, race, socioeconomic status, sex, and insurance status. Remote telemonitoring with timely outpatient care can significantly reduce avoidable HF hospitalizations. Human outreach, the traditional method used for remote monitoring, is effective but costly. Automated systems can potentially provide positive clinical, fiscal, and satisfaction outcomes in chronic disease monitoring. The authors implemented a telephonic HF automated remote monitoring system that utilizes deterministic decision tree logic to identify patients who are at risk of clinical decompensation. This safety study evaluated the degree of clinical concordance between the automated system and traditional human monitoring. This study focused on a broad underserved population and demonstrated a safe, reliable, and inexpensive method of monitoring patients with HF.
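    The deterministic decision-tree triage described above can be sketched as follows; the questions and thresholds are hypothetical illustrations, not the deployed system's actual rule set:

```python
def triage(weight_gain_lb, breathing_worse, swelling_worse):
    """Toy decision-tree triage for an automated HF telemonitoring call.

    Thresholds and questions are illustrative only; the actual
    clinical rules of the deployed system are not published here.
    """
    # Rapid weight gain is a classic sign of fluid retention.
    if weight_gain_lb >= 5:
        return "alert_nurse"        # escalate to human outreach
    if breathing_worse and swelling_worse:
        return "alert_nurse"
    if breathing_worse or swelling_worse:
        return "schedule_callback"  # follow up within 24 h
    return "no_action"

assert triage(6, False, False) == "alert_nurse"
assert triage(2, True, False) == "schedule_callback"
assert triage(1, False, False) == "no_action"
```

The value of deterministic logic here is auditability: every automated escalation decision can be traced to an explicit branch.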

  16. Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Daniel G.

    In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design, the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different, finer granularity, facilitates monitoring of processes at the module and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of SMR units and manages plant processes. The information processed at the supervisory level provides operators the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault tolerance of the supervisory control architecture, the network that supports it, and the impact of fault tolerance on multi-unit SMR plant control have been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, data collection, and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors.
    To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: I) development of a supervisory digital I&C system; II) fault tolerance of the supervisory control architecture; III) automated decision making and online monitoring.
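    The hierarchical unit/supervisory split described above can be sketched as follows; all class names, sensor names, and limits are hypothetical, not the project's actual I&C design:

```python
class UnitController:
    """Local controller for one SMR unit: collects loop readings
    and reports them upward (illustrative sketch)."""
    def __init__(self, name):
        self.name = name
        self.readings = {}

    def collect(self, sensor, value):
        self.readings[sensor] = value

    def report(self):
        return {"unit": self.name, **self.readings}

class SupervisoryController:
    """Aggregates unit reports and flags readings outside limits;
    a real supervisor would also adapt unit setpoints."""
    def __init__(self, limits):
        self.limits = limits  # sensor -> (low, high)

    def supervise(self, reports):
        alerts = []
        for r in reports:
            for sensor, (lo, hi) in self.limits.items():
                v = r.get(sensor)
                if v is not None and not lo <= v <= hi:
                    alerts.append((r["unit"], sensor))
        return alerts

u1, u2 = UnitController("SMR-1"), UnitController("SMR-2")
u1.collect("coolant_temp_C", 295.0)
u2.collect("coolant_temp_C", 312.0)
sup = SupervisoryController({"coolant_temp_C": (280.0, 305.0)})
assert sup.supervise([u1.report(), u2.report()]) == [("SMR-2", "coolant_temp_C")]
```

The structure mirrors the abstract's data flow: local loops feed unit controllers, unit reports feed the supervisor, and decisions flow back down.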

  17. Software design for automated assembly of truss structures

    NASA Technical Reports Server (NTRS)

    Herstrom, Catherine L.; Grantham, Carolyn; Allen, Cheryl L.; Doggett, William R.; Will, Ralph W.

    1992-01-01

    Concern over the limited intravehicular activity time has increased the interest in performing in-space assembly and construction operations with automated robotic systems. A technique being considered at LaRC is a supervised-autonomy approach, which can be monitored by an Earth-based supervisor that intervenes only when the automated system encounters a problem. A test-bed to support evaluation of the hardware and software requirements for supervised-autonomy assembly methods was developed. This report describes the design of the software system necessary to support the assembly process. The software is hierarchical and supports both automated assembly operations and supervisor error-recovery procedures, including the capability to pause and reverse any operation. The software design serves as a model for the development of software for more sophisticated automated systems and as a test-bed for evaluation of new concepts and hardware components.

  18. Automated Monitoring of Carbon Fluxes in a Northern Rocky Mountain Forest Indicates Above-Average Net Primary Productivity During the 2015 Western U.S. Drought

    NASA Astrophysics Data System (ADS)

    Stenzel, J.; Hudiburg, T. W.

    2016-12-01

    As global temperatures rise in the 21st century, "hotter" droughts will become more intense and persistent, particularly in areas which already experience seasonal drought. Because forests represent a large and persistent terrestrial carbon sink which has previously offset a significant proportion of anthropogenic carbon emissions, forest carbon cycle responses to drought have become a prominent research concern. However, robust mechanistic modeling of carbon balance responses to projected drought effects requires improved observation-driven representations of carbon cycle processes; many such component processes are rarely monitored in complex terrain, are modeled or unrepresented quantities at eddy covariance sites, or are monitored at coarse temporal scales that are not conducive to elucidating process responses at process time scales. In the present study, we demonstrate the use of newly available and affordable automated dendrometers for the estimation of intra-seasonal Net Primary Productivity (NPP) in a Northern Rocky Mountain conifer forest which is impacted by seasonal drought. Results from our pilot study suggest that NPP was restricted by mid-summer moisture deficit under the extraordinary 2015 Western U.S. drought, with greater than 90% of stand growth occurring prior to August. Examination of growth on an inter-annual scale, however, suggests that the study site experienced above-average NPP during this exceptionally hot year. Taken together, these findings indicate that intensifying mid-summer drought in regional forests has affected the timing but has not diminished the magnitude of this carbon flux. By employing automated instrumentation for the intra-annual assessment of NPP, we reveal that annual NPP in regional forests is largely determined before mid-summer and is therefore surprisingly resilient to intensities of seasonal drought that exceed normal conditions of the 20th century.
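    The intra-annual growth-fraction calculation implied above ("greater than 90% of stand growth prior to August") can be sketched as follows; the dendrometer increments are invented, not the study's data:

```python
from datetime import date

def growth_fraction_before(increments, cutoff):
    """Fraction of annual stem growth accrued before a cutoff date.

    `increments` maps measurement dates to stem-growth increments
    (e.g., mm from an automated point dendrometer). Values here are
    hypothetical and for illustration only.
    """
    total = sum(increments.values())
    before = sum(v for d, v in increments.items() if d < cutoff)
    return before / total

# Hypothetical 2015-style season: most growth before August.
season = {date(2015, 5, 15): 1.2, date(2015, 6, 15): 2.0,
          date(2015, 7, 15): 1.5, date(2015, 8, 15): 0.3}
frac = growth_fraction_before(season, date(2015, 8, 1))
assert frac > 0.9
```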

  19. A Tool for Automatic Verification of Real-Time Expert Systems

    NASA Technical Reports Server (NTRS)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
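    The standard-addition (spike) calibration idea can be sketched numerically; the count rates and spike activity below are invented for illustration, not instrument data:

```python
def measurement_efficiency(net_cps_sample, net_cps_spiked, spike_bq):
    """Overall measurement efficiency from a standard-addition spike.

    The spike adds a known 99Tc activity (Bq) to the same sample
    matrix; efficiency is the extra count rate per Bq added, so the
    calibration is automatically matrix-matched. Illustrative only.
    """
    return (net_cps_spiked - net_cps_sample) / spike_bq

def activity_bq_per_ml(net_cps_sample, efficiency, volume_ml):
    """Sample concentration (Bq/mL) given the measured efficiency."""
    return net_cps_sample / efficiency / volume_ml

eff = measurement_efficiency(12.0, 32.0, 50.0)  # hypothetical rates
assert abs(eff - 0.4) < 1e-9                    # 0.4 counts/s per Bq
conc = activity_bq_per_ml(12.0, eff, 0.495)     # 0.495 mL sample
```

A drifting efficiency value between hourly spikes would flag a degraded separation or detector, which is the self-diagnostic role described in the abstract.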

  1. Launch Commit Criteria Monitoring Agent

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Dan A.; Kelly, Andrew O.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has developed and deployed a software agent to monitor the Space Shuttle's ground processing telemetry stream. The application, the Launch Commit Criteria Monitoring Agent, increases situational awareness for system and hardware engineers during Shuttle launch countdown. The agent provides autonomous monitoring of the telemetry stream, automatically alerts system engineers when predefined criteria have been met, identifies limit warnings and violations of launch commit criteria, aids Shuttle engineers through troubleshooting procedures, and provides additional insight to verify appropriate troubleshooting of problems by contractors. The agent has successfully detected launch commit criteria warnings and violations on a simulated playback data stream. Efficiency and safety are improved through increased automation.
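    The limit-checking core of such an agent can be sketched as below; the measurement name and thresholds are invented for illustration and are not actual launch commit criteria:

```python
def check_telemetry(sample, limits):
    """Classify one telemetry sample against warning/violation limits.

    `limits` maps a measurement name to (warn_low, warn_high,
    viol_low, viol_high). Names and thresholds here are hypothetical;
    a real LCC rule base is far richer than range checks.
    """
    name, value = sample
    warn_lo, warn_hi, viol_lo, viol_hi = limits[name]
    if value < viol_lo or value > viol_hi:
        return "violation"   # launch commit criterion breached
    if value < warn_lo or value > warn_hi:
        return "warning"     # approaching a limit; alert engineers
    return "nominal"

limits = {"lh2_tank_press_psi": (32.0, 36.0, 30.0, 38.0)}
assert check_telemetry(("lh2_tank_press_psi", 34.1), limits) == "nominal"
assert check_telemetry(("lh2_tank_press_psi", 37.0), limits) == "warning"
assert check_telemetry(("lh2_tank_press_psi", 39.2), limits) == "violation"
```

Running such checks continuously over the telemetry stream, and attaching troubleshooting guidance to each rule, gives the situational-awareness behavior the abstract describes.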

  2. A GPS-based Real-time Road Traffic Monitoring System

    NASA Astrophysics Data System (ADS)

    Tanti, Kamal Kumar

    In recent years, monitoring systems have tended toward increasingly automatic, reliably interconnected, distributed and autonomous operation. Specifically, the measurement, logging, data processing and interpretation activities may be carried out by separate units at different locations in near real time. The recent evolution of mobile communication devices and communication technologies has fostered a growing interest in GIS- and GPS-based location-aware systems and services. This paper describes a real-time road traffic monitoring system based on integrated mobile field devices (GPS/GSM/IOs) working in tandem with advanced GIS-based application software, providing on-the-fly authentication for real-time monitoring and security enhancement. The described system is a fully automated, continuous, real-time monitoring system that employs GPS sensors; Ethernet and/or serial port communication is used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely, depending on the client's requirements. Due to the modular architecture of the system, other sensor types may be supported with minimal effort. Data on the distributed network and measurements are transmitted via cellular SIM cards to a Control Unit, which provides for post-processing and network management. The Control Unit may be remotely accessed via an Internet connection. The new system will not only provide more consistent data about road traffic conditions but will also provide methods for integrating with other Intelligent Transportation Systems (ITS). GSM technology is used for communication between the mobile devices and the central monitoring service. The resulting system is characterized by autonomy, reliability and a high degree of automation.

  3. Performance Monitoring of Chilled-Water Distribution Systems Using HVAC-Cx

    PubMed Central

    Ferretti, Natascha Milesi; Galler, Michael A.; Bushby, Steven T.

    2017-01-01

    In this research we develop, test, and demonstrate the newest extension of the software HVAC-Cx (NIST and CSTB 2014), an automated commissioning tool for detecting common mechanical faults and control errors in chilled-water distribution systems (loops). The commissioning process can improve occupant comfort, ensure the persistence of correct system operation, and reduce energy consumption. Automated tools support the process by decreasing the time and the skill level required to carry out necessary quality assurance measures, and as a result they enable more thorough testing of building heating, ventilating, and air-conditioning (HVAC) systems. This paper describes the algorithm, developed by National Institute of Standards and Technology (NIST), to analyze chilled-water loops and presents the results of a passive monitoring investigation using field data obtained from BACnet® (ASHRAE 2016) controllers and presents field validation of the findings. The tool was successful in detecting faults in system operation in its first field implementation supporting the investigation phase through performance monitoring. Its findings led to a full energy retrocommissioning of the field site. PMID:29167584

  4. Performance Monitoring of Chilled-Water Distribution Systems Using HVAC-Cx.

    PubMed

    Ferretti, Natascha Milesi; Galler, Michael A; Bushby, Steven T

    2017-01-01

    In this research we develop, test, and demonstrate the newest extension of the software HVAC-Cx (NIST and CSTB 2014), an automated commissioning tool for detecting common mechanical faults and control errors in chilled-water distribution systems (loops). The commissioning process can improve occupant comfort, ensure the persistence of correct system operation, and reduce energy consumption. Automated tools support the process by decreasing the time and the skill level required to carry out necessary quality assurance measures, and as a result they enable more thorough testing of building heating, ventilating, and air-conditioning (HVAC) systems. This paper describes the algorithm, developed by National Institute of Standards and Technology (NIST), to analyze chilled-water loops and presents the results of a passive monitoring investigation using field data obtained from BACnet ® (ASHRAE 2016) controllers and presents field validation of the findings. The tool was successful in detecting faults in system operation in its first field implementation supporting the investigation phase through performance monitoring. Its findings led to a full energy retrocommissioning of the field site.

  5. Continuous emission monitoring and accounting automated systems at an HPP

    NASA Astrophysics Data System (ADS)

    Roslyakov, P. V.; Ionkin, I. L.; Kondrateva, O. E.; Borovkova, A. M.; Seregin, V. A.; Morozov, I. V.

    2015-03-01

    Environmental and industrial emission monitoring at HPPs is an urgent task today. Industrial monitoring covers emissions of harmful pollutants and the optimization of fuel combustion processes at HPPs. Environmental monitoring is a system for assessing ambient air quality with respect to the contribution of individual sources of harmful substances to the pollution of atmospheric air in the area. Work on creating an industrial monitoring system is being carried out at the National Research University Moscow Power Engineering Institute (MPEI) on the basis of the MPEI combined heat and power plant, and environmental monitoring stations are installed in Lefortovo raion, where the CHPP is located.

  6. PleurAlert: an augmented chest drainage system with electronic sensing, automated alerts and internet connectivity.

    PubMed

    Leeson, Cory E; Weaver, Robert A; Bissell, Taylor; Hoyer, Rachel; McClain, Corinne; Nelson, Douglas A; Samosky, Joseph T

    2012-01-01

    We have enhanced a common medical device, the chest tube drainage container, with electronic sensing of fluid volume, automated detection of critical alarm conditions and the ability to automatically send alert text messages to a nurse's cell phone. The PleurAlert system provides a simple touch-screen interface and can graphically display chest tube output over time. Our design augments a device whose basic function dates back 50 years by adding technology to automate and optimize a monitoring process that can be time consuming and inconvenient for nurses. The system may also enhance detection of emergency conditions and speed response time.

  7. Towards a geophysical decision-support system for monitoring and managing unstable slopes

    NASA Astrophysics Data System (ADS)

    Chambers, J. E.; Meldrum, P.; Wilkinson, P. B.; Uhlemann, S.; Swift, R. T.; Inauen, C.; Gunn, D.; Kuras, O.; Whiteley, J.; Kendall, J. M.

    2017-12-01

    Conventional approaches for condition monitoring, such as walk over surveys, remote sensing or intrusive sampling, are often inadequate for predicting instabilities in natural and engineered slopes. Surface observations cannot detect the subsurface precursors to failure events; instead they can only identify failure once it has begun. On the other hand, intrusive investigations using boreholes only sample a very small volume of ground and hence small-scale deterioration processes in heterogeneous ground conditions can easily be missed. It is increasingly being recognised that geophysical techniques can complement conventional approaches by providing spatial subsurface information. Here we describe the development and testing of a new geophysical slope monitoring system. It is built around low-cost electrical resistivity tomography instrumentation, combined with integrated geotechnical logging capability, and coupled with data telemetry. An automated data processing and analysis workflow is being developed to streamline information delivery. The development of this approach has provided the basis of a decision-support tool for monitoring and managing unstable slopes. The hardware component of the system has been operational at a number of field sites associated with a range of natural and engineered slopes for up to two years. We report on the monitoring results from these sites, discuss the practicalities of installing and maintaining long-term geophysical monitoring infrastructure, and consider the requirements of a fully automated data processing and analysis workflow. We propose that the result of this development work is a practical decision-support tool that can provide near-real-time information relating to the internal condition of problematic slopes.

  8. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline.

    PubMed

    Loh, K B; Ramli, N; Tan, L K; Roziah, M; Rahmat, K; Ariffin, H

    2012-07-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. Diffusion tensor imaging outperforms conventional MRI in depicting white matter maturation. • DTI will become an important clinical tool for diagnosing paediatric neurological diseases. • DTI appears especially helpful for developmental abnormalities, tumours and white matter disease. • An automated processing pipeline assists quantitative analysis of high-throughput DTI data.

  9. Best Management Practices (BMP) Monitoring Manual Field Guide: Implementation and Effectiveness for Protection of Water Resources

    Treesearch

    David Welsch; Roger Ryder; Tim Post

    2006-01-01

    The specific purpose of the BMP protocol is to create an economical, standardized, and repeatable BMP monitoring process that is completely automated, from data gathering through report generation, in order to provide measured data, ease of use, and compatibility with State BMP programs. The protocol was developed to meet the following needs: • Document the use and...

  10. A Comparative Experimental Study on the Use of Machine Learning Approaches for Automated Valve Monitoring Based on Acoustic Emission Parameters

    NASA Astrophysics Data System (ADS)

    Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.

    2018-03-01

    Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis and interpretation process has been found to be highly dependent on experts. An automated monitoring method is therefore required to reduce the cost and time consumed in interpreting AE signals. This paper investigates the application of two of the most common machine learning approaches, namely the artificial neural network (ANN) and the support vector machine (SVM), to automate the diagnosis of valve faults in reciprocating compressors based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study of the predictive performance of the ANN and SVM. AE parameter data were acquired from a single-stage reciprocating air compressor under different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of the different conditions. Results demonstrate that the ANN and SVM models achieve the same prediction accuracy. However, the SVM model is recommended for automated diagnosis of valve condition due to its ability to handle a high number of input features with small training data sets.
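    A minimal sketch of such an ANN-versus-SVM comparison using scikit-learn; the two-feature synthetic data below are stand-ins for AE parameters (the study's actual features, conditions, and model settings are not reproduced here):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical AE features (e.g., amplitude, RMS) for two valve states.
healthy = rng.normal([1.0, 1.0], 0.1, size=(60, 2))
faulty = rng.normal([2.0, 2.0], 0.1, size=(60, 2))
X = np.vstack([healthy, faulty])
y = np.array([0] * 60 + [1] * 60)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Small ANN and an RBF-kernel SVM, compared on held-out accuracy.
ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)
svm = SVC(kernel="rbf").fit(Xtr, ytr)
print("ANN accuracy:", ann.score(Xte, yte))
print("SVM accuracy:", svm.score(Xte, yte))
```

On clearly separable synthetic data both models score similarly, echoing the paper's finding; the practical differences appear with many features and few samples.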

  11. Workflow management in large distributed systems

    NASA Astrophysics Data System (ADS)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near real time. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  12. Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.

    PubMed

    Greenlee, Eric T; DeLucia, Patricia R; Newton, David C

    2018-06-01

    The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task. Drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes. During that time, their task was to monitor the roadway for roadway hazards. As predicted, hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.

  13. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets are assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), visualization (ViSiT, Paraview, D3, QGIS) as well as numerous tools written in python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of data ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user model in which capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components.
    PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automated ingestion and processing of electrical geophysical data, and for co-analysis and visualization of the raw and processed data with other data of interest (e.g. soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  14. Knowledge representation and user interface concepts to support mixed-initiative diagnosis

    NASA Technical Reports Server (NTRS)

    Sobelman, Beverly H.; Holtzblatt, Lester J.

    1989-01-01

    The Remote Maintenance Monitoring System (RMMS) provides automated support for the maintenance and repair of ModComp computer systems used in the Launch Processing System (LPS) at Kennedy Space Center. RMMS supports manual and automated diagnosis of intermittent hardware failures, providing an efficient means for accessing and analyzing the data generated by catastrophic failure recovery procedures. This paper describes the design and functionality of the user interface for interactive analysis of memory dump data, relating it to the underlying declarative representation of memory dumps.

  15. Automated Selection Of Pictures In Sequences

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.; Shelton, Robert O.

    1995-01-01

    Method of automated selection of film or video motion-picture frames for storage or examination developed. Beneficial in situations in which quantity of visual information available exceeds amount stored or examined by humans in reasonable amount of time, and/or necessary to reduce large number of motion-picture frames to few conveying significantly different information in manner intermediate between movie and comic book or storyboard. For example, computerized vision system monitoring industrial process programmed to sound alarm when changes in scene exceed normal limits.
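    The scene-change selection rule described above can be sketched as a mean-absolute-difference criterion on tiny grayscale "frames"; the threshold is application-specific and the data are invented:

```python
def select_frames(frames, threshold):
    """Keep the first frame and any frame whose mean absolute pixel
    difference from the last *kept* frame exceeds the threshold.

    `frames` is a list of equal-length grayscale pixel sequences.
    A toy sketch of change-based frame selection, not the original
    NASA method.
    """
    kept = [0]
    for i in range(1, len(frames)):
        ref = frames[kept[-1]]
        diff = sum(abs(a - b) for a, b in zip(frames[i], ref)) / len(ref)
        if diff > threshold:
            kept.append(i)   # scene changed significantly; keep frame
    return kept

# Four 3-pixel "frames": a near-duplicate pair, then a scene change.
frames = [[10, 10, 10], [11, 10, 10], [50, 50, 50], [51, 50, 50]]
assert select_frames(frames, threshold=5.0) == [0, 2]
```

Comparing against the last kept frame, rather than the immediately preceding one, prevents slow drifts from being discarded entirely.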

  16. Smart Phase Tuning in Microwave Photonic Integrated Circuits Toward Automated Frequency Multiplication by Design

    NASA Astrophysics Data System (ADS)

    Nabavi, N.

    2018-07-01

    The author investigates monitoring methods for the fine adjustment of the previously proposed on-chip architecture for frequency multiplication and translation of harmonics by design. Digital signal processing (DSP) algorithms are utilized to optimize the functionality of the microwave photonic integrated circuit toward automated frequency multiplication. The implemented DSP algorithms are based on the discrete Fourier transform and optimization algorithms (greedy and gradient-based), which are analytically derived and numerically compared on accuracy and speed-of-convergence criteria.
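    The DFT-plus-search tuning idea can be illustrated with a toy two-tone interference model; the signal model and the exhaustive phase search below are simplified stand-ins for the paper's on-chip tuner and its greedy/gradient algorithms:

```python
import cmath
import math

def harmonic_power(phase, n=256, k=4):
    """Power in DFT bin k of two interfering unit tones at harmonic k,
    one with an adjustable phase offset (toy stand-in for the tuner)."""
    acc = 0j
    for t in range(n):
        s = math.cos(2 * math.pi * k * t / n) \
            + math.cos(2 * math.pi * k * t / n + phase)
        acc += s * cmath.exp(-2j * math.pi * k * t / n)  # DFT bin k
    return abs(acc) ** 2

def tune_phase(steps=360):
    """Exhaustive search for the phase maximizing the target harmonic;
    a real controller would use greedy or gradient updates instead."""
    best = max(range(steps),
               key=lambda i: harmonic_power(2 * math.pi * i / steps))
    return 2 * math.pi * best / steps

phi = tune_phase()
# Constructive interference maximizes the harmonic at phase 0 (mod 2*pi).
assert min(phi, 2 * math.pi - phi) < 0.1
```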

  17. 40 CFR 75.10 - General operating requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... continuous emission monitoring system and a flow monitoring system with an automated data acquisition and handling system for measuring and recording SO2 concentration (in ppm), volumetric gas flow (in scfh), and... emission monitoring system and a flow monitoring system with an automated data acquisition and handling...

  18. Automation for deep space vehicle monitoring

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.

    1991-01-01

    Information on automation for deep space vehicle monitoring is given in viewgraph form. Information is given on automation goals and strategy; the Monitor Analyzer of Real-time Voyager Engineering Link (MARVEL); intelligent input data management; decision theory for making tradeoffs; dynamic tradeoff evaluation; evaluation of anomaly detection results; evaluation of data management methods; system level analysis with cooperating expert systems; the distributed architecture of multiple expert systems; and event driven response.

  19. Eavesdropping on the Arctic: Automated bioacoustics reveal dynamics in songbird breeding phenology

    PubMed Central

    Ellis, Daniel P. W.; Pérez, Jonathan H.; Wingfield, John C.; Boelman, Natalie T.

    2018-01-01

    Bioacoustic networks could vastly expand the coverage of wildlife monitoring to complement satellite observations of climate and vegetation. This approach would enable global-scale understanding of how climate change influences phenomena such as migratory timing of avian species. The enormous data sets that autonomous recorders typically generate demand automated analyses that remain largely undeveloped. We devised automated signal processing and machine learning approaches to estimate dates on which songbird communities arrived at arctic breeding grounds. Acoustically estimated dates agreed well with those determined via traditional surveys and were strongly related to the landscape’s snow-free dates. We found that environmental conditions heavily influenced daily variation in songbird vocal activity, especially before egg laying. Our novel approaches demonstrate that variation in avian migratory arrival can be detected autonomously. Large-scale deployment of this innovation in wildlife monitoring would enable the coverage necessary to assess and forecast changes in bird migration in the face of climate change. PMID:29938220

  20. Automated power distribution system hardware. [for space station power supplies]

    NASA Technical Reports Server (NTRS)

    Anderson, Paul M.; Martin, James A.; Thomason, Cindy

    1989-01-01

    An automated power distribution system testbed for the space station common modules has been developed. It incorporates automated control and monitoring of a utility-type power system. Automated power system switchgear, control and sensor hardware requirements, hardware design, test results, and potential applications are discussed. The system is designed so that the automated control and monitoring of the power system is compatible with both a 208-V, 20-kHz single-phase AC system and a high-voltage (120 to 150 V) DC system.

  1. Proceedings of the Eleventh International Symposium on Remote Sensing of Environment, volume 2. [application and processing of remotely sensed data]

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Application and processing of remotely sensed data are discussed. Areas of application include: pollution monitoring, water quality, land use, marine resources, ocean surface properties, and agriculture. Image processing and scene analysis are described along with automated photointerpretation and classification techniques. Data from infrared and multispectral band scanners onboard LANDSAT satellites are emphasized.

  2. Migration monitoring with automated technology

    Treesearch

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  3. Automated Power Assessment for Helicopter Turboshaft Engines

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Litt, Jonathan S.

    2008-01-01

    An accurate indication of available power is required for helicopter mission planning purposes. Available power is currently estimated on U.S. Army Blackhawk helicopters by performing a Maximum Power Check (MPC), a manual procedure performed by maintenance pilots on a periodic basis. The MPC establishes Engine Torque Factor (ETF), an indication of available power. It is desirable to replace the current manual MPC procedure with an automated approach that will enable continuous real-time assessment of available power utilizing normal mission data. This report presents an automated power assessment approach which processes data currently collected within helicopter Health and Usage Monitoring System (HUMS) units. The overall approach consists of: 1) a steady-state data filter which identifies and extracts steady-state operating points within HUMS data sets; 2) engine performance curve trend monitoring and updating; and 3) automated ETF calculation. The algorithm is coded in MATLAB (The MathWorks, Inc.) and currently runs on a PC. Results from the application of this technique to HUMS mission data collected from UH-60L aircraft equipped with T700-GE-701C engines are presented and compared to manually calculated ETF values. Potential future enhancements are discussed.
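
    The steady-state data filter in step 1) can be sketched generically (the window size, tolerance, and function name below are illustrative assumptions, not the actual HUMS algorithm):

```python
# Flag a sample as steady-state when the spread (max - min) of the
# surrounding window stays within a tolerance. Thresholds are illustrative.

def steady_state_points(series, window=5, tol=1.0):
    """Return indices whose centered window has max-min spread <= tol."""
    half = window // 2
    idx = []
    for i in range(half, len(series) - half):
        w = series[i - half:i + half + 1]
        if max(w) - min(w) <= tol:
            idx.append(i)
    return idx
```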

  4. Subunit mass analysis for monitoring antibody oxidation.

    PubMed

    Sokolowska, Izabela; Mo, Jingjie; Dong, Jia; Lewis, Michael J; Hu, Ping

    2017-04-01

    Methionine oxidation is a common posttranslational modification (PTM) of monoclonal antibodies (mAbs). Oxidation can reduce the in vivo half-life, efficacy and stability of the product. Peptide mapping is commonly used to monitor the levels of oxidation, but it is a relatively time-consuming method. A high-throughput, automated subunit mass analysis method was developed to monitor antibody methionine oxidation. In this method, samples were treated with IdeS, EndoS and dithiothreitol to generate three individual IgG subunits (light chain, Fd' and single-chain Fc). These subunits were analyzed by reversed-phase ultra-performance liquid chromatography coupled with an online quadrupole time-of-flight mass spectrometer, and the levels of oxidation on each subunit were quantitated from the deconvoluted mass spectra using the UNIFI software. The oxidation results obtained by subunit mass analysis correlated well with those obtained by peptide mapping. Method qualification demonstrated that this subunit method had excellent repeatability and intermediate precision. In addition, the UNIFI software used in this application allows automated data acquisition and processing, which makes the method suitable for high-throughput process monitoring and product characterization. Finally, subunit mass analysis revealed the different patterns of Fc methionine oxidation induced by chemical and photo stress, which makes it attractive for investigating the root cause of oxidation.
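
    The quantitation step can be illustrated with a generic calculation (an assumption-laden sketch, not UNIFI's implementation): methionine oxidation adds about +16 Da, so the oxidized subunit appears as a satellite peak in the deconvoluted spectrum, and the oxidation level is its share of the summed intensities.

```python
# Generic oxidation-level calculation from a deconvoluted mass spectrum.
# The mass values in the test are invented; the +15.995 Da shift is the
# monoisotopic mass of one added oxygen atom.

OXIDATION_SHIFT_DA = 15.995

def oxidation_percent(peaks, subunit_mass, tol=0.1):
    """peaks: list of (mass_Da, intensity). Percent oxidized for a subunit."""
    native = sum(i for m, i in peaks if abs(m - subunit_mass) <= tol)
    oxidized = sum(i for m, i in peaks
                   if abs(m - subunit_mass - OXIDATION_SHIFT_DA) <= tol)
    return 100.0 * oxidized / (native + oxidized)
```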

  5. Deep Learning and Image Processing for Automated Crack Detection and Defect Measurement in Underground Structures

    NASA Astrophysics Data System (ADS)

    Panella, F.; Boehm, J.; Loo, Y.; Kaushik, A.; Gonzalez, D.

    2018-05-01

    This work presents the combination of Deep Learning (DL) and image processing to produce an automated crack-recognition and defect-measurement tool for civil structures. The authors focus on the survey of tunnel civil structures and have developed an end-to-end tool for asset management of underground structures. In order to maintain the serviceability of tunnels, regular inspection is needed to assess their structural status. The traditional survey method is visual inspection: simple, but slow and relatively expensive, and the quality of the output depends on the ability and experience of the engineer as well as on the total workload (stress and tiredness may impair the ability to observe and record information). As a result of these issues, over the last decade there has been a growing desire to automate monitoring using new inspection methods. The present paper has the goal of combining DL with traditional image processing to create a tool able to detect, locate and measure structural defects.

  6. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  7. A precipitation collector and automated pH-monitoring system

    Treesearch

    Gerald M. Aubertin; Benjamin C. Thorner; John Campbell

    1976-01-01

    A sensitive precipitation collector and automated pH-monitoring system are described. This system provides for continuous monitoring and recording of the pH of precipitation. Discrete or composite rainwater samples are manually obtainable for chemical analyses. The system can easily be adapted to accommodate a flow-through specific conductance probe and monitoring...

  8. Completely automated modal analysis procedure based on the combination of different OMA methods

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and its benefits are listed. By merging different Operational Modal Analysis methods with a statistical approach, the identification process has been made more robust, yielding only the true natural frequencies, damping ratios and mode shapes of the system. The effect of temperature can be taken into account as well, leading to a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building housed in the laboratories of Politecnico di Milano.
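
    The merging idea can be sketched as a simple consensus rule (the clustering scheme, tolerances, and names are invented for illustration; the paper's statistical approach is more elaborate): pool frequency estimates from several OMA methods and keep only those confirmed by enough methods within a relative tolerance.

```python
# Keep only natural-frequency candidates confirmed by >= min_methods of the
# identification methods, clustered by a relative frequency tolerance.

def consensus_modes(estimates, rel_tol=0.02, min_methods=2):
    """estimates: one list of frequencies [Hz] per OMA method.
    Returns mean frequencies of clusters confirmed by enough methods."""
    pool = sorted((f, m) for m, freqs in enumerate(estimates) for f in freqs)
    modes, cluster = [], []
    def close_cluster():
        if cluster and len({m for _, m in cluster}) >= min_methods:
            modes.append(sum(f for f, _ in cluster) / len(cluster))
    for f, m in pool:
        if cluster and f > cluster[-1][0] * (1 + rel_tol):
            close_cluster()
            cluster = []
        cluster.append((f, m))
    close_cluster()
    return modes
```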

  9. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    PubMed Central

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569
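
    The isotope-dilution quantitation underlying such a method can be sketched generically (a textbook-style calculation with invented calibration data, not the laboratory's actual procedure): the analyte/internal-standard peak-area ratio is converted to concentration via a linear calibration fit.

```python
# Generic isotope-dilution quantitation: fit ratio-vs-concentration
# calibrators, then invert the line for an unknown sample.

def fit_line(xs, ys):
    """Least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
             sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def quantitate(area_analyte, area_istd, cal_concs, cal_ratios):
    """Concentration from the analyte/ISTD peak-area ratio."""
    slope, intercept = fit_line(cal_concs, cal_ratios)
    return (area_analyte / area_istd - intercept) / slope
```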

  10. Human-system Interfaces to Automatic Systems: Review Guidance and Technical Basis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OHara, J.M.; Higgins, J.C.

    Automation has become ubiquitous in modern complex systems, and commercial nuclear power plants are no exception. Beyond the control of plant functions and systems, automation is applied to a wide range of additional functions including monitoring and detection, situation assessment, response planning, response implementation, and interface management. Automation has become a 'team player' supporting plant personnel in nearly all aspects of plant operation. In light of the increasing use and importance of automation in new and future plants, guidance is needed to enable the NRC staff to conduct safety reviews of the human factors engineering (HFE) aspects of modern automation. The objective of the research described in this report was to develop guidance for reviewing the operator's interface with automation. We first developed a characterization of the important HFE aspects of automation based on how it is implemented in current systems. The characterization included five dimensions: level of automation, function of automation, modes of automation, flexibility of allocation, and reliability of automation. Next, we reviewed literature pertaining to the effects of these aspects of automation on human performance and the design of human-system interfaces (HSIs) for automation. Then, we used the technical basis established by the literature to develop design review guidance. The guidance is divided into the following seven topics: automation displays, interaction and control, automation modes, automation levels, adaptive automation, error tolerance and failure management, and HSI integration. In addition, we identified insights into the automation design process, operator training, and operations.

  11. Auto-rickshaw: an automated crystal structure determination platform as an efficient tool for the validation of an X-ray diffraction experiment.

    PubMed

    Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A

    2005-04-01

    The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.

  12. Generic HPLC platform for automated enzyme reaction monitoring: Advancing the assay toolbox for transaminases and other PLP-dependent enzymes.

    PubMed

    Börner, Tim; Grey, Carl; Adlercreutz, Patrick

    2016-08-01

    Methods for rapid and direct quantification of enzyme kinetics independent of the substrate are in high demand for both fundamental research and bioprocess development. This study addresses the need for a generic method by developing an automated, standardizable HPLC platform monitoring reaction progress in near real time. The method was applied to amine transaminase (ATA)-catalyzed reactions, intensifying process development for chiral amine synthesis. Autosampler-assisted pipetting facilitates integrated mixing and sampling under controlled temperature. Crude enzyme formulations at high and low substrate concentrations can be employed. Sequential, small (1 µL) sample injections and immediate detection after separation permit fast reaction monitoring with excellent sensitivity, accuracy and reproducibility. Due to its modular design, different chromatographic techniques, e.g. reversed-phase and size-exclusion chromatography (SEC), can be employed. A novel assay for pyridoxal 5'-phosphate-dependent enzymes is presented using SEC for direct monitoring of enzyme-bound and free reaction intermediates. Time-resolved changes of the different cofactor states, e.g. pyridoxal 5'-phosphate, pyridoxamine 5'-phosphate and the internal aldimine, were traced in both half-reactions. The combination of the automated HPLC platform with SEC offers a method for substrate-independent screening, supplying a missing piece in the assay and screening toolbox for ATAs and other PLP-dependent enzymes. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. New developments in automated biosensing from remote water quality stations and satellite data retrieval for resources management

    NASA Astrophysics Data System (ADS)

    Morgan, E. L.; Eagleson, K. W.; Hermann, R.; McCollough, N. D.

    1981-05-01

    Maintaining adequate water quality in a multipurpose drainage system becomes increasingly important as demands on resources become greater. Real-time water quality monitoring plays a crucial role in meeting this objective. In addition to remote automated physical monitoring, developments at the end of the 1970s allowed simultaneous real-time measurements of fish breathing response to water quality changes. These advances complement the complex in-stream surveys typically carried out to evaluate the environmental quality of a system. Automated biosensing units having remote capabilities are designed to aid in the evaluation of subtle water quality changes contributing to undesirable conditions in a drainage basin. Using microprocessor-based monitors to measure fish breathing rates, the biosensing units are interfaced to a U.S. National Aeronautics and Space Administration (NASA) remote data collection platform for National Oceanic and Atmospheric Administration (NOAA) GOES satellite retrieval and transmission of data. Simultaneously, multiparameter physical information is collected from site-specific locations and recovered in a similar manner. Real-time biological and physical data received at a data processing center are readily available for interpretation by resource managers. Management schemes incorporating real-time monitoring networks into ongoing programs to simultaneously retrieve biological and physical data by satellite, radio and telephone cable give added advantages in maintaining water quality for multipurpose needs.

  14. An automated tool for a daily harmful algal bloom monitoring using MODIS imagery downscaled to 250 meters spatial resolution

    NASA Astrophysics Data System (ADS)

    El Alem, A.

    2016-12-01

    Harmful algal blooms (HABs) cause negative impacts on other organisms by producing natural toxins, mechanically damaging other micro-organisms, or simply degrading water quality. Contaminated waters could expose billions of people to serious intoxication problems. Traditionally, HAB monitoring is performed with standard methods limited to a restricted network of sampling points. However, the rapid evolution of HABs makes it difficult to monitor their variation in time and space, threatening public safety. Daily monitoring is therefore the best way to control and mitigate their harmful effect on the population, particularly for sources feeding cities. Recently, an approach was developed for estimating chlorophyll-a (Chl-a) concentration, as a proxy of HAB presence, in inland waters based on MODIS imagery downscaled to 250 meters spatial resolution. Statistical evaluation of the developed approach highlighted the accuracy of the Chl-a estimates, with R2 = 0.98, a relative RMSE of 15%, a relative bias of -2%, and a relative NASH of 0.95. The temporal resolution of the MODIS sensor thus allows daily monitoring of HAB spatial distribution for inland waters with more than 2.25 km2 of surface area. Groupe-Hemisphere, a company specialized in environmental and sustainable planning in Quebec, has shown great interest in the developed approach. Given the complexity of the preprocessing (geometric and atmospheric corrections as well as downscaling of the spatial resolution) and processing (Chl-a estimation) of images, a standalone application was developed in the MATLAB GUI environment. The application automates all preprocessing and processing steps. The outputs produced by the application for end users, many of whom may be decision makers or policy makers in the public and private sectors, allow near-real-time monitoring of water quality for more efficient management.
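
    A hypothetical sketch of the kind of per-pixel processing such an application automates (the band-ratio model and every coefficient below are invented; the abstract does not give the actual Chl-a algorithm): apply an empirical regression to each pixel, then flag pixels above a bloom-alert threshold.

```python
# Toy empirical Chl-a model and bloom-alert mask. The model form, the
# coefficients a and b, and the alert threshold are all invented.

def chl_a(red, nir, a=10.0, b=1.5):
    """Toy empirical model: Chl-a [ug/L] from a NIR/red reflectance ratio."""
    return a * (nir / red) ** b

def bloom_mask(red_band, nir_band, alert=20.0):
    """Boolean flags marking pixels whose estimated Chl-a exceeds `alert`."""
    return [chl_a(r, n) > alert for r, n in zip(red_band, nir_band)]
```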

  15. Robo-Lector - a novel platform for automated high-throughput cultivations in microtiter plates with high information content.

    PubMed

    Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen

    2009-08-01

    In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only +/- 7%. The third method, 'biomass-specific replication', made it possible to generate equal initial biomass concentrations in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescent protein formation. Based on the non-invasive online monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this versatile platform can accelerate and intensify research and development in the fields of systems biology, modelling and bioprocess optimization.

  16. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    USGS Publications Warehouse

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.
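
    The photogrammetric length conversion and the time-stamp-based daily counting can be sketched as follows (function names, the reference-object scaling approach, and the time-stamp format are illustrative assumptions, not the study's exact software):

```python
# Convert a pixel measurement to real units via a reference object of known
# size in the same image plane, and tally passages per calendar date from
# image time stamps formatted 'YYYY-MM-DD HH:MM'.
from collections import Counter

def pixels_to_mm(length_px, ref_px, ref_mm):
    """Scale a pixel measurement by a reference of known size."""
    return length_px * (ref_mm / ref_px)

def daily_counts(timestamps):
    """Total passages per calendar date from image time stamps."""
    return Counter(ts[:10] for ts in timestamps)
```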

  17. E-healthcare at an experimental welfare techno house in Japan.

    PubMed

    Tamura, Toshiyo; Kawarada, Atsushi; Nambu, Masayuki; Tsukada, Akira; Sasaki, Kazuo; Yamakoshi, Ken-Ichi

    2007-01-01

    An automated monitoring system for home health care has been designed for an experimental house in Japan called the Welfare Techno House (WTH). Automated electrocardiogram (ECG) measurements can be taken while in bed, in the bathtub, and on the toilet, without the subject's awareness, and without using body surface electrodes. In order to evaluate this automated health monitoring system, overnight measurements were performed to monitor health status during the daily lives of both young and elderly subjects.

  18. 40 CFR Table A-1 to Subpart A of... - Summary of Applicable Requirements for Reference and Equivalent Methods for Air Monitoring of...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Methods for Air Monitoring of Criteria Pollutants Pollutant Ref. or equivalent Manual or automated Applicable part 50 appendix Applicable subparts of part 53 A B C D E F SO2 Reference Manual A Equivalent Manual ✓ ✓ Automated ✓ ✓ ✓ CO Reference Automated C ✓ ✓ Equivalent Manual ✓ ✓ Automated ✓ ✓ ✓ O3...

  19. Validation of an automated mite counter for Dermanyssus gallinae in experimental laying hen cages.

    PubMed

    Mul, Monique F; van Riel, Johan W; Meerburg, Bastiaan G; Dicke, Marcel; George, David R; Groot Koerkamp, Peter W G

    2015-08-01

    For integrated pest management (IPM) programs to be maximally effective, monitoring of the growth and decline of the pest populations is essential. Here, we present the validation results of a new automated monitoring device for the poultry red mite (Dermanyssus gallinae), a serious pest in laying hen facilities world-wide. This monitoring device (called an "automated mite counter") was validated in experimental laying hen cages with live birds and a growing population of D. gallinae. This validation study resulted in 17 data points of 'number of mites counted' by the automated mite counter and the 'number of mites present' in the experimental laying hen cages. The study demonstrated that the automated mite counter was able to track the D. gallinae population effectively. A wider evaluation showed that this automated mite counter can become a useful tool in IPM of D. gallinae in laying hen facilities.

  20. Automated Cryocooler Monitor and Control System Software

    NASA Technical Reports Server (NTRS)

    Britchcliffe, Michael J.; Conroy, Bruce L.; Anderson, Paul E.; Wilson, Ahmad

    2011-01-01

    This software is used in an automated cryogenic control system developed to monitor and control the operation of small-scale cryocoolers. The system was designed to automate the cryogenically cooled low-noise amplifier system described in "Automated Cryocooler Monitor and Control System" (NPO-47246), NASA Tech Briefs, Vol. 35, No. 5 (May 2011), page 7a. The software contains algorithms necessary to convert non-linear output voltages from the cryogenic diode-type thermometers and vacuum pressure and helium pressure sensors, to temperature and pressure units. The control function algorithms use the monitor data to control the cooler power, vacuum solenoid, vacuum pump, and electrical warm-up heaters. The control algorithms are based on a rule-based system that activates the required device based on the operating mode. The external interface is Web-based. It acts as a Web server, providing pages for monitor, control, and configuration. No client software from the external user is required.
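
    Two of the described functions can be sketched generically: piecewise-linear interpolation of a sensor calibration table (voltage to kelvin), and a rule that activates a device based on the operating mode. The calibration values and the mode rule below are invented for illustration, not the actual system's tables or rules.

```python
# Piecewise-linear conversion of a nonlinear diode-thermometer voltage to
# temperature, plus a simple mode-based heater rule. Values are illustrative.

def diode_temp(volts, table):
    """table: (voltage, kelvin) pairs with voltage strictly increasing.
    Clamps outside the table; interpolates linearly inside it."""
    if volts <= table[0][0]:
        return table[0][1]
    for (v0, t0), (v1, t1) in zip(table, table[1:]):
        if volts <= v1:
            return t0 + (t1 - t0) * (volts - v0) / (v1 - v0)
    return table[-1][1]

def heater_on(mode, temp_k, warm_limit=290.0):
    """Rule sketch: run warm-up heaters only in 'warmup' mode below the limit."""
    return mode == "warmup" and temp_k < warm_limit
```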

  1. AUTOMATED PRODUCTION OF SEAGRASS MAPS FROM SIDESCAN SONAR IMAGERY: ACCURACY, VARIABILITY AND PATCH RESOLUTION

    EPA Science Inventory

    Maps of seagrass beds are useful for monitoring estuarine condition, managing habitats, and modeling estuarine processes. We recently developed inexpensive methods for collecting and classifying sidescan sonar (SSS) imagery for seagrass presence in turbid waters as shallow as 1-...

  2. Automated live cell screening system based on a 24-well-microplate with integrated micro fluidics.

    PubMed

    Lob, V; Geisler, T; Brischwein, M; Uhl, R; Wolf, B

    2007-11-01

    In research, pharmacological drug screening and medical diagnostics, the trend towards functional assays using living cells persists. Research groups working with living cells are confronted with the problem that common endpoint measurement methods are not able to capture dynamic changes. By considering time as a further dimension, the dynamic and networked molecular processes of cells in culture can be monitored. These processes can be investigated by measuring several extracellular parameters. This paper describes a high-content system that provides real-time monitoring data of cell parameters (metabolic and morphological alterations), e.g., upon treatment with drug compounds. Accessible are acidification rates, oxygen consumption and changes in adhesion forces within 24 cell cultures in parallel. Addressing the rising interest in biomedical and pharmacological high-content screening assays, a concept has been developed that integrates multi-parametric sensor readout, automated imaging and probe handling into a single embedded platform. A life-maintenance system keeps important environmental parameters (gas, humidity, sterility, temperature) constant.

  3. Applying Standard Interfaces to a Process-Control Language

    NASA Technical Reports Server (NTRS)

    Berthold, Richard T.

    2005-01-01

    A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.

  4. Analysis of Trinity Power Metrics for Automated Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalenko, Ashley Christine

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and future work planned.

  5. Automated acquisition system for routine, noninvasive monitoring of physiological data.

    PubMed

    Ogawa, M; Tamura, T; Togawa, T

    1998-01-01

    A fully automated, noninvasive data-acquisition system was developed to permit long-term measurement of physiological functions at home, without disturbing subjects' normal routines. The system consists of unconstrained monitors built into furnishings and structures in a home environment. An electrocardiographic (ECG) monitor in the bathtub measures heart function during bathing, a temperature monitor in the bed measures body temperature, and a weight monitor built into the toilet serves as a scale to record weight. All three monitors are connected to one computer and function with data-acquisition programs and a data format rule. The unconstrained physiological parameter monitors and fully automated measurement procedures collect data noninvasively without the subject's awareness. The system was tested for 1 week by a healthy male subject, aged 28, in laboratory-based facilities.

  6. Incidence rates of hospital-acquired urinary tract and bloodstream infections generated by automated compilation of electronically available healthcare data.

    PubMed

    Redder, J D; Leth, R A; Møller, J K

    2015-11-01

    Monitoring of hospital-acquired infection (HAI) by automated compilation of registry data may address the disadvantages of laborious, costly, and potentially subjective and often random sampling of data by manual surveillance. The aim was to evaluate a system for automated monitoring of hospital-acquired urinary tract (HA-UTI) and bloodstream infections (HA-BSI) and to report incidence rates over a five-year period in a Danish hospital trust. Based primarily on electronically available data relating to microbiology results and antibiotic prescriptions, the automated monitoring of HA-UTIs and HA-BSIs was validated against data from six previous point-prevalence surveys (PPSs) from 2010 to 2013 and against a manual assessment (HA-UTI only) of one department of internal medicine from January 2010. Incidence rates (infections per 1000 bed-days) from 2010 to 2014 were calculated. Compared with the PPSs, the automated monitoring showed a sensitivity of 88% in detecting UTI in general, 78% in detecting HA-UTI, and 100% in detecting BSI in general. The monthly incidence rates varied between 4.14 and 6.61 per 1000 bed-days for HA-UTI and between 0.09 and 1.25 per 1000 bed-days for HA-BSI. Replacing PPSs with automated monitoring of HAIs may provide better, more objective data and constitute a promising foundation for individual patient risk analyses and epidemiological studies. Automated monitoring may be universally applicable in hospitals with electronic databases comprising microbiological findings, admission data, and antibiotic prescriptions. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
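
    The rates reported above (infections per 1000 bed-days) follow the standard incidence calculation; a minimal sketch with hypothetical figures:

```python
def incidence_rate_per_1000_bed_days(n_infections, bed_days):
    """Incidence rate of hospital-acquired infections, expressed per 1000 bed-days."""
    if bed_days <= 0:
        raise ValueError("bed_days must be positive")
    return 1000.0 * n_infections / bed_days

# Hypothetical month: 12 hospital-acquired UTIs over 2200 bed-days
rate = incidence_rate_per_1000_bed_days(12, 2200)
print(round(rate, 2))  # 5.45
```

    The hypothetical figures land inside the HA-UTI range the study reports (4.14 to 6.61 per 1000 bed-days).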

  7. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    PubMed

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: Automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
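
    The authors' software is not published here, but the core idea of a discrete wavelet decomposition separating small phantom features from a smooth background can be illustrated with a single level of a 2-D Haar transform (a generic sketch, not the authors' implementation):

```python
import numpy as np

def haar_level1(img):
    """One level of a 2-D Haar wavelet transform on an even-sized image:
    returns the approximation plus horizontal/vertical/diagonal detail subbands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 4.0   # approximation (low-low): smooth background
    lh = (a + b - c - d) / 4.0   # horizontal detail
    hl = (a - b + c - d) / 4.0   # vertical detail
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return ll, lh, hl, hh

# A small bright spot (a stand-in for a microcalcification) on a flat
# background vanishes from the approximation but survives in the detail bands.
img = np.zeros((8, 8)); img[3, 3] = 1.0
ll, lh, hl, hh = haar_level1(img)
print(float(np.abs(hh).max()) > 0)  # True
```

    Thresholding the detail subbands at successive decomposition levels is one plausible way such software could score fibers, microcalcifications, and masses by spatial scale.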

  8. NASA JSC water monitor system: City of Houston field demonstration

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.; Jeffers, E. L.; Fricks, D. H.

    1979-01-01

    A water quality monitoring system with on-line and real time operation similar to the function in a spacecraft was investigated. A system with the capability to determine conformance to future high effluent quality standards and to increase the potential for reclamation and reuse of water was designed. Although all system capabilities were not verified in the initial field trial, fully automated operation over a sustained period with only routine manual adjustments was accomplished. Two major points were demonstrated: (1) the water monitor system has great potential in water monitoring and/or process control applications; and (2) the water monitor system represents a vast improvement over conventional (grab sample) water monitoring techniques.

  9. PILOT: An intelligent distributed operations support system

    NASA Technical Reports Server (NTRS)

    Rasmussen, Arthur N.

    1993-01-01

    The Real-Time Data System (RTDS) project is exploring the application of advanced technologies to the real-time flight operations environment of the Mission Control Centers at NASA's Johnson Space Center. The system, based on a network of engineering workstations, provides services such as delivery of real time telemetry data to flight control applications. To automate the operation of this complex distributed environment, a facility called PILOT (Process Integrity Level and Operation Tracker) is being developed. PILOT comprises a set of distributed agents cooperating with a rule-based expert system; together they monitor process operation and data flows throughout the RTDS network. The goal of PILOT is to provide unattended management and automated operation under user control.

  10. Near-infrared spectroscopy monitoring and control of the fluidized bed granulation and coating processes-A review.

    PubMed

    Liu, Ronghua; Li, Lian; Yin, Wenping; Xu, Dongbo; Zang, Hengchang

    2017-09-15

    The fluidized bed granulation and pellet coating technologies are widely used in the pharmaceutical industry, because particles made in a fluidized bed have good flowability and compressibility, and the coating thickness of pellets is homogeneous. With the popularization of process analytical technology (PAT), real-time analysis of critical quality attributes (CQAs) is receiving more attention. Near-infrared (NIR) spectroscopy, as a PAT tool, can realize real-time monitoring and control during the granulating and coating processes, which can optimize the manufacturing processes. This article reviews the application of NIR spectroscopy to CQA monitoring (moisture content, particle size, and tablet/pellet thickness) during fluidized bed granulation and coating processes. Through this review, we would like to provide references for realizing automated control and intelligent production in fluidized bed granulation and pellet coating in the pharmaceutical industry. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Job monitoring on DIRAC for Belle II distributed computing

    NASA Astrophysics Data System (ADS)

    Kato, Yuji; Hayasaka, Kiyoshi; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo

    2015-12-01

    We developed a monitoring system for Belle II distributed computing, which consists of active and passive methods. In this paper we describe the passive monitoring system, where information stored in the DIRAC database is processed and visualized. We divide the DIRAC workload management flow into steps and store characteristic variables which indicate issues. These variables are chosen carefully based on our experiences, then visualized. As a result, we are able to effectively detect issues. Finally, we discuss the future development for automating log analysis, notification of issues, and disabling problematic sites.

  12. Automating security monitoring and analysis for Space Station Freedom's electric power system

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han

    1990-01-01

    Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A new approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.

  13. Automating security monitoring and analysis for Space Station Freedom's electric power system

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han

    1990-01-01

    Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A novel approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.

  14. Apparatus and method for automated monitoring of airborne bacterial spores

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian (Inventor)

    2009-01-01

    An apparatus and method for automated monitoring of airborne bacterial spores. The apparatus is provided with an air sampler, a surface for capturing airborne spores, a thermal lysis unit to release DPA from bacterial spores, a source of lanthanide ions, and a spectrometer for excitation and detection of the characteristic fluorescence of the aromatic molecules in bacterial spores complexed with lanthanide ions. In accordance with the method, computer-programmed steps automate the apparatus for the monitoring of airborne bacterial spores.

  15. Collaboration, Automation, and Information Management at Hanford High Level Radioactive Waste (HLW) Tank Farms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurah, Mirwaise Y.; Roberts, Mark A.

    Washington River Protection Solutions (WRPS), operator of the High-Level Radioactive Waste (HLW) Tank Farms at the Hanford Site, is making a more than 20-year leap in technology, replacing systems that were monitored with clipboards and obsolete computers, and solving major operations and maintenance hurdles in process automation and information management. While WRPS is fully compliant with procedures and regulations, the current systems are not integrated and do not share data efficiently, hampering how information is obtained and managed.

  16. Controlling Wafer Contamination Using Automated On-Line Metrology during Wet Chemical Cleaning

    NASA Astrophysics Data System (ADS)

    Wang, Jason; Kingston, Skip; Han, Ye; Saini, Harmesh; McDonald, Robert; Mui, Rudy

    2003-09-01

    The capabilities of a trace contamination analyzer are discussed and demonstrated. This analytical tool utilizes an electrospray, time-of-flight mass spectrometer (ES-TOF-MS) for fully automated on-line monitoring of wafer cleaning solutions. The analyzer provides rich information on metallic, anionic, cationic, elemental, and organic species through its ability to provide harsh (elemental) and soft (molecular) ionization under both positive and negative modes. It is designed to meet semiconductor process control and yield management needs for the increasingly complex new chemistries present in wafer fabrication.

  17. Can the black box be cracked? The augmentation of microbial ecology by high-resolution, automated sensing technologies.

    PubMed

    Shade, Ashley; Carey, Cayelan C; Kara, Emily; Bertilsson, Stefan; McMahon, Katherine D; Smith, Matthew C

    2009-08-01

    Automated sensing technologies, 'ASTs,' are tools that can monitor environmental or microbial-related variables at increasingly high temporal resolution. Microbial ecologists are poised to use AST data to couple microbial structure, function and associated environmental observations on temporal scales pertinent to microbial processes. In the context of aquatic microbiology, we discuss three applications of ASTs: windows on the microbial world, adaptive sampling and adaptive management. We challenge microbial ecologists to push AST potential in helping to reveal relationships between microbial structure and function.

  18. Recycling isotachophoresis - A novel approach to preparative protein fractionation

    NASA Technical Reports Server (NTRS)

    Sloan, Jeffrey E.; Thormann, Wolfgang; Bier, Milan; Twitty, Garland E.; Mosher, Richard A.

    1986-01-01

    The concept of automated recycling isotachophoresis (RITP) as a purification methodology is discussed, and the apparatus is described. In the present automated RITP, the computer system tracks the attainment of steady state using arrays of universal and specific sensors, monitors the position of the front edge of the zone structure, activates the counterflow if the leading boundary passes a specified position along the separation axis, or changes the applied current accordingly. The system demonstrates high resolution, in addition to higher processing rates than are possible in zone electrophoresis or isoelectric focusing.

  19. Visual Sensing for Urban Flood Monitoring

    PubMed Central

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    With increasing climatic extremes, the frequency and severity of urban flood events have intensified worldwide. In this study, image-based automated monitoring of flood formation and analyses of water level fluctuation were proposed as value-added intelligent sensing applications to turn a passive monitoring camera into a visual sensor. Combined with the proposed visual sensing method, traditional hydrological monitoring cameras have the ability to sense and analyze the local situation of flood events. This can solve the current problem that image-based flood monitoring relies heavily on continuous manned monitoring. Conventional sensing networks can only offer one-dimensional physical parameters measured by gauge sensors, whereas visual sensors can acquire dynamic image information of monitored sites and provide disaster prevention agencies with actual field information for decision-making to relieve flood hazards. The visual sensing method established in this study provides spatiotemporal information that can be used for automated remote analysis for monitoring urban floods. This paper focuses on the determination of flood formation based on image-processing techniques. The experimental results suggest that the visual sensing approach may be a reliable way of determining water fluctuation and measuring its elevation and flood intrusion with respect to real-world coordinates. The performance of the proposed method has been confirmed; it has the capability to monitor and analyze the flood status, and therefore it can serve as an active flood warning system. PMID:26287201
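
    One hedged way to picture the waterline determination described above: reduce the camera image to a row-intensity profile, find the first "water-like" row, and map it to an elevation via a surveyed reference. The threshold, reference row, and metres-per-pixel scale below are illustrative assumptions, not the authors' calibration.

```python
import numpy as np

def waterline_row(gray, water_threshold=0.5):
    """Estimate the image row of the water surface: scanning from the top,
    return the first row whose mean intensity falls below the threshold
    (assumes water appears darker than the background above it)."""
    row_means = gray.mean(axis=1)
    below = np.nonzero(row_means < water_threshold)[0]
    return int(below[0]) if below.size else None

def level_from_row(row, ref_row, metres_per_pixel):
    """Convert a pixel-row offset from a surveyed reference row into metres."""
    return (ref_row - row) * metres_per_pixel

# Synthetic frame: bright background above row 12, dark water from row 12 down
frame = np.ones((20, 30)); frame[12:, :] = 0.1
r = waterline_row(frame)
print(r, round(level_from_row(r, ref_row=18, metres_per_pixel=0.05), 2))  # 12 0.3
```

    A real system would of course need lens calibration and robustness to lighting, but the row-profile idea captures how a single camera yields a water-level time series.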

  20. Digital Image Correlation for Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Palaviccini, Miguel; Turner, Dan; Herzberg, Michael

    2016-01-01

    Evaluating the health of a mechanism requires more than just a binary evaluation of whether an operation was completed. It requires analyzing more comprehensive, full-field data. Health monitoring is a process of non-destructively identifying characteristics that indicate the fitness of an engineered component. In order to monitor unit health in a production setting, an automated test system must be created to capture the motion of mechanism parts in a real-time and non-intrusive manner. One way to accomplish this is by using high-speed video and Digital Image Correlation (DIC). In this approach, individual frames of the video are analyzed to track the motion of mechanism components. The derived performance metrics allow for state-of-health monitoring and improved fidelity of mechanism modeling. The results are in-situ state-of-health identification and performance prediction. This paper introduces basic concepts of this test method, and discusses two main themes: the use of laser marking to add fiducial patterns to mechanism components, and new software developed to track objects with complex shapes, even as they move behind obstructions. Finally, the implementation of these tests into an automated tester is discussed.
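
    The paper tracks laser-marked fiducial patterns across video frames; a generic way to do this (not necessarily the authors' DIC implementation) is normalized cross-correlation of a small template against each frame, with the shift in best-match position giving the motion vector. The synthetic frames and 5x5 template below are illustrative.

```python
import numpy as np

def track_template(frame, template):
    """Locate a fiducial template in a frame by normalized cross-correlation;
    returns the (row, col) of the best-matching window's top-left corner."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t * t).sum())
    best, best_rc = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            w = frame[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = tn * np.sqrt((wz * wz).sum())
            if denom == 0:
                continue  # flat window: correlation undefined
            score = (wz * t).sum() / denom
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

# Tracking the fiducial in two frames yields the displacement between them
rng = np.random.default_rng(0)
tpl = rng.random((5, 5))
f1 = np.zeros((20, 20)); f1[3:8, 4:9] = tpl
f2 = np.zeros((20, 20)); f2[6:11, 7:12] = tpl
p1, p2 = track_template(f1, tpl), track_template(f2, tpl)
print((p2[0] - p1[0], p2[1] - p1[1]))  # (3, 3)
```

    Production DIC codes use subpixel interpolation and FFT-based correlation for speed; the brute-force search here only conveys the matching principle.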

  1. Permanent Monitoring of the Reference Point of the 20m Radio Telescope Wettzell

    NASA Technical Reports Server (NTRS)

    Neidhardt, Alexander; Losler, Michael; Eschelbach, Cornelia; Schenk, Andreas

    2010-01-01

    To achieve the goals of the VLBI2010 project and the Global Geodetic Observing System (GGOS), automated monitoring of the reference points of the various geodetic space techniques, including Very Long Baseline Interferometry (VLBI), is desirable. The resulting permanent monitoring of the local-tie vectors at co-location stations is essential to obtain the sub-millimeter level in the combinations. For this reason a monitoring system was installed at the Geodetic Observatory Wettzell by the Geodetic Institute of the University of Karlsruhe (GIK) to observe the 20m VLBI radio telescope from May to August 2009. Specially developed software from GIK collected data from automated total station measurements, meteorological sensors, and sensors in the telescope monument (e.g., Invar cable data). A real-time visualization directly offered a live view of the measurements during the regular observation operations. Additional scintillometer measurements allowed refraction corrections during the post-processing. This project is one of the first feasibility studies aimed at determining significant deformations of the VLBI antenna due to, for instance, changes in temperature.

  2. Quantifying biodiversity using digital cameras and automated image analysis.

    NASA Astrophysics Data System (ADS)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect: the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and enabling automatic deletion of images generated by erroneous triggering (e.g. cloud movements). This is the first step to a hierarchical image processing framework, where situation subclasses such as birds or climatic conditions can be fed into more appropriate automated or semi-automated data mining software.

  3. Assessment of error rates in acoustic monitoring with the R package monitoR

    USGS Publications Warehouse

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    Detecting population-scale reactions to climate change and land-use change may require monitoring many sites for many years, a process that is suited for an automated system. We developed and tested monitoR, an R package for long-term, multi-taxa acoustic monitoring programs. We tested monitoR with two northeastern songbird species: black-throated green warbler (Setophaga virens) and ovenbird (Seiurus aurocapilla). We compared detection results from monitoR in 52 10-minute surveys recorded at 10 sites in Vermont and New York, USA to a subset of songs identified by a human that were of a single song type and had visually identifiable spectrograms (e.g. a signal:noise ratio of at least 10 dB: 166 out of 439 total songs for black-throated green warbler, 502 out of 990 total songs for ovenbird). monitoR’s automated detection process uses a ‘score cutoff’, which is the minimum match needed for an unknown event to be considered a detection and results in a true positive, true negative, false positive or false negative detection. At the chosen score cut-offs, monitoR correctly identified presence for black-throated green warbler and ovenbird in 64% and 72% of the 52 surveys using binary point matching, respectively, and 73% and 72% of the 52 surveys using spectrogram cross-correlation, respectively. Of individual songs, 72% of black-throated green warbler songs and 62% of ovenbird songs were identified by binary point matching. Spectrogram cross-correlation identified 83% of black-throated green warbler songs and 66% of ovenbird songs. False positive rates were  for song event detection.
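
    The score-cutoff logic described above can be sketched as follows: each candidate event's match score is compared against the cutoff and, together with ground truth, classified as a true/false positive/negative. The scores and labels below are hypothetical, not taken from the study.

```python
def detection_outcomes(scores, truth, score_cutoff):
    """Classify template-matching scores against a score cutoff, as in
    monitoR: each event becomes a true/false positive/negative detection."""
    tp = fp = fn = tn = 0
    for score, is_song in zip(scores, truth):
        detected = score >= score_cutoff
        if detected and is_song:
            tp += 1
        elif detected and not is_song:
            fp += 1
        elif not detected and is_song:
            fn += 1
        else:
            tn += 1
    return {"tp": tp, "fp": fp, "fn": fn, "tn": tn}

# Hypothetical cross-correlation scores for six candidate song events
scores = [0.81, 0.42, 0.66, 0.30, 0.74, 0.55]
truth  = [True, False, False, True, True, False]
print(detection_outcomes(scores, truth, score_cutoff=0.6))
# {'tp': 2, 'fp': 1, 'fn': 1, 'tn': 2}
```

    Sweeping the cutoff over a range of values and recomputing these counts is how the trade-off between missed songs and false positives is tuned for each species' template.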

  4. Design of smart neonatal health monitoring system using SMCC

    PubMed Central

    Mukherjee, Anwesha; Bhakta, Ishita

    2016-01-01

    Automated health monitoring and alert system development is a demanding research area today. Most of the currently available monitoring and controlling medical devices are wired, which limits the freedom of the working environment. A wireless sensor network (WSN) is a better alternative in such an environment. The neonatal intensive care unit is used to take care of sick and premature neonates. Hypothermia is an independent risk factor for neonatal mortality and morbidity; to prevent it, an automated monitoring system is required. In this Letter, an automated neonatal health monitoring system is designed using sensor mobile cloud computing (SMCC). SMCC is based on WSN and MCC. In the authors' system, a temperature sensor, an acceleration sensor and a heart rate measurement sensor are used to monitor body temperature, acceleration due to body movement and heart rate of neonates. The sensor data are stored in the cloud. Health personnel continuously monitor and access these data through a mobile device using an Android application for neonatal monitoring. When an abnormal situation arises, an alert is generated on the mobile device of the health personnel. By alerting health professionals through such an automated system, early care is provided to the affected babies and the probability of recovery is increased. PMID:28261491

  5. Design of smart neonatal health monitoring system using SMCC.

    PubMed

    De, Debashis; Mukherjee, Anwesha; Sau, Arkaprabha; Bhakta, Ishita

    2017-02-01

    Automated health monitoring and alert system development is a demanding research area today. Most of the currently available monitoring and controlling medical devices are wired, which limits the freedom of the working environment. A wireless sensor network (WSN) is a better alternative in such an environment. The neonatal intensive care unit is used to take care of sick and premature neonates. Hypothermia is an independent risk factor for neonatal mortality and morbidity; to prevent it, an automated monitoring system is required. In this Letter, an automated neonatal health monitoring system is designed using sensor mobile cloud computing (SMCC). SMCC is based on WSN and MCC. In the authors' system, a temperature sensor, an acceleration sensor and a heart rate measurement sensor are used to monitor body temperature, acceleration due to body movement and heart rate of neonates. The sensor data are stored in the cloud. Health personnel continuously monitor and access these data through a mobile device using an Android application for neonatal monitoring. When an abnormal situation arises, an alert is generated on the mobile device of the health personnel. By alerting health professionals through such an automated system, early care is provided to the affected babies and the probability of recovery is increased.

  6. Psychophysiological Control of a Cognitive Task Using Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Freeman, Frederick; Pope, Alan T. (Technical Monitor)

    2001-01-01

    The major focus of the present proposal was to examine psychophysiological variables related to hazardous states of awareness induced by monitoring automated systems. With the increased use of automation in today's work environment, people's roles in the workplace are being redefined from active participant to passive monitor. Although the introduction of automated systems has a number of benefits, there are also a number of disadvantages regarding worker performance. Byrne and Parasuraman have argued for the use of psychophysiological measures in the development and implementation of adaptive automation. While both performance-based and model-based adaptive automation have been studied, the use of psychophysiological measures, especially EEG, offers the advantage of real-time evaluation of the state of the subject. The current study used the closed-loop system, developed at NASA Langley Research Center, to control the state of awareness of subjects while they performed a cognitive vigilance task. Previous research in our laboratory, supported by NASA, has demonstrated that, in an adaptive-automation, closed-loop environment, subjects perform a tracking task better under a negative than under a positive feedback condition. In addition, this condition produces less subjective workload and larger P300 event-related potentials to auditory stimuli presented in a concurrent oddball task. We have also recently shown that the closed-loop system used to control the level of automation in a tracking task can also be used to control the event rate of stimuli in a vigilance monitoring task. By changing the event rate based on the subject's index of arousal, we have been able to produce improved monitoring relative to various control groups. We have demonstrated in our initial closed-loop experiments with the vigilance paradigm that using a negative feedback contingency (i.e., increasing event rates when the EEG index is low and decreasing event rates when the EEG index is high) results in a marked decrease of the vigilance decrement over a 40-minute session. This effect is in direct contrast to the performance of a positive feedback group, as well as a number of other control groups, which demonstrated the typical vigilance decrement. Interestingly, however, the negative feedback group performed at virtually the same level as a yoked control group. The yoked control group received the same order of changes in event rate that were generated by the negative feedback subjects using the closed-loop system. Thus it would appear to be possible to optimize vigilance performance by controlling the stimuli which subjects are asked to process.

  7. Automated Power Systems Management (APSM)

    NASA Technical Reports Server (NTRS)

    Bridgeforth, A. O.

    1981-01-01

    A breadboard power system incorporating autonomous functions of monitoring, fault detection and recovery, command and control was developed, tested and evaluated to demonstrate technology feasibility. Autonomous functions including switching of redundant power processing elements, individual load fault removal, and battery charge/discharge control were implemented by means of a distributed microcomputer system within the power subsystem. Three local microcomputers provide the monitoring, control and command function interfaces between the central power subsystem microcomputer and the power sources, power processing and power distribution elements. The central microcomputer is the interface between the local microcomputers and the spacecraft central computer or ground test equipment.

  8. Automatic Derivation of Forest Cover and Forest Cover Change Using Dense Multi-Temporal Time Series Data from Landsat and SPOT 5 Take5

    NASA Astrophysics Data System (ADS)

    Storch, Cornelia; Wagner, Thomas; Ramminger, Gernot; Pape, Marlon; Ott, Hannes; Hausler, Thomas; Gomez, Sharon

    2016-08-01

    The paper describes the development of methods for an automated processing chain for the classification of forest cover and forest cover change based on high-resolution, multi-temporal time series Landsat and SPOT 5 Take5 data, with a focus on the dry forest ecosystems of Africa. The method was developed within the European Space Agency (ESA) funded Global Monitoring for Environment and Security Service Element for Forest Monitoring (GSE FM) project on dry forest areas; the demonstration site selected was in Malawi. The methods are based on the principles of a robust but still flexible monitoring system, to cope with the most complex Earth Observation (EO) data scenarios, varying in terms of data quality, source, accuracy, information content, completeness, etc. The method allows automated tracking of change dates and data gap filling, and takes into account the phenology and seasonality of tree species with respect to leaf fall and heavy cloud cover during the rainy season.

  9. "SmartMonitor"--an intelligent security system for the protection of individuals and small properties with the possibility of home automation.

    PubMed

    Frejlichowski, Dariusz; Gościewska, Katarzyna; Forczmański, Paweł; Hofman, Radosław

    2014-06-05

    "SmartMonitor" is an intelligent security system based on image analysis that combines the advantages of alarm, video surveillance and home automation systems. The system is a complete solution that automatically reacts to every learned situation in a pre-specified way and has various applications, e.g., home and surrounding protection against unauthorized intrusion, crime detection or supervision over ill persons. The software is based on well-known and proven methods and algorithms for visual content analysis (VCA) that were appropriately modified and adopted to fit specific needs and create a video processing model which consists of foreground region detection and localization, candidate object extraction, object classification and tracking. In this paper, the "SmartMonitor" system is presented along with its architecture, employed methods and algorithms, and object analysis approach. Some experimental results on system operation are also provided. In the paper, focus is put on one of the aforementioned functionalities of the system, namely supervision over ill persons.

  10. Engineering the smart factory

    NASA Astrophysics Data System (ADS)

    Harrison, Robert; Vera, Daniel; Ahmad, Bilal

    2016-10-01

    The fourth industrial revolution promises to create what has been called the smart factory. The vision is that within such modular structured smart factories, cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralised decisions. This paper provides a view of this initiative from an automation systems perspective. In this context it considers how future automation systems might be effectively configured and supported through their lifecycles and how integration, application modelling, visualisation and reuse of such systems might be best achieved. The paper briefly describes limitations in current engineering methods, and new emerging approaches including the cyber physical systems (CPS) engineering tools being developed by the automation systems group (ASG) at Warwick Manufacturing Group, University of Warwick, UK.

  11. [Standardization of operation monitoring and control of the clinical laboratory automation system].

    PubMed

    Tao, R

    2000-10-01

    Laboratory automation systems appeared in the 1980s and have been introduced to many clinical laboratories since the early 1990s. Meanwhile, it was found that differences in specimen tube dimensions, specimen identification formats, specimen carrier transportation equipment architecture, and electromechanical interfaces between the analyzers and the automation systems were preventing the systems from being introduced more widely. To standardize the different interfaces and reduce the cost of laboratory automation, NCCLS and JCCLS started establishing standards for laboratory automation in 1996 and 1997, respectively. Operation monitoring and control of the laboratory automation system have been included in their activities, resulting in the publication of an NCCLS proposed standard in 1999.

  12. Survey Available Computer Software for Automated Production Planning and Inventory Control, and Software and Hardware for Data Logging and Monitoring Shop Floor Activities

    DTIC Science & Technology

    1993-08-01

    pricing and sales, order processing, and purchasing. The class of manufacturing planning functions includes aggregate production planning, materials... level. Depending on the application, each control level will have a number of functions associated with it. For instance, order processing, purchasing... include accounting, sales forecasting, product costing, pricing and sales, order processing, and purchasing. The class of manufacturing planning functions

  13. Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules

    NASA Astrophysics Data System (ADS)

    Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix

    2009-02-01

    Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone will be reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far field and the near field intensity distributions. Feedback from these cameras, processed by a powerful and efficient image processing algorithm, controls a five-axis precision motion system to optimize the fast axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the algorithm of the adjustment as well as experimental results. A critical discussion of the results will close the talk.
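The record does not disclose the authors' adjustment algorithm; a camera-feedback alignment loop of this kind is often sketched as a coordinate-wise hill climb over the motion axes. In the illustration below, the `quality` callable is a hypothetical stand-in for the image-processing metric derived from the far-field and near-field cameras:

```python
# Illustrative sketch (not the authors' algorithm): align a lens by greedily
# stepping each motion axis while a camera-derived quality metric improves,
# shrinking the step size once no axis move helps. `quality` is a
# hypothetical stand-in for the far/near-field image-processing metric.

def align(quality, pos, axes=(0, 1, 2, 3, 4), step=1.0, shrink=0.5, tol=1e-3):
    """pos: list of 5 axis coordinates. Greedily maximize quality(pos)."""
    best = quality(pos)
    while step > tol:
        improved = False
        for ax in axes:
            for delta in (+step, -step):
                trial = pos[:]
                trial[ax] += delta
                q = quality(trial)
                if q > best:
                    best, pos, improved = q, trial, True
        if not improved:
            step *= shrink  # refine the search around the current optimum
    return pos, best
```

For example, with a synthetic quadratic metric peaking at coordinate 2.0 on every axis, the loop converges from the origin to that optimum.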

  14. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    PubMed

    Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved over three months of continuous water quality monitoring. Compared with a traditional adaptive sampling algorithm (ASA) on the same dataset, DDASA achieves approximately the same Normalized Mean Error (NME) while saving a further 5.31% of battery energy.
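The record does not give DDASA's update rule; the underlying idea of data-driven adaptive sampling can be sketched as lengthening the sampling interval when consecutive readings are stable and shortening it when they change quickly. All thresholds and bounds below are invented illustration values, not the paper's parameters:

```python
# Illustrative sketch (not the paper's DDASA): adapt the sampling interval
# from the relative change between consecutive sensor readings. Stable
# signals earn longer sleep intervals (saving battery); rapid changes
# trigger denser sampling. Threshold and bound values are arbitrary.

def next_interval(prev, curr, interval, lo=0.05, hi=0.20,
                  min_s=60, max_s=900):
    """Return the next sampling interval in seconds.
    prev, curr: consecutive readings (e.g. dissolved oxygen, mg/L)
    lo, hi: relative-change thresholds; min_s, max_s: interval bounds."""
    change = abs(curr - prev) / max(abs(prev), 1e-9)  # relative change
    if change > hi:          # rapid change: sample twice as often
        interval = max(min_s, interval / 2)
    elif change < lo:        # stable: back off, halving the power draw
        interval = min(max_s, interval * 2)
    return interval
```

A node would call this after every reading and sleep for the returned number of seconds, which is where the battery savings come from.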

  15. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    PubMed Central

    Shu, Tongxin; Xia, Min; Chen, Jiahong; de Silva, Clarence

    2017-01-01

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved over three months of continuous water quality monitoring. Compared with a traditional adaptive sampling algorithm (ASA) on the same dataset, DDASA achieves approximately the same Normalized Mean Error (NME) while saving a further 5.31% of battery energy. PMID:29113087

  16. An artificial intelligence approach to onboard fault monitoring and diagnosis for aircraft applications

    NASA Technical Reports Server (NTRS)

    Schutte, P. C.; Abbott, K. H.

    1986-01-01

    Real-time onboard fault monitoring and diagnosis for aircraft applications, whether performed by the human pilot or by automation, presents many difficult problems. Quick response to failures may be critical, the pilot often must compensate for the failure while diagnosing it, his information about the state of the aircraft is often incomplete, and the behavior of the aircraft changes as the effect of the failure propagates through the system. A research effort was initiated to identify guidelines for automation of onboard fault monitoring and diagnosis and associated crew interfaces. The effort began by determining the flight crew's information requirements for fault monitoring and diagnosis and the various reasoning strategies they use. Based on this information, a conceptual architecture was developed for the fault monitoring and diagnosis process. This architecture represents an approach and a framework which, once incorporated with the necessary detail and knowledge, can be a fully operational fault monitoring and diagnosis system, as well as providing the basis for comparison of this approach to other fault monitoring and diagnosis concepts. The architecture encompasses all aspects of the aircraft's operation, including navigation, guidance and controls, and subsystem status. The portion of the architecture that encompasses subsystem monitoring and diagnosis was implemented for an aircraft turbofan engine to explore and demonstrate the AI concepts involved. This paper describes the architecture and the implementation for the engine subsystem.

  17. Automation bias: a systematic review of frequency, effect mediators, and mitigators.

    PubMed

    Goddard, Kate; Roudsari, Abdul; Wyatt, Jeremy C

    2012-01-01

    Automation bias (AB)--the tendency to over-rely on automation--has been studied in various academic fields. Clinical decision support systems (CDSS) aim to benefit the clinical decision-making process. Although most research shows overall improved performance with use, there is often a failure to recognize the new errors that CDSS can introduce. With a focus on healthcare, a systematic review of the literature from a variety of research fields has been carried out, assessing the frequency and severity of AB, the effect mediators, and interventions potentially mitigating this effect. This is discussed alongside automation-induced complacency, or insufficient monitoring of automation output. A mix of subject specific and freetext terms around the themes of automation, human-automation interaction, and task performance and error were used to search article databases. Of 13 821 retrieved papers, 74 met the inclusion criteria. User factors such as cognitive style, decision support systems (DSS), and task specific experience mediated AB, as did attitudinal driving factors such as trust and confidence. Environmental mediators included workload, task complexity, and time constraint, which pressurized cognitive resources. Mitigators of AB included implementation factors such as training and emphasizing user accountability, and DSS design factors such as the position of advice on the screen, updated confidence levels attached to DSS output, and the provision of information versus recommendation. By uncovering the mechanisms by which AB operates, this review aims to help optimize the clinical decision-making process for CDSS developers and healthcare practitioners.

  18. Bioreactor design for successive culture of anchorage-dependent cells operated in an automated manner.

    PubMed

    Kino-Oka, Masahiro; Ogawa, Natsuki; Umegaki, Ryota; Taya, Masahito

    2005-01-01

    A novel bioreactor system was designed to perform a series of batchwise cultures of anchorage-dependent cells by means of automated operations of medium change and passage for cell transfer. The experimental data on contamination frequency ensured the biological cleanliness in the bioreactor system, which facilitated the operations in a closed environment, as compared with a flask culture system with manual handling. In addition, the tools for growth prediction (based on growth kinetics) and real-time growth monitoring by measurement of medium components (based on small-volume analyzing machinery) were installed into the bioreactor system to schedule the operations of medium change and passage and to confirm that culture proceeds as scheduled, respectively. The successive culture of anchorage-dependent cells was conducted with the bioreactor running in an automated way. The automated bioreactor gave a successful culture performance in good accordance with the preset schedule based on information from the latest subculture, realizing a 79-fold cell expansion in 169 h. In addition, the correlation factor between experimental data and scheduled values throughout the bioreactor performance was 0.998. It was concluded that the proposed bioreactor with the integration of the prediction and monitoring tools could offer a feasible system for the manufacturing process of cultured tissue products.

  19. Role of automation in the ACRV operations

    NASA Technical Reports Server (NTRS)

    Sepahban, S. F.

    1992-01-01

    The Assured Crew Return Vehicle (ACRV) will provide the Space Station Freedom with contingency means of return to earth (1) of one disabled crew member during medical emergencies, (2) of all crew members in case of accidents or failures of SSF systems, and (3) in case of interruption of the Space Shuttle flights. A wide range of vehicle configurations and system approaches are currently under study. The Program requirements focus on minimizing life cycle costs by ensuring simple operations, built-in reliability and maintainability. The ACRV philosophy of embedded operations is based on maximum use of existing facilities, resources and processes, while minimizing the interfaces and impacts to the Space Shuttle and Freedom programs. A preliminary integrated operations concept based on this philosophy and covering the ground, flight, mission support, and landing and recovery operations has been produced. To implement the ACRV operations concept, the underlying approach has been to rely on vehicle autonomy and automation, to the extent possible. Candidate functions and processes which may benefit from current or near-term automation and robotics technologies are identified. These include, but are not limited to, built-in automated ground tests and checkouts; use of the Freedom and the Orbiter remote manipulator systems, for ACRV berthing; automated passive monitoring and performance trend analysis, and periodic active checkouts during dormant periods. The major ACRV operations concept issues as they relate to the use of automation are discussed.

  20. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
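MATIS's actual rule format is not given in the record; an event-driven workflow manager of the kind described, where user-defined rules bind an event type plus a condition to a program, can be sketched as follows. The class, rule fields, and the "calibrate" step are invented for illustration:

```python
# Hypothetical sketch of MATIS-style event-driven task invocation:
# project-specific rules pair an event type and a condition with an action
# (a program to execute). Names and the example rule are invented.

class RuleEngine:
    def __init__(self):
        self.rules = []  # list of (event_type, condition, action)

    def add_rule(self, event_type, condition, action):
        self.rules.append((event_type, condition, action))

    def dispatch(self, event_type, payload):
        """Run every matching rule's action; return the actions' results."""
        results = []
        for etype, cond, action in self.rules:
            if etype == event_type and cond(payload):
                results.append(action(payload))
        return results

engine = RuleEngine()
# Example rule: when a raw file arrives and is non-empty, launch a
# (hypothetical) "calibrate" processing step on it.
engine.add_rule("file_arrived",
                lambda p: p["size"] > 0,
                lambda p: f"calibrate {p['name']}")
```

In a real deployment the action would spawn a plug-in program (and the engine would checkpoint state for restart, per the fail-safe capability described above) rather than return a string.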

  1. Methodology for Designing Operational Banking Risks Monitoring System

    NASA Astrophysics Data System (ADS)

    Kostjunina, T. N.

    2018-05-01

    The research looks at principles of designing an information system for monitoring operational banking risks. A proposed design methodology enables one to automate processes of collecting data on information security incidents in the banking network, serving as the basis for an integrated approach to the creation of an operational risk management system. The system can operate remotely ensuring tracking and forecasting of various operational events in the bank network. A structure of a content management system is described.

  2. Semi-automated 96-well liquid-liquid extraction for quantitation of drugs in biological fluids.

    PubMed

    Zhang, N; Hoffman, K L; Li, W; Rossi, D T

    2000-02-01

    A semi-automated liquid-liquid extraction (LLE) technique for biological fluid sample preparation was introduced for the quantitation of four drugs in rat plasma. All liquid transferring during the sample preparation was automated using a Tomtec Quadra 96 Model 320 liquid handling robot, which processed up to 96 samples in parallel. The samples were either in 96-deep-well plate or tube-rack format. One plate of samples can be prepared in approximately 1.5 h, and the 96-well plate is directly compatible with the autosampler of an LC/MS system. Selection of organic solvents and recoveries are discussed. Also, the precision, relative error, linearity and quantitation of the semi-automated LLE method are estimated for four example drugs using LC/MS/MS with a multiple reaction monitoring (MRM) approach. The applicability of this method and future directions are evaluated.

  3. Circadian Effects on Simple Components of Complex Task Performance

    NASA Technical Reports Server (NTRS)

    Clegg, Benjamin A.; Wickens, Christopher D.; Vieane, Alex Z.; Gutzwiller, Robert S.; Sebok, Angelia L.

    2015-01-01

    The goal of this study was to advance understanding and prediction of the impact of circadian rhythm on aspects of complex task performance during unexpected automation failures, and subsequent fault management. Participants trained on two tasks: a process control simulation, featuring automated support; and a multi-tasking platform. Participants then completed one task in a very early morning (circadian night) session, and the other during a late afternoon (circadian day) session. Small effects of time of day were seen on simple components of task performance, but impacts on more demanding components, such as those that occur following an automation failure, were muted relative to previous studies where circadian rhythm was compounded with sleep deprivation and fatigue. Circadian low participants engaged in compensatory strategies, rather than passively monitoring the automation. The findings and implications are discussed in the context of a model that includes the effects of sleep and fatigue factors.

  4. Advances in In Situ Inspection of Automated Fiber Placement Systems

    NASA Technical Reports Server (NTRS)

    Juarez, Peter D.; Cramer, K. Elliott; Seebo, Jeffrey P.

    2016-01-01

    Automated Fiber Placement (AFP) systems have been developed to help take advantage of the tailorability of composite structures in aerospace applications. AFP systems allow the repeatable placement of uncured, spool fed, preimpregnated carbon fiber tape (tows) onto substrates in desired thicknesses and orientations. This automated process can incur defects, such as overlapping tow lines, which can severely undermine the structural integrity of the part. Current defect detection and abatement methods are very labor intensive, and still mostly rely on manual human inspection. Proposed is a thermographic in situ inspection technique which monitors tow placement with an onboard thermal camera using the preheated substrate as a through transmission heat source. An investigation of the concept is conducted, and preliminary laboratory results are presented. Also included will be a brief overview of other emerging technologies that tackle the same issue. Keywords: Automated Fiber Placement, Manufacturing defects, Thermography

  5. Wireless energizing system for an automated implantable sensor.

    PubMed

    Swain, Biswaranjan; Nayak, Praveen P; Kar, Durga P; Bhuyan, Satyanarayan; Mishra, Laxmi P

    2016-07-01

    The wireless drive of an automated implantable electronic sensor has been explored for health monitoring applications. The proposed system comprises an automated biomedical sensing system which is energized through resonant inductive coupling. The implantable sensor unit is able to monitor the body temperature parameter and sends back the corresponding telemetry data wirelessly to the data recording unit. It has been observed that the wireless power delivery system is capable of energizing the automated biomedical implantable electronic sensor placed over a distance of 3 cm from the power transmitter with an energy transfer efficiency of 26% at the operating resonant frequency of 562 kHz. This proposed method ensures real-time monitoring of different human body temperatures around the clock. The monitored temperature data have been compared with a calibrated temperature measurement system to ascertain the accuracy of the proposed system. The investigated technique can also be useful for monitoring other body parameters such as blood pressure, bladder pressure, and physiological signals of the patient in vivo using various implantable sensors.

  6. Automated monitor and control for deep space network subsystems

    NASA Technical Reports Server (NTRS)

    Smyth, P.

    1989-01-01

    The problem of automating monitor and control loops for Deep Space Network (DSN) subsystems is considered and an overview of currently available automation techniques is given. The use of standard numerical models, knowledge-based systems, and neural networks is considered. It is argued that none of these techniques alone possess sufficient generality to deal with the demands imposed by the DSN environment. However, it is shown that schemes that integrate the better aspects of each approach and are referenced to a formal system model show considerable promise, although such an integrated technology is not yet available for implementation. Frequent reference is made to the receiver subsystem since this work was largely motivated by experience in developing an automated monitor and control loop for the advanced receiver.

  7. Application of automated measurement stations for continuous water quality monitoring of the Dender river in Flanders, Belgium.

    PubMed

    Vandenberghe, V; Goethals, P L M; Van Griensven, A; Meirlaen, J; De Pauw, N; Vanrolleghem, P; Bauwens, W

    2005-09-01

    During the summer of 1999, two automated water quality measurement stations were installed along the Dender river in Belgium. The variables dissolved oxygen, temperature, conductivity, pH, rain intensity, flow and solar radiation were measured continuously. In this paper these on-line measurement series are presented and interpreted, together with additional measurements and ecological expert knowledge. The purpose was to demonstrate the variability in time and space of the aquatic processes and the consequences of conducting and interpreting discrete measurements for river quality assessment and management. The large fluctuations of the data illustrated the importance of continuous measurements for the complete description and modelling of the biological processes in the river.

  8. An automated dose tracking system for adaptive radiation therapy.

    PubMed

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Data from patient images were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case with approximately 10% of cases requiring an additional 10 min for image registration refinement.
An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.
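The point-wise dose accumulation step described above can be sketched in miniature: each fraction's dose grid is mapped onto the reference anatomy and summed voxel by voxel. Here a precomputed voxel-index map stands in for the deformable image registration output; the data structures are simplified illustrations, not the authors' implementation:

```python
# Minimal sketch of point-wise dose accumulation: each fraction carries a
# dose grid plus an index map (a stand-in for the DIR deformation) giving,
# for every fraction voxel, the reference voxel it deforms onto. Cumulative
# dose is the voxel-wise sum on the reference grid. Illustration only.

def accumulate_dose(n_reference_voxels, fractions):
    """fractions: list of (dose, index_map) pairs, where dose[i] is the
    dose (Gy) at voxel i of that fraction and index_map[i] is the matching
    reference voxel. Returns cumulative dose on the reference grid."""
    total = [0.0] * n_reference_voxels
    for dose, index_map in fractions:
        for i, d in enumerate(dose):
            total[index_map[i]] += d
    return total
```

In practice the grids are 3-D arrays and the mapping includes interpolation, but the accumulation itself is exactly this voxel-wise sum.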

  9. Misclassification of OSA severity with automated scoring of home sleep recordings.

    PubMed

    Aurora, R Nisha; Swartz, Rachel; Punjabi, Naresh M

    2015-03-01

    The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. Automated scoring consistently under-scored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov.
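The Bland-Altman statistics cited above (mean difference between methods and its surrounding limits) are straightforward to compute; a minimal sketch with invented sample AHI values follows. The 95% limits of agreement are the bias ± 1.96 standard deviations of the paired differences:

```python
# Sketch of a Bland-Altman agreement analysis as used above: the mean
# difference (bias) between manual and automated AHI values and the 95%
# limits of agreement (bias +/- 1.96 SD of the differences). The sample
# data in the test are invented, not the study's measurements.

from math import sqrt

def bland_altman(manual, automated):
    """Return (bias, (lower_limit, upper_limit)) for paired measurements."""
    diffs = [m - a for m, a in zip(manual, automated)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))  # sample SD
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A positive bias, as reported for both monitors, means the automated score is systematically lower than the manual score.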

  10. Misclassification of OSA Severity With Automated Scoring of Home Sleep Recordings

    PubMed Central

    Aurora, R. Nisha; Swartz, Rachel

    2015-01-01

    BACKGROUND: The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. METHODS: Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. RESULTS: Automated scoring consistently under-scored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. CONCLUSIONS: Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. 
TRIAL REGISTRY: ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov PMID:25411804
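The Bland-Altman comparison reported above reduces to a simple computation: the mean of the per-subject differences between the two scoring methods (the bias) and a confidence interval around it. A minimal sketch in Python; the AHI values are hypothetical, not the study's data:

```python
import statistics

def bland_altman(manual, automated):
    """Mean manual-minus-automated difference (bias) and the 95% CI
    of that bias, using a normal approximation."""
    diffs = [m - a for m, a in zip(manual, automated)]
    bias = statistics.mean(diffs)
    se = statistics.stdev(diffs) / len(diffs) ** 0.5
    return bias, (bias - 1.96 * se, bias + 1.96 * se)

# Hypothetical per-subject AHI values (events/h) for illustration only
manual = [12.0, 25.5, 8.2, 40.1, 16.3]
automated = [7.4, 20.0, 3.9, 33.0, 11.8]
bias, ci = bland_altman(manual, automated)  # positive bias: automated under-scores
```

A positive bias here mirrors the study's finding that automated scoring undercounts events relative to manual scoring.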

  11. Automated manufacturing of chimeric antigen receptor T cells for adoptive immunotherapy using CliniMACS prodigy.

    PubMed

    Mock, Ulrike; Nickolay, Lauren; Philip, Brian; Cheung, Gordon Weng-Kit; Zhan, Hong; Johnston, Ian C D; Kaiser, Andrew D; Peggs, Karl; Pule, Martin; Thrasher, Adrian J; Qasim, Waseem

    2016-08-01

    Novel cell therapies derived from human T lymphocytes are exhibiting enormous potential in early-phase clinical trials in patients with hematologic malignancies. Ex vivo modification of T cells is currently limited to a small number of centers with the required infrastructure and expertise. The process requires isolation, activation, transduction, expansion and cryopreservation steps. To simplify procedures and widen applicability for clinical therapies, automation of these procedures is being developed. The CliniMACS Prodigy (Miltenyi Biotec) has recently been adapted for lentiviral transduction of T cells and here we analyse the feasibility of a clinically compliant T-cell engineering process for the manufacture of T cells encoding chimeric antigen receptors (CAR) for CD19 (CAR19), a widely targeted antigen in B-cell malignancies. Using a closed, single-use tubing set we processed mononuclear cells from fresh or frozen leukapheresis harvests collected from healthy volunteer donors. Cells were phenotyped and subjected to automated processing and activation using TransAct, a polymeric nanomatrix activation reagent incorporating CD3/CD28-specific antibodies. Cells were then transduced and expanded in the CentriCult-Unit of the tubing set, under stabilized culture conditions with automated feeding and media exchange. The process was continuously monitored to determine kinetics of expansion, transduction efficiency and phenotype of the engineered cells in comparison with small-scale transductions run in parallel. We found that transduction efficiencies, phenotype and function of CAR19 T cells were comparable with existing procedures and overall T-cell yields sufficient for anticipated therapeutic dosing. The automation of closed-system T-cell engineering should improve dissemination of emerging immunotherapies and greatly widen applicability. Copyright © 2016. Published by Elsevier Inc.

  12. NOUS: A Knowledge Graph Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Knowledge graphs represent information as entities and relationships between them. For tasks such as natural language question answering or automated analysis of text, a knowledge graph provides valuable context to establish the specific type of entities being discussed. It allows us to derive better context about newly arriving information and leads to intelligent reasoning capabilities. We address two primary needs: A) Automated construction of knowledge graphs is a technically challenging, expensive process; and B) The ability to synthesize new information by monitoring newly emerging knowledge is a transformational capability that does not exist in state-of-the-art systems.

  13. Robo-Lector – a novel platform for automated high-throughput cultivations in microtiter plates with high information content

    PubMed Central

    Huber, Robert; Ritter, Daniel; Hering, Till; Hillmer, Anne-Kathrin; Kensy, Frank; Müller, Carsten; Wang, Le; Büchs, Jochen

    2009-01-01

Background In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays. Results To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study in detail high-throughput cultivation processes and especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was hereby investigated. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of the EcFbFP production of only ± 7%. The third method 'biomass-specific replication' enabled the generation of equal initial biomass concentrations in main cultures from precultures with different growth kinetics.
This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to respective wells of the main culture plate, where subsequently similar growth kinetics could be obtained. Conclusion The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line-monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization. PMID:19646274

  14. Automated soil gas monitoring chamber

    DOEpatents

    Edwards, Nelson T.; Riggs, Jeffery S.

    2003-07-29

    A chamber for trapping soil gases as they evolve from the soil without disturbance to the soil and to the natural microclimate within the chamber has been invented. The chamber opens between measurements and therefore does not alter the metabolic processes that influence soil gas efflux rates. A multiple chamber system provides for repetitive multi-point sampling, undisturbed metabolic soil processes between sampling, and an essentially airtight sampling chamber operating at ambient pressure.

  15. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    NASA Technical Reports Server (NTRS)

    Jones, Michael K.

    1998-01-01

Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  16. Systems for monitoring and digitally recording water-quality parameters

    USGS Publications Warehouse

    Smoot, George F.; Blakey, James F.

    1966-01-01

Digital recording of water-quality parameters is a link in the automated data collection and processing system of the U.S. Geological Survey. The monitoring and digital recording systems adopted by the Geological Survey, while punching all measurements on a standard paper tape, provide a choice of compatible components to construct a system to meet specific physical problems and data needs. As many as 10 parameters can be recorded by an instrument, with the only limiting criterion being that measurements are expressed as electrical signals.

  17. Tools for automated acoustic monitoring within the R package monitoR

    USGS Publications Warehouse

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
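The spectrogram cross-correlation detector that monitoR offers (in R) amounts to sliding a template over the survey spectrogram and flagging frames whose Pearson correlation with the template exceeds a score cutoff. A language-neutral sketch of that idea in Python, not monitoR's actual code:

```python
import math

def norm(values):
    """Center and scale to unit L2 norm, so a dot product is Pearson r."""
    mu = sum(values) / len(values)
    sd = math.sqrt(sum((x - mu) ** 2 for x in values)) or 1.0
    return [(x - mu) / sd for x in values]

def xcorr_detect(spec, template, cutoff):
    """spec, template: 2-D lists (frequency bins x time frames).
    Returns (frame, score) pairs where correlation >= cutoff."""
    width = len(template[0])
    t_flat = norm([v for row in template for v in row])
    hits = []
    for t in range(len(spec[0]) - width + 1):
        w_flat = norm([v for row in spec for v in row[t:t + width]])
        score = sum(a * b for a, b in zip(t_flat, w_flat))
        if score >= cutoff:
            hits.append((t, score))
    return hits

# Toy spectrogram containing the template exactly at frame 1
template = [[1, 0], [0, 1]]
spec = [[0, 1, 0, 0], [0, 0, 1, 0]]
hits = xcorr_detect(spec, template, cutoff=0.9)  # → [(1, 1.0)]
```

In practice the score cutoff is tuned per template, which is exactly the template-manipulation workflow the package supports.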

  18. Syndromic surveillance using veterinary laboratory data: data pre-processing and algorithm performance evaluation

    PubMed Central

    Dórea, Fernanda C.; McEwen, Beverly J.; McNab, W. Bruce; Revie, Crawford W.; Sanchez, Javier

    2013-01-01

    Diagnostic test orders to an animal laboratory were explored as a data source for monitoring trends in the incidence of clinical syndromes in cattle. Four years of real data and over 200 simulated outbreak signals were used to compare pre-processing methods that could remove temporal effects in the data, as well as temporal aberration detection algorithms that provided high sensitivity and specificity. Weekly differencing demonstrated solid performance in removing day-of-week effects, even in series with low daily counts. For aberration detection, the results indicated that no single algorithm showed performance superior to all others across the range of outbreak scenarios simulated. Exponentially weighted moving average charts and Holt–Winters exponential smoothing demonstrated complementary performance, with the latter offering an automated method to adjust to changes in the time series that will likely occur in the future. Shewhart charts provided lower sensitivity but earlier detection in some scenarios. Cumulative sum charts did not appear to add value to the system; however, the poor performance of this algorithm was attributed to characteristics of the data monitored. These findings indicate that automated monitoring aimed at early detection of temporal aberrations will likely be most effective when a range of algorithms are implemented in parallel. PMID:23576782
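The two steps singled out in this abstract, weekly differencing for day-of-week effects and an EWMA control chart for aberration detection, can be sketched in a few lines of Python. The counts and control-limit constant below are illustrative, not the paper's data:

```python
import statistics

def weekly_difference(daily, period=7):
    """Remove day-of-week effects: each count minus the same weekday
    one week earlier."""
    return [daily[i] - daily[i - period] for i in range(period, len(daily))]

def ewma_alarms(series, lam=0.3, k=2.0):
    """EWMA chart with an asymptotic upper control limit
    mu + k*sd*sqrt(lam/(2-lam)); returns the indices that alarm.
    (Real charts estimate the baseline from historical data only.)"""
    mu = statistics.mean(series)
    sd = statistics.stdev(series)
    limit = mu + k * sd * (lam / (2 - lam)) ** 0.5
    z, alarms = mu, []
    for i, x in enumerate(series):
        z = lam * x + (1 - lam) * z
        if z > limit:
            alarms.append(i)
    return alarms

# Three normal weeks of lab-order counts, then an outbreak week (+15/day)
week = [10, 12, 11, 13, 9, 4, 3]
daily = week * 3 + [x + 15 for x in week]
alarms = ewma_alarms(weekly_difference(daily))
```

The smoothing constant `lam` trades off sensitivity against time to detection, which is one reason the paper recommends running several algorithms in parallel.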

  19. Syndromic surveillance using veterinary laboratory data: data pre-processing and algorithm performance evaluation.

    PubMed

    Dórea, Fernanda C; McEwen, Beverly J; McNab, W Bruce; Revie, Crawford W; Sanchez, Javier

    2013-06-06

    Diagnostic test orders to an animal laboratory were explored as a data source for monitoring trends in the incidence of clinical syndromes in cattle. Four years of real data and over 200 simulated outbreak signals were used to compare pre-processing methods that could remove temporal effects in the data, as well as temporal aberration detection algorithms that provided high sensitivity and specificity. Weekly differencing demonstrated solid performance in removing day-of-week effects, even in series with low daily counts. For aberration detection, the results indicated that no single algorithm showed performance superior to all others across the range of outbreak scenarios simulated. Exponentially weighted moving average charts and Holt-Winters exponential smoothing demonstrated complementary performance, with the latter offering an automated method to adjust to changes in the time series that will likely occur in the future. Shewhart charts provided lower sensitivity but earlier detection in some scenarios. Cumulative sum charts did not appear to add value to the system; however, the poor performance of this algorithm was attributed to characteristics of the data monitored. These findings indicate that automated monitoring aimed at early detection of temporal aberrations will likely be most effective when a range of algorithms are implemented in parallel.

  20. Using Bayesian Networks and Decision Theory to Model Physical Security

    DTIC Science & Technology

    2003-02-01

Home automation technologies allow a person to monitor and control various activities within a home or office setting. Cameras, sensors and other...components used along with the simple rules in the home automation software provide an environment where the lights, security and other appliances can be...monitored and controlled. These home automation technologies, however, lack the power to reason under uncertain conditions, and thus the system can

  1. Possibilities in optical monitoring of laser welding process

    NASA Astrophysics Data System (ADS)

    Horník, Petr; Mrňa, Libor; Pavelka, Jan

    2016-11-01

Laser welding is a modern, widely used, though still not commonplace, method of welding. With increasing demands on the quality of the welds, it is usual to apply automated machine welding with on-line monitoring of the welding process. The resulting quality of the weld is largely affected by the behavior of the keyhole. However, its direct observation during the welding process is practically impossible and it is necessary to use indirect methods. At ISI we have developed optical methods of monitoring the process. The most advanced is an analysis of the radiation of the laser-induced plasma plume forming in the keyhole, where changes in the frequency of the plasma bursts are monitored and evaluated using Fourier and autocorrelation analysis. Another solution, robust and suitable for industry, is based on the observation of the keyhole inlet opening through a coaxial camera mounted in the welding head and the subsequent image processing by computer vision methods. A high-speed camera is used to understand the dynamics of the plasma plume. Through optical spectroscopy of the plume, we can study the excitation of elements in a material. It is also beneficial to monitor the flow of shielding gas using the schlieren method.
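The autocorrelation analysis of the plasma-burst signal mentioned in this record can be illustrated on a synthetic photodiode trace; the sampling numbers below are invented for the sketch:

```python
import math

def autocorr(signal, lag):
    """Normalized autocorrelation at one lag; a peak marks the
    dominant burst period in the plasma radiation signal."""
    n = len(signal)
    mu = sum(signal) / n
    var = sum((x - mu) ** 2 for x in signal)
    cov = sum((signal[i] - mu) * (signal[i + lag] - mu) for i in range(n - lag))
    return cov / var

# Synthetic trace: plasma bursts repeating every 8 samples
sig = [math.sin(2 * math.pi * i / 8) for i in range(64)]
period = max(range(1, 20), key=lambda k: autocorr(sig, k))  # → 8
```

A shift in the recovered burst period (or its disappearance) is the kind of keyhole-behavior change such monitoring is meant to catch.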

  2. Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.
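The declarative style described here (Jess, in the actual agent) pairs conditions over the telemetry stream with notifications to engineers. A toy Python analogue, with hypothetical measurement names and limits rather than real Shuttle rules:

```python
# Hypothetical rules: (notification, predicate over one telemetry sample)
RULES = [
    ("hydraulic pressure low", lambda t: t["hyd_pressure_psi"] < 2800),
    ("fuel cell temp high", lambda t: t["fuel_cell_temp_f"] > 250),
]

def monitor(sample):
    """Fire every rule whose condition holds on this sample; return the
    notifications that would be routed to the responsible engineers."""
    return [alert for alert, cond in RULES if cond(sample)]

alerts = monitor({"hyd_pressure_psi": 2650, "fuel_cell_temp_f": 210})
# → ["hydraulic pressure low"]
```

Keeping the rules as data, rather than hard-coded logic, is what makes the design modular and scalable across subsystems, as the chapter notes.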

  3. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

Considerable activity has been centered around the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increases is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly in the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements of components, assemblies and systems such as micro-processors, micro-computers, programmable controllers, and other intelligent devices, have made the automation of quality control much more cost effective and justifiable.

  4. Autonomous Operations System: Development and Application

    NASA Technical Reports Server (NTRS)

    Toro Medina, Jaime A.; Wilkins, Kim N.; Walker, Mark; Stahl, Gerald M.

    2016-01-01

Autonomous control systems provide the ability of self-governance beyond the conventional control system. As the complexity of mechanical and electrical systems increases, there develops a natural drive for robust control systems to manage complicated operations. By bridging the gap between conventional automated systems and knowledge-based self-awareness systems, nominal control of operations can evolve to rely on safety-critical mitigation processes to support any off-nominal behavior. Current research and development efforts led by the Autonomous Propellant Loading (APL) group at NASA Kennedy Space Center aim to improve cryogenic propellant transfer operations by developing an automated control and health monitoring system. As an integrated system, the center aims to produce an Autonomous Operations System (AOS) capable of integrating health management operations with automated control to produce a fully autonomous system.

  5. A Hybrid-Cloud Science Data System Enabling Advanced Rapid Imaging & Analysis for Monitoring Hazards

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Moore, A. W.; Fielding, E. J.; Radulescu, C.; Sacco, G.; Stough, T. M.; Mattmann, C. A.; Cervelli, P. F.; Poland, M. P.; Cruz, J.

    2012-12-01

Volcanic eruptions, landslides, and levee failures are some examples of hazards that can be more accurately forecasted with sufficient monitoring of precursory ground deformation, such as the high-resolution measurements from GPS and InSAR. In addition, coherence and reflectivity change maps can be used to detect surface change due to lava flows, mudslides, tornadoes, floods, and other natural and man-made disasters. However, it is difficult for many volcano observatories and other monitoring agencies to process GPS and InSAR products in an automated scenario needed for continual monitoring of events. Additionally, numerous interoperability barriers exist in multi-sensor observation data access, preparation, and fusion to create actionable products. Combining high spatial resolution InSAR products with high temporal resolution GPS products--and automating this data preparation & processing across global-scale areas of interest--presents an untapped science and monitoring opportunity. The global coverage offered by satellite-based SAR observations, and the rapidly expanding GPS networks, can provide orders of magnitude more data on these hazardous events if we have a data system that can efficiently and effectively analyze the voluminous raw data, and provide users the tools to access data from their regions of interest. Currently, combined GPS & InSAR time series are primarily generated for specific research applications, and are not implemented to run on large-scale continuous data sets and delivered to decision-making communities. We are developing an advanced service-oriented architecture for hazard monitoring leveraging NASA-funded algorithms and data management to enable both science and decision-making communities to monitor areas of interest via seamless data preparation, processing, and distribution.
Our objectives:
* Enable high-volume and low-latency automatic generation of NASA Solid Earth science data products (InSAR and GPS) to support hazards monitoring.
* Facilitate NASA-USGS collaborations to share NASA InSAR and GPS data products, which are difficult to process in high volume and at low latency, for decision support.
* Enable interoperable discovery, access, and sharing of NASA observations and derived actionable products between the observation and decision-making communities.
* Enable improved understanding of these products through visualization, mining, and cross-agency sharing.
Existing InSAR & GPS processing packages and other software are integrated for generating geodetic decision-support monitoring products. We employ semantic and cloud-based data management and processing techniques for handling large data volumes, reducing end-product latency, codifying data system information with semantics, and deploying interoperable services for actionable products to decision-making communities.

  6. Portable Positron Measurement System (PPMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akers, Doug

Portable Positron Measurement System (PPMS) is an automated, non-destructive inspection system based on positron annihilation, which characterizes a material's in situ atomic-level properties during the manufacturing processes of formation, solidification, and heat treatment. Simultaneous manufacturing and quality monitoring now are possible. Learn more about the lab's project on our facebook site http://www.facebook.com/idahonationallaboratory.

  7. Project Planning and Reporting

    NASA Technical Reports Server (NTRS)

    1982-01-01

The Project Planning Analysis and Reporting System (PPARS) is an automated aid for monitoring and scheduling activities within a project. The PPARS system consists of the PPARS Batch Program, five preprocessor programs, and two post-processor programs. The PPARS Batch Program is a full CPM (Critical Path Method) scheduling program with resource capabilities and can process networks with up to 10,000 activities.
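CPM scheduling of the kind the PPARS batch program performs rests on a forward pass (earliest start/finish) and a backward pass (latest finish); activities with zero float form the critical path. A compact sketch of that algorithm, not PPARS itself:

```python
def critical_path(activities):
    """activities: {name: (duration, [predecessor names])}.
    Returns (project duration, zero-float activities in order)."""
    order, seen = [], set()
    def visit(name):                         # topological order
        if name in seen:
            return
        for p in activities[name][1]:
            visit(p)
        seen.add(name)
        order.append(name)
    for name in activities:
        visit(name)
    early = {}
    for name in order:                       # forward pass
        dur, preds = activities[name]
        start = max((early[p][1] for p in preds), default=0)
        early[name] = (start, start + dur)
    finish = max(e for _, e in early.values())
    late = {name: finish for name in activities}
    for name in reversed(order):             # backward pass
        dur, preds = activities[name]
        for p in preds:
            late[p] = min(late[p], late[name] - dur)
    critical = [n for n in order if late[n] == early[n][1]]
    return finish, critical

acts = {"A": (3, []), "B": (2, []), "C": (4, ["A"]), "D": (1, ["B", "C"])}
duration, critical = critical_path(acts)  # → 8, ["A", "C", "D"]
```

Resource leveling, which PPARS also handles, would sit on top of these passes as a further constraint.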

  8. Portable Positron Measurement System (PPMS)

    ScienceCinema

    None

    2017-12-09

Portable Positron Measurement System (PPMS) is an automated, non-destructive inspection system based on positron annihilation, which characterizes a material's in situ atomic-level properties during the manufacturing processes of formation, solidification, and heat treatment. Simultaneous manufacturing and quality monitoring now are possible. Learn more about the lab's project on our facebook site http://www.facebook.com/idahonationallaboratory.

  9. 76 FR 35446 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    ...) approval to conduct survey research to monitor the health care industry's awareness of, and preparation for...) Outpatient Physical Therapy--Speech Pathology Survey Report; Use: CMS-1856 is used as an application to be... the Automated Survey Process Environment (ASPEN). CMS- 1893 is used by the State survey agency to...

  10. Adaptive function allocation reduces performance costs of static automation

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian

    1993-01-01

    Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.

  11. Technology transfer potential of an automated water monitoring system. [market research

    NASA Technical Reports Server (NTRS)

    Jamieson, W. M.; Hillman, M. E. D.; Eischen, M. A.; Stilwell, J. M.

    1976-01-01

    The nature and characteristics of the potential economic need (markets) for a highly integrated water quality monitoring system were investigated. The technological, institutional and marketing factors that would influence the transfer and adoption of an automated system were studied for application to public and private water supply, public and private wastewater treatment and environmental monitoring of rivers and lakes.

  12. Classification of Automated Search Traffic

    NASA Astrophysics Data System (ADS)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information, and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
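The "physical model" features the paper describes, query rate, click behaviour, ordering patterns, can be made concrete with a toy extractor and a decision stump standing in for the trained classifiers. The feature names and thresholds below are invented for illustration:

```python
def session_features(events):
    """events: (timestamp_s, query, clicked) tuples for one session."""
    span = max(1, events[-1][0] - events[0][0])
    return {
        "qpm": 60 * len(events) / span,  # queries per minute
        "click_rate": sum(c for _, _, c in events) / len(events),
        "sorted_queries": all(a[1] <= b[1] for a, b in zip(events, events[1:])),
    }

def looks_automated(f):
    # Toy decision stump: an inhuman query rate, or zero clicks with
    # perfectly alphabetized queries, suggests a bot crawling a wordlist.
    return f["qpm"] > 60 or (f["click_rate"] == 0 and f["sorted_queries"])

bot = [(0, "aardvark", False), (1, "abacus", False), (2, "abbey", False)]
human = [(0, "weather", True), (120, "news", False)]
```

The paper's actual system replaces the stump with trained binary and multiclass classifiers over many such features.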

  13. Early detection of health and welfare compromises through automated detection of behavioural changes in pigs.

    PubMed

    Matthews, Stephen G; Miller, Amy L; Clapp, James; Plötz, Thomas; Kyriazakis, Ilias

    2016-11-01

    Early detection of health and welfare compromises in commercial piggeries is essential for timely intervention to enhance treatment success, reduce impact on welfare, and promote sustainable pig production. Behavioural changes that precede or accompany subclinical and clinical signs may have diagnostic value. Often referred to as sickness behaviour, this encompasses changes in feeding, drinking, and elimination behaviours, social behaviours, and locomotion and posture. Such subtle changes in behaviour are not easy to quantify and require lengthy observation input by staff, which is impractical on a commercial scale. Automated early-warning systems may provide an alternative by objectively measuring behaviour with sensors to automatically monitor and detect behavioural changes. This paper aims to: (1) review the quantifiable changes in behaviours with potential diagnostic value; (2) subsequently identify available sensors for measuring behaviours; and (3) describe the progress towards automating monitoring and detection, which may allow such behavioural changes to be captured, measured, and interpreted and thus lead to automation in commercial, housed piggeries. Multiple sensor modalities are available for automatic measurement and monitoring of behaviour, which require humans to actively identify behavioural changes. This has been demonstrated for the detection of small deviations in diurnal drinking, deviations in feeding behaviour, monitoring coughs and vocalisation, and monitoring thermal comfort, but not social behaviour. However, current progress is in the early stages of developing fully automated detection systems that do not require humans to identify behavioural changes; e.g., through automated alerts sent to mobile phones. Challenges for achieving automation are multifaceted and trade-offs are considered between health, welfare, and costs, between analysis of individuals and groups, and between generic and compromise-specific behaviours. 
Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Wireless energizing system for an automated implantable sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swain, Biswaranjan; Nayak, Praveen P.; Kar, Durga P.

The wireless drive of an automated implantable electronic sensor has been explored for health monitoring applications. The proposed system comprises an automated biomedical sensing system which is energized through resonant inductive coupling. The implantable sensor unit is able to monitor the body temperature parameter and sends back the corresponding telemetry data wirelessly to the data recording unit. It has been observed that the wireless power delivery system is capable of energizing the automated biomedical implantable electronic sensor placed over a distance of 3 cm from the power transmitter with an energy transfer efficiency of 26% at the operating resonant frequency of 562 kHz. This proposed method ensures real-time monitoring of different human body temperatures around the clock. The monitored temperature data have been compared with a calibrated temperature measurement system to ascertain the accuracy of the proposed system. The investigated technique can also be useful for monitoring other body parameters such as blood pressure, bladder pressure, and physiological signals of the patient in vivo using various implantable sensors.
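The 562 kHz operating point in this record is set by the resonant LC tank of the coupled coils, f0 = 1/(2π√(LC)). Illustrative component values, not taken from the paper, that land near that frequency:

```python
import math

def resonant_frequency(inductance_h, capacitance_f):
    """f0 = 1 / (2*pi*sqrt(L*C)) for a resonant inductive link."""
    return 1 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical 80 uH transmit coil with a 1 nF tuning capacitor
f0 = resonant_frequency(80e-6, 1e-9)  # ≈ 563 kHz
```

Transmit and receive sides must be tuned to the same f0; detuning is one of the main drivers of the link's transfer efficiency.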

  15. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, David E.; Coble, Jamie B.; Jordan, David V.

The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.

  16. Digital Image Correlation for Performance Monitoring.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaviccini, Miguel; Turner, Daniel Z.; Herzberg, Michael

    2016-02-01

    Evaluating the health of a mechanism requires more than just a binary evaluation of whether an operation was completed. It requires analyzing more comprehensive, full-field data. Health monitoring is a process of nondestructively identifying characteristics that indicate the fitness of an engineered component. In order to monitor unit health in a production setting, an automated test system must be created to capture the motion of mechanism parts in a real-time and non-intrusive manner. One way to accomplish this is by using high-speed video (HSV) and Digital Image Correlation (DIC). In this approach, individual frames of the video are analyzed to track the motion of mechanism components. The derived performance metrics allow for state-of-health monitoring and improved fidelity of mechanism modeling. The results are in-situ state-of-health identification and performance prediction. This paper introduces basic concepts of this test method, and discusses two main themes: the use of laser marking to add fiducial patterns to mechanism components, and new software developed to track objects with complex shapes, even as they move behind obstructions. Finally, the implementation of these tests into an automated tester is discussed.
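    At its core, DIC tracks a patch from one frame to the next by maximizing a correlation score. A minimal integer-pixel sketch of that step follows; production DIC adds subpixel interpolation and subset shape functions, and the arrays here are toy data.

```python
import numpy as np

def track_patch(frame_a, frame_b, top, left, size, search=5):
    """Track a square patch from frame_a into frame_b by maximizing
    normalized cross-correlation over integer pixel shifts.

    Returns (dy, dx), the estimated displacement of the patch.
    """
    template = frame_a[top:top + size, left:left + size].astype(float)
    t = template - template.mean()
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > frame_b.shape[0] or x + size > frame_b.shape[1]:
                continue  # candidate window falls outside the frame
            window = frame_b[y:y + size, x:x + size].astype(float)
            w = window - window.mean()
            denom = np.sqrt((t * t).sum() * (w * w).sum())
            if denom == 0:
                continue  # featureless window, correlation undefined
            ncc = (t * w).sum() / denom
            if ncc > best:
                best, best_shift = ncc, (dy, dx)
    return best_shift

# Toy frames: a bright 5x5 'feature' moves down 2 pixels and right 3.
frame_a = np.zeros((40, 40))
frame_a[10:15, 10:15] = 1.0
frame_b = np.zeros((40, 40))
frame_b[12:17, 13:18] = 1.0
print(track_patch(frame_a, frame_b, top=8, left=8, size=10))  # -> (2, 3)
```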

  17. Autonomous Mission Operations Roadmap

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy David

    2014-01-01

    As light time delays increase, the number of situations in which crew autonomy is the best way to conduct the mission is expected to increase. However, there are significant open questions regarding which functions to allocate to ground and crew as the time delays increase. In situations where the ideal solution is to allocate responsibility to the crew and the vehicle, a second question arises: should the activity be the responsibility of the crew or an automated vehicle function? More specifically, we must answer the following questions: What aspects of mission operation responsibilities (Plan, Train, Fly) should be allocated to ground-based or vehicle-based planning, monitoring, and control in the presence of significant light-time delay between the vehicle and the Earth? How should the allocated ground-based planning, monitoring, and control be distributed across the flight control team and ground system automation? How should the allocated vehicle-based planning, monitoring, and control be distributed between the flight crew and onboard system automation? When during the mission should responsibility shift from flight control team to crew, or from crew to vehicle, and what should the process of shifting responsibility be as the mission progresses? NASA is developing a roadmap of capabilities for Autonomous Mission Operations for human spaceflight. This presentation will describe the current state of development of this roadmap, with specific attention to in-space inspection tasks that crews might perform with minimum assistance from the ground.

  18. Methodology for Automated Detection of Degradation and Faults in Packaged Air Conditioners and Heat Pumps Using Only Two Sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-02-10

    The software was created in the process of developing a system known as the Smart Monitoring and Diagnostic System (SMDS) for packaged air conditioners and heat pumps used on commercial buildings (known as RTUs). The SMDS provides automated remote monitoring and detection of performance degradation and faults in these RTUs, and could increase awareness among building owners and maintenance providers of the condition of the equipment, the cost of operating it in degraded condition, and the quality of maintenance and repair service when it is performed. The SMDS provides these capabilities and would enable condition-based maintenance rather than the reactive and schedule-based preventive maintenance commonly used today, when maintenance of RTUs is done at all. Improved maintenance would help ensure persistent peak operating efficiencies, reducing energy consumption by an estimated 10% to 30%.

  19. Automated Monitoring of Pipeline Rights-of-Way

    NASA Technical Reports Server (NTRS)

    Frost, Chard Ritchie

    2010-01-01

    NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.

  20. Automation bias: a systematic review of frequency, effect mediators, and mitigators

    PubMed Central

    Roudsari, Abdul; Wyatt, Jeremy C

    2011-01-01

    Automation bias (AB)—the tendency to over-rely on automation—has been studied in various academic fields. Clinical decision support systems (CDSS) aim to benefit the clinical decision-making process. Although most research shows overall improved performance with use, there is often a failure to recognize the new errors that CDSS can introduce. With a focus on healthcare, a systematic review of the literature from a variety of research fields has been carried out, assessing the frequency and severity of AB, the effect mediators, and interventions potentially mitigating this effect. This is discussed alongside automation-induced complacency, or insufficient monitoring of automation output. A mix of subject-specific and free-text terms around the themes of automation, human–automation interaction, and task performance and error were used to search article databases. Of 13,821 retrieved papers, 74 met the inclusion criteria. User factors such as cognitive style, decision support systems (DSS), and task-specific experience mediated AB, as did attitudinal driving factors such as trust and confidence. Environmental mediators included workload, task complexity, and time constraint, which put pressure on cognitive resources. Mitigators of AB included implementation factors such as training and emphasizing user accountability, and DSS design factors such as the position of advice on the screen, updated confidence levels attached to DSS output, and the provision of information versus recommendation. By uncovering the mechanisms by which AB operates, this review aims to help optimize the clinical decision-making process for CDSS developers and healthcare practitioners. PMID:21685142

  1. A radar data processing and enhancement system

    NASA Technical Reports Server (NTRS)

    Anderson, K. F.; Wrin, J. W.; James, R.

    1986-01-01

    This report describes the space position data processing system of the NASA Western Aeronautical Test Range. The system is installed at the Dryden Flight Research Facility of NASA Ames Research Center. This operational radar data system (RADATS) provides simultaneous data processing for multiple data inputs and tracking and antenna pointing outputs while performing real-time monitoring, control, and data enhancement functions. Experience in support of the space shuttle and aeronautical flight research missions is described, as well as the automated calibration and configuration functions of the system.

  2. Model-based reasoning in SSF ECLSS

    NASA Technical Reports Server (NTRS)

    Miller, J. K.; Williams, George P. W., Jr.

    1992-01-01

    The interacting processes and reconfigurable subsystems of the Space Station Freedom Environmental Control and Life Support System (ECLSS) present a tremendous technical challenge to Freedom's crew and ground support. ECLSS operation and problem analysis are time-consuming for crew members and difficult for current computerized control, monitoring, and diagnostic software. These challenges can be at least partially mitigated by the use of advanced techniques such as Model-Based Reasoning (MBR). This paper will provide an overview of MBR as it is being applied to the Space Station Freedom ECLSS. It will report on work being done to produce intelligent systems to help design, control, monitor, and diagnose Freedom's ECLSS. Specifically, work on predictive monitoring, diagnosability, and diagnosis will be discussed, with emphasis on the automated diagnosis of the regenerative water recovery and air revitalization processes.
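    The heart of model-based diagnosis as described above is comparing model predictions with observed behavior and flagging components whose residuals are too large. A minimal sketch, with hypothetical component names and numbers:

```python
def diagnose(models, observations, tol=0.05):
    """models: {component: model-predicted value};
    observations: {component: sensed value}.
    Returns the components whose observed behavior departs from the model
    by more than a relative tolerance."""
    suspects = []
    for comp, predicted in models.items():
        observed = observations.get(comp)
        if observed is None:
            continue  # no sensor reading for this component
        if abs(observed - predicted) > tol * max(abs(predicted), 1e-9):
            suspects.append(comp)
    return suspects

# Hypothetical subsystem rates: the water recovery loop is underperforming.
models = {"CO2_removal_rate": 1.00, "water_recovery_rate": 0.93}
observed = {"CO2_removal_rate": 0.99, "water_recovery_rate": 0.71}
print(diagnose(models, observed))  # -> ['water_recovery_rate']
```

    Real MBR systems go further, reasoning over component interconnections to find the minimal fault set that explains all discrepancies, but the predict-and-compare step is the entry point.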

  3. Properties of induced seismicity at the geothermal reservoir Insheim, Germany

    NASA Astrophysics Data System (ADS)

    Olbert, Kai; Küperkoch, Ludger; Meier, Thomas

    2017-04-01

    Within the framework of the German MAGS2 Project, the processing of induced events at the geothermal power plant Insheim, Germany, has been reassessed and evaluated. The power plant is located close to the western rim of the Upper Rhine Graben in a region with a strongly heterogeneous subsurface. Locating seismic events, and particularly estimating their depth, is therefore challenging. The seismic network, consisting of up to 50 stations, has an aperture of approximately 15 km around the power plant. Consequently, manual processing is time consuming. Using a waveform similarity detection algorithm, the existing dataset from 2012 to 2016 has been reprocessed to complete the catalog of induced seismic events. Based on waveform similarity, clusters of similar events have been detected. Automated P- and S-arrival time determination using an improved multi-component autoregressive prediction algorithm yields approximately 14,000 P- and S-arrivals for 758 events. Using a dataset of manual picks as reference, the automated picking algorithm has been optimized, resulting in a standard deviation of the residuals between automated and manual picks of about 0.02 s. The automated locations show uncertainties comparable to those of the manual reference dataset: 90% of the automated relocations fall within the error ellipsoid of the manual locations. The remaining locations are either poorly resolved due to low numbers of picks, or so well resolved that the automatic location lies outside the error ellipsoid although close to the manual location. The developed automated processing scheme proved to be a useful tool to supplement real-time monitoring. The event clusters are located at small patches of faults known from reflection seismic studies. The clusters are observed close to both the injection and the production wells.
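    The picker described above uses multi-component autoregressive prediction. A much-reduced single-component sketch of the prediction-error idea follows: fit an AR model to the leading noise, then pick where the one-step prediction error jumps. The window lengths, model order, threshold factor, and synthetic trace are all illustrative, not the project's actual settings.

```python
import numpy as np

def ar_onset_pick(x, noise_len=200, order=4, k=8.0):
    """Pick a phase onset as the first sample whose one-step AR prediction
    error exceeds k times the error spread seen in the leading noise window.

    x: 1-D waveform; noise_len: samples assumed to be pre-event noise.
    Returns the sample index of the pick, or None if nothing exceeds it.
    """
    x = np.asarray(x, float)
    # Fit AR coefficients to the noise window by least squares.
    rows = np.array([x[i - order:i][::-1] for i in range(order, noise_len)])
    targets = x[order:noise_len]
    coeffs, *_ = np.linalg.lstsq(rows, targets, rcond=None)
    # One-step prediction errors over the whole trace.
    preds = np.array([coeffs @ x[i - order:i][::-1] for i in range(order, len(x))])
    errs = np.abs(x[order:] - preds)
    thresh = k * (errs[:noise_len - order].std() + 1e-12)
    idx = int(np.argmax(errs > thresh))
    return idx + order if errs[idx] > thresh else None

# Toy trace: low-amplitude noise, then a stronger arrival at sample 300.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 0.1, 600)
x[300:] += 2.0 * np.sin(0.3 * np.arange(300))
pick = ar_onset_pick(x)
print(pick)
```

    Real pickers refine this with multi-component data, adaptive model orders, and onset-time corrections such as AIC minimization around the trigger.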

  4. The dorsal anterior cingulate cortex is selective for pain: Results from large-scale reverse inference

    PubMed Central

    Lieberman, Matthew D.; Eisenberger, Naomi I.

    2015-01-01

    Dorsal anterior cingulate cortex (dACC) activation is commonly observed in studies of pain, executive control, conflict monitoring, and salience processing, making it difficult to interpret the dACC’s specific psychological function. Using Neurosynth, an automated brain-mapping database of more than 10,000 functional MRI (fMRI) studies, we performed quantitative reverse inference analyses to explore the best general psychological account of dACC function, P(Ψ process|dACC activity). Results clearly indicated that the best psychological description of dACC function was related to pain processing—not executive, conflict, or salience processing. We conclude by considering that physical pain may be an instance of a broader class of survival-relevant goals monitored by the dACC, in contrast to more arbitrary temporary goals, which may be monitored by the supplementary motor area. PMID:26582792

  5. Processing and review interface for strong motion data (PRISM) software, version 1.0.0—Methodology and automated processing

    USGS Publications Warehouse

    Jones, Jeanne; Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.

  6. The "hospital central laboratory": automation, integration and clinical usefulness.

    PubMed

    Zaninotto, Martina; Plebani, Mario

    2010-07-01

    Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the pursuit of efficiency through automation and consolidation and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance, total laboratory automation being the paradigm of the idea that "human-less" robotic laboratories may allow for better operation and ensure fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operating procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing, which should allow the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.
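    Reflex testing, mentioned above, can be expressed as simple rules that map an out-of-range result to follow-up tests the automation adds on its own. A toy sketch; the test names and thresholds are illustrative only, not clinical guidance:

```python
# Hypothetical reflex-testing rules: each maps a result value to the
# follow-up tests to add. Names and cutoffs are illustrative only.
REFLEX_RULES = {
    "TSH": lambda v: ["FT4"] if v < 0.4 or v > 4.0 else [],
    "PSA": lambda v: ["free PSA"] if v > 4.0 else [],
}

def apply_reflex_rules(results):
    """Given {test: value}, return the follow-up tests the rules add."""
    followups = []
    for test, value in results.items():
        rule = REFLEX_RULES.get(test)
        if rule:
            followups.extend(rule(value))
    return followups

print(apply_reflex_rules({"TSH": 7.2, "PSA": 1.1}))  # -> ['FT4']
```

    In a real laboratory information system such rules are configured and validated against the laboratory's own guidelines rather than hard-coded.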

  7. Automated delineation and characterization of watersheds for more than 3,000 surface-water-quality monitoring stations active in 2010 in Texas

    USGS Publications Warehouse

    Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.

    2012-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.

  8. Is partially automated driving a bad idea? Observations from an on-road study.

    PubMed

    Banks, Victoria A; Eriksson, Alexander; O'Donoghue, Jim; Stanton, Neville A

    2018-04-01

    The automation of longitudinal and lateral control has enabled drivers to become "hands and feet free", but they are required to remain in an active monitoring state and to resume manual control when required. This represents the single largest allocation-of-system-function problem with vehicle automation, as the literature suggests that humans are notoriously inefficient at completing prolonged monitoring tasks. To further explore whether partially automated driving solutions can appropriately support the driver in completing their new monitoring role, video observations were collected as part of an on-road study using a Tesla Model S operated in Autopilot mode. A thematic analysis of the video data suggests that drivers are not being properly supported in adhering to their new monitoring responsibilities and instead demonstrate behaviour indicative of complacency and over-trust. These attributes may encourage drivers to take more risks whilst out on the road. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Flight-deck automation - Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand, negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilizing, and improving control and monitoring systems. Investigation into flight-deck automation systems is important, as the knowledge gained can be applied to other systems such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.

  10. Using Bayesian networks to support decision-focused information retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehner, P.; Elsaesser, C.; Seligman, L.

    This paper describes an approach to controlling the process of pulling data and information from distributed databases in a way that is specific to a person's decision-making context. Our prototype implementation of this approach uses a knowledge-based planner to generate a plan; an automatically constructed Bayesian network to evaluate the plan; specialized processing of the network to derive key information items that would substantially impact the evaluation of the plan (e.g., determine that replanning is needed); and automated construction of Standing Requests for Information (SRIs), which are automated functions that monitor changes and trends in distributed databases that are relevant to the key information items. The emphasis of this paper is on how Bayesian networks are used.
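    The plan-evaluation step rests on inference in a Bayesian network. A minimal sketch of the underlying computation: enumerate over a hidden variable to update a plan's success probability from evidence. The variables and probabilities here are hypothetical, not from the paper's networks.

```python
# Tiny network: weather -> report (noisy observation), weather -> success.
P_weather = {"clear": 0.7, "storm": 0.3}
P_report_given_weather = {        # keys are (report, weather)
    ("clear", "clear"): 0.9, ("storm", "clear"): 0.1,
    ("clear", "storm"): 0.2, ("storm", "storm"): 0.8,
}
P_success_given_weather = {"clear": 0.95, "storm": 0.4}

def p_success_given_report(report):
    """Inference by enumeration over the hidden weather variable."""
    num = den = 0.0
    for w, pw in P_weather.items():
        joint = pw * P_report_given_weather[(report, w)]
        num += joint * P_success_given_weather[w]
        den += joint
    return num / den

# A 'storm' report should lower the plan's evaluated success probability.
print(round(p_success_given_report("clear"), 3))  # -> 0.902
print(round(p_success_given_report("storm"), 3))  # -> 0.524
```

    The same enumeration pattern also answers the paper's "key information item" question: recompute the success probability with and without a piece of evidence, and monitor the items whose arrival would change the plan evaluation the most.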

  11. Digital Automation and Real-Time Monitoring of an Original Installation for "Wet Combustion" of Organic Wastes

    NASA Astrophysics Data System (ADS)

    Morozov, Yegor; Tikhomirov, Alexander A.; Saltykov, Mikhail; Trifonov, Sergey V.; Kudenko, Yurii A.

    2016-07-01

    An original method for "wet combustion" of organic wastes, which is being developed at the IBP SB RAS, is a very promising approach for regenerating nutrient solutions for plants in future spacecraft closed Bioregenerative Life Support Systems (BLSS). The method is quick and ecofriendly, does not require special conditions such as high pressure and temperature, and the resulting nitrogen remains in forms convenient for further preparation of the fertilizer. An experimental testbed of a new-generation closed ecosystem is currently being run at the IBP SB RAS to examine the compatibility of the latest technologies for accelerating the cycling. Integration of "wet combustion" of organic wastes into the information system of the closed ecosystem experimental testbed has been studied as part of preparatory work. Digital automation and real-time monitoring of the original "wet combustion" installation's operating parameters have been implemented. The new system enables remotely controlled or automatic operation of the installation. Data are stored in standard, easily processed formats, allowing further mathematical processing where necessary. During ongoing experiments on improving "wet combustion" of organic wastes, automatic monitoring can detect slight changes in process parameters and record them in greater detail. The ultimate goal of the study is to include the "wet combustion" installation in a future full-scale experiment with humans, thus reducing the time spent by the crew on life support issues while living in the BLSS. The work was carried out with the financial support of the Russian Scientific Foundation (project 14-14-00599).

  12. Automating usability of ATLAS Distributed Computing resources

    NASA Astrophysics Data System (ADS)

    Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration

    2014-06-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. From this perspective, a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes, and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the task of providing global monitoring and that of performing automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of storage area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board, with both dedicated metrics and views. The resulting structure allows the status of storage resources to be monitored with fine time granularity and automatic actions to be taken in foreseen cases, such as automatic outage handling and notifications to sites. Hence, human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human intervention achieved within the ATLAS Computing Operations team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
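    The core of an automatic blacklisting tool of this kind is an inference over the history of test outcomes. A deliberately simplified sketch follows: a sliding window with a failure-fraction threshold. The window size and threshold are illustrative, not the actual SAAB algorithm or its values.

```python
from collections import deque

class StorageAreaMonitor:
    """Toy history-based blacklisting: a storage area is blacklisted when
    too many of its recent monitoring tests fail, and reinstated once the
    recent window recovers. Thresholds are illustrative, not SAAB's.
    """
    def __init__(self, window=10, fail_fraction=0.5):
        self.history = deque(maxlen=window)   # recent pass/fail outcomes
        self.fail_fraction = fail_fraction
        self.blacklisted = False

    def record(self, test_passed):
        """Record one test outcome and update the blacklist decision."""
        self.history.append(test_passed)
        failures = self.history.count(False)
        full = len(self.history) == self.history.maxlen
        # Only decide once the window is full, to avoid hasty exclusions.
        self.blacklisted = full and failures / len(self.history) >= self.fail_fraction
        return self.blacklisted

mon = StorageAreaMonitor()
for ok in [True] * 10 + [False] * 5:
    mon.record(ok)
print(mon.blacklisted)  # -> True (5 of the last 10 tests failed)
```

    Feeding in a run of passing tests afterwards clears the flag again, which mirrors the automatic reinstatement described in the abstract.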

  13. Some Hardware and Instrumentation Aspects of the Development of an Automation System for Jar Tests in Drinking Water Treatment

    PubMed Central

    2017-01-01

    The so-called Jar Test (JT) plays a vital role in the drinking water and wastewater treatments for establishing the dosage of flocculants and coagulant. This test is a well-proved laboratory instrumental procedure performed by trained personnel. In this work, a completely novel system for the automation and monitoring of a JT devoted to drinking water treatment is presented. It has been implemented using an industrial programmable controller and sensors and instruments specifically selected for this purpose. Once the parameters of the test have been entered, the stages that compose the JT (stirring, coagulant addition, etc.) are sequentially performed without human intervention. Moreover, all the involved measurements from sensors are collected and made accessible for continuous monitoring of the process. By means of the proposed system, the JT procedure is conducted fully automatically and can be locally and remotely monitored in real-time. Furthermore, the developed system constitutes a portable laboratory that offers advantageous features like scalability and transportability. The proposed system is described focusing on hardware and instrumentation aspects, and successful results are reported. PMID:29019943

  14. Automated CFD Parameter Studies on Distributed Parallel Computers

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; Aftosmis, Michael; Pandya, Shishir; Tejnil, Edward; Ahmad, Jasim; Kwak, Dochan (Technical Monitor)

    2002-01-01

    The objective of the current work is to build a prototype software system which will automate the process of running CFD jobs on Information Power Grid (IPG) resources. This system should remove the need for user monitoring and intervention for every single CFD job. It should enable the use of many different computers to populate a massive run matrix in the shortest time possible. Such a software system has been developed, and is known as the AeroDB script system. The approach taken in the development of AeroDB was to build several discrete modules. These include a database, a job-launcher module, a run-manager module to monitor each individual job, and a web-based user portal for monitoring the progress of the parameter study. The details of the design of AeroDB are presented in the following section, along with the results of a parameter study that was performed using AeroDB for the analysis of a reusable launch vehicle (RLV). The paper concludes with a section on the lessons learned in this effort and ideas for future work in this area.
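    The module structure described (database, job launcher, run manager) follows a common job-management pattern, which can be sketched as below; all names are illustrative stand-ins, not the actual AeroDB API.

```python
import itertools

def build_run_matrix(machs, alphas):
    """Cross the parameter ranges into individual CFD cases."""
    return [{"mach": m, "alpha": a, "status": "queued"}
            for m, a in itertools.product(machs, alphas)]

def run_all(matrix, launch, poll):
    """Submit every queued case, then poll each until the matrix is done.

    launch(case) submits a job; poll(case) returns True when it finishes.
    """
    for case in matrix:
        launch(case)
        case["status"] = "running"
    while any(c["status"] == "running" for c in matrix):
        for case in matrix:
            if case["status"] == "running" and poll(case):
                case["status"] = "done"
    return matrix

# Toy stand-ins for launching and polling a solver job on a remote host.
matrix = build_run_matrix(machs=[0.8, 1.2], alphas=[0.0, 2.0, 4.0])
done = run_all(matrix, launch=lambda c: None, poll=lambda c: True)
print(len(done), all(c["status"] == "done" for c in done))  # -> 6 True
```

    A production system would add the pieces the paper names: persistent job records in a database, per-job restart logic in the run manager, and a web portal reading the same status fields.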

  15. SHARP: A multi-mission artificial intelligence system for spacecraft telemetry monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Lawson, Denise L.; James, Mark L.

    1989-01-01

    The Spacecraft Health Automated Reasoning Prototype (SHARP) is a system designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. Telecommunications link analysis of the Voyager 2 spacecraft is the initial focus for the SHARP system demonstration which will occur during Voyager's encounter with the planet Neptune in August, 1989, in parallel with real time Voyager operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. A brief introduction is given to the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory. The current method of operation for monitoring the Voyager Telecommunications subsystem is described, and the difficulties associated with the existing technology are highlighted. The approach taken in the SHARP system to overcome the current limitations is also described, as well as both the conventional and artificial intelligence solutions developed in SHARP.

  16. Automating ATLAS Computing Operations using the Site Status Board

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Borrego Iglesias, C.; Campana, S.; Di Girolamo, A.; Dzhunov, I.; Espinal Curull, X.; Gayazov, S.; Magradze, E.; Nowotka, M.; Rinaldi, L.; Saiz, P.; Schovancova, J.; Stewart, G. A.; Wright, M.

    2012-12-01

    The automation of operations is essential to reduce manpower costs and improve the reliability of the system. The Site Status Board (SSB) is a framework which allows Virtual Organizations to monitor their computing activities at distributed sites and to evaluate site performance. The ATLAS experiment intensively uses the SSB for the distributed computing shifts, for estimating data processing and data transfer efficiencies at a particular site, and for implementing automatic exclusion of sites from computing activities, in case of potential problems. The ATLAS SSB provides a real-time aggregated monitoring view and keeps the history of the monitoring metrics. Based on this history, usability of a site from the perspective of ATLAS is calculated. The paper will describe how the SSB is integrated in the ATLAS operations and computing infrastructure and will cover implementation details of the ATLAS SSB sensors and alarm system, based on the information in the SSB. It will demonstrate the positive impact of the use of the SSB on the overall performance of ATLAS computing activities and will overview future plans.

  17. Studying fish near ocean energy devices using underwater video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Hull, Ryan E.; Harker-Klimes, Genevra EL

    The effects of energy devices on fish populations are not well understood, and studying the interactions of fish with tidal and instream turbines is challenging. To address this problem, we have evaluated algorithms to automatically detect fish in underwater video and propose a semi-automated method for ocean and river energy device ecological monitoring. The key contributions of this work are the demonstration of a background subtraction algorithm (ViBE) that detected 87% of human-identified fish events and is suitable for use in a real-time system to reduce data volume, and the demonstration of a statistical model to classify detections as fish or not fish that achieved a correct classification rate of 85% overall and 92% for detections larger than 5 pixels. Specific recommendations for underwater video acquisition to better facilitate automated processing are given. The recommendations will help energy developers put effective monitoring systems in place, and could lead to a standard approach that simplifies the monitoring effort and advances the scientific understanding of the ecological impacts of ocean and river energy devices.
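    Background subtraction reduces data volume by flagging only frames that differ from a background model. The sketch below uses a simple median-image model as a stand-in for the ViBE algorithm named in the abstract, with a pixel-count threshold playing the rough role of the size-based classification; all data are synthetic.

```python
import numpy as np

def detect_foreground(frames, thresh=30, min_pixels=5):
    """Return indices of frames containing a moving object of at least
    min_pixels changed pixels, using a per-pixel median background model
    (a far simpler stand-in for ViBE).
    """
    stack = np.stack([f.astype(float) for f in frames])
    background = np.median(stack, axis=0)   # static scene estimate
    detections = []
    for i, frame in enumerate(stack):
        mask = np.abs(frame - background) > thresh
        if mask.sum() >= min_pixels:
            detections.append(i)
    return detections

# Toy clip: mostly empty frames, with a bright 3x3 'fish' in frames 4-5.
frames = [np.zeros((20, 20), np.uint8) for _ in range(8)]
for i in (4, 5):
    frames[i][8:11, 8:11] = 200
print(detect_foreground(frames))  # -> [4, 5]
```

    Passing only the flagged frames downstream is what makes such a stage suitable for real-time data reduction, as the abstract describes.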

  18. Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus; Morris, Robert A.

    2011-01-01

    Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground-based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both sets of experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can be applied generally, for example to the analysis of log files or to the online monitoring of executing systems.
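    TRACECONTRACT itself is a Scala DSL; as a language-neutral illustration of the underlying idea, the sketch below checks a hypothetical flight rule (every command must be acknowledged before the next is issued) over an event trace. The rule and event names are invented for illustration.

```python
class CommandAckMonitor:
    """Minimal analogue of a temporal flight rule of the kind a trace-
    analysis DSL expresses: every 'command' event must be acknowledged
    before the next 'command' is issued. Hypothetical rule and events.
    """
    def __init__(self):
        self.pending = None      # id of the command awaiting an ack
        self.violations = []

    def event(self, kind, cmd_id):
        if kind == "command":
            if self.pending is not None:
                self.violations.append(f"command {self.pending} never acked")
            self.pending = cmd_id
        elif kind == "ack" and cmd_id == self.pending:
            self.pending = None

def check_trace(trace):
    """Feed a whole trace through the monitor and return its violations."""
    monitor = CommandAckMonitor()
    for kind, cmd_id in trace:
        monitor.event(kind, cmd_id)
    return monitor.violations

good = [("command", 1), ("ack", 1), ("command", 2), ("ack", 2)]
bad = [("command", 1), ("command", 2), ("ack", 2)]
print(check_trace(good))  # -> []
print(check_trace(bad))   # -> ['command 1 never acked']
```

    The same monitor works offline over a log file or online over a live event stream, which is the dual use the paper highlights.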

  19. Some Hardware and Instrumentation Aspects of the Development of an Automation System for Jar Tests in Drinking Water Treatment.

    PubMed

    Calderón, Antonio José; González, Isaías

    2017-10-11

    The so-called Jar Test (JT) plays a vital role in drinking water and wastewater treatment, where it is used to establish the dosage of coagulants and flocculants. The JT is a well-proven laboratory procedure performed by trained personnel. In this work, a novel system for the automation and monitoring of a JT devoted to drinking water treatment is presented. It has been implemented using an industrial programmable controller together with sensors and instruments specifically selected for this purpose. Once the parameters of the test have been entered, the stages that compose the JT (stirring, coagulant addition, etc.) are performed sequentially without human intervention. Moreover, all the involved measurements from sensors are collected and made accessible for continuous monitoring of the process. By means of the proposed system, the JT procedure is conducted fully automatically and can be monitored locally and remotely in real time. Furthermore, the developed system constitutes a portable laboratory that offers advantageous features such as scalability and transportability. The system is described with a focus on hardware and instrumentation aspects, and successful results are reported.

  20. SHARP: A multi-mission AI system for spacecraft telemetry monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Lawson, Denise L.; James, Mark L.

    1989-01-01

    The Spacecraft Health Automated Reasoning Prototype (SHARP) is a system designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. Telecommunications link analysis of the Voyager II spacecraft is the initial focus for the SHARP system demonstration which will occur during Voyager's encounter with the planet Neptune in August, 1989, in parallel with real-time Voyager operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real-time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. A brief introduction is given to the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory. The current method of operation for monitoring the Voyager Telecommunications subsystem is described, and the difficulties associated with the existing technology are highlighted. The approach taken in the SHARP system to overcome the current limitations is also described, as well as both the conventional and artificial intelligence solutions developed in SHARP.

  1. Vibration based structural health monitoring of an arch bridge: From automated OMA to damage detection

    NASA Astrophysics Data System (ADS)

    Magalhães, F.; Cunha, A.; Caetano, E.

    2012-04-01

    In order to evaluate the usefulness of approaches based on modal parameter tracking for structural health monitoring of bridges, a dynamic monitoring system was installed in September 2007 on a concrete arch bridge in the city of Porto, Portugal. The implementation of algorithms for continuous online identification of modal parameters from structural responses to ambient excitation (automated Operational Modal Analysis) has made it possible to create a very complete database of the time evolution of the bridge's modal characteristics over more than two years. This paper describes the strategy followed to minimize the effects of environmental and operational factors on the bridge's natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, damage identification is attempted with control charts. Finally, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
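
    The processing chain described above (regress natural frequencies on environmental variables, then watch the residuals with a control chart) can be sketched as follows. This is a plain least-squares fit with Shewhart-style ±kσ limits; the paper's actual models also include dynamic regression and PCA, and the simulated temperatures and thresholds here are illustrative.

```python
import numpy as np

def fit_baseline(temp, freq):
    """Least-squares line freq ≈ a*temp + b on healthy-state data;
    returns the model and the residual standard deviation."""
    a, b = np.polyfit(temp, freq, 1)
    resid = freq - (a * temp + b)
    return (a, b), resid.std()

def control_chart(model, sigma, temp, freq, k=3.0):
    """Flag observations whose temperature-corrected residual leaves
    the ±k*sigma control limits (possible structural anomaly)."""
    a, b = model
    resid = freq - (a * temp + b)
    return np.abs(resid) > k * sigma
```

    A 0.2% shift in a 1 Hz mode is only 0.002 Hz, so removing the environmental trend first is what makes such small shifts detectable above the residual scatter.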

  2. The Standard Autonomous File Server, A Customized, Off-the-Shelf Success Story

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide quicker, more reliable, prioritized file distribution for customers of near real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned during the COTS selection and integration into SAFS, and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a standard NASA resource, leading to its nomination for NASA's Software of the Year Award in 1999.

  3. A Forest Fire Sensor Web Concept with UAVSAR

    NASA Astrophysics Data System (ADS)

    Lou, Y.; Chien, S.; Clark, D.; Doubleday, J.; Muellerschoen, R.; Zheng, Y.

    2008-12-01

    We developed a forest fire sensor web concept with a UAVSAR-based smart sensor and onboard automated response capability that allows us to monitor fire progression based on coarse initial information provided by an external source. This autonomous disturbance detection and monitoring system combines the unique capabilities of imaging radar with high-throughput onboard processing technology and onboard automated response capability based on specific science algorithms. In this forest fire sensor web scenario, a fire is initially located by MODIS/RapidFire or a ground-based fire observer. This information is transmitted to the UAVSAR onboard automated response system (CASPER). CASPER generates a flight plan to cover the alerted fire area and executes it. The onboard processor generates a fuel load map from raw radar data which, combined with wind and elevation information, is used to predict the likely fire progression. CASPER then autonomously alters the flight plan to track the fire progression, providing this information to the firefighting team on the ground. We can also relay the precise fire location to other remote sensing assets with autonomous response capability, such as the hyperspectral imager on Earth Observing-1 (EO-1), to acquire fire data.

  4. AAC Best Practice Using Automated Language Activity Monitoring.

    ERIC Educational Resources Information Center

    Hill, Katya; Romich, Barry

    This brief paper describes automated language activity monitoring (LAM), an augmentative and alternative communication (AAC) methodology for the collection, editing, and analysis of language data in structured or natural situations with people who have severe communication disorders. The LAM function records each language event (letters, words,…

  5. How does a collision warning system shape driver's brake response time? The influence of expectancy and automation complacency on real-life emergency braking.

    PubMed

    Ruscio, Daniele; Ciceri, Maria Rita; Biassoni, Federica

    2015-04-01

    Brake Reaction Time (BRT) is an important parameter for road safety. Previous research has shown that drivers' expectations can impact reaction time when facing hazardous situations, but driving with advanced driver assistance systems can change the way BRTs are considered. Interaction with a collision warning system can support faster, more efficient responses, but at the same time requires a monitoring task and evaluation process that may lead to automation complacency. The aims of the present study are to test in a real-life setting whether automation complacency can be generated by a collision warning system, and which components of expectancy impact the different tasks involved in an assisted BRT process. More specifically, four components of expectancy were investigated: presence/absence of anticipatory information, previous direct experience, reliability of the device, and predictability of the hazard determined by repeated use of the warning system. The results provide indications on perception time and mental elaboration of the collision warning system alerts. In particular, reliable warnings quickened the decision-making process; misleading warnings generated automation complacency, slowing visual search for hazard detection; lack of direct experience slowed the overall response; and unexpected failure of the device led to inattentional blindness and potential pseudo-accidents with surprise obstacle intrusion. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Trajectory-based change detection for automated characterization of forest disturbance dynamics

    Treesearch

    Robert E. Kennedy; Warren B. Cohen; Todd A. Schroeder

    2007-01-01

    Satellite sensors are well suited to monitoring changes on the Earth's surface through provision of consistent and repeatable measurements at a spatial scale appropriate for many processes causing change on the land surface. Here, we describe and test a new conceptual approach to change detection of forests using a dense temporal stack of Landsat Thematic Mapper (...

  7. Application of Intelligent Tutoring Technology to an Apparently Mechanical Task.

    ERIC Educational Resources Information Center

    Newman, Denis

    The increasing automation of many occupations leads to jobs that involve understanding and monitoring the operation of complex computer systems. One case is PATRIOT, an air defense surface-to-air missile system deployed by the U.S. Army. Radar information is processed and presented to the operators in highly abstract form. The system identifies…

  8. An automated image processing method for classification of diabetic retinopathy stages from conjunctival microvasculature images

    NASA Astrophysics Data System (ADS)

    Khansari, Maziyar M.; O'Neill, William; Penn, Richard; Blair, Norman P.; Chau, Felix; Shahidi, Mahnaz

    2017-03-01

    The conjunctiva is a densely vascularized tissue of the eye that provides an opportunity for imaging of human microcirculation. In the current study, automated fine structure analysis of conjunctival microvasculature images was performed to discriminate stages of diabetic retinopathy (DR). The study population consisted of one group of nondiabetic control subjects (NC) and three groups of diabetic subjects, with no clinical DR (NDR), non-proliferative DR (NPDR), or proliferative DR (PDR). Ordinary least squares regression and Fisher linear discriminant analyses were performed to automatically discriminate images between pairs of subject groups. Human observers who were masked to the grouping of subjects performed the same pairwise image discrimination. The automated method correctly discriminated over 80% of images of subjects with clinical DR and over 70% of those with non-clinical DR, rates higher than those achieved by the human observers. The fine structure analysis of conjunctival microvasculature images provided discrimination of DR stages and can potentially be useful for DR screening and monitoring.
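
    The Fisher linear discriminant used for the pairwise discrimination projects feature vectors onto the direction that best separates two classes. A minimal two-class sketch follows; the regularization constant and the midpoint decision rule are illustrative choices, and the synthetic data stands in for the paper's microvasculature features.

```python
import numpy as np

def fisher_direction(a, b):
    """Fisher linear discriminant for two groups of feature vectors:
    w = Sw^-1 (mean_a - mean_b), maximizing between/within scatter."""
    ma, mb = a.mean(axis=0), b.mean(axis=0)
    # Pooled within-class scatter from the two sample covariances.
    sw = np.cov(a.T) * (len(a) - 1) + np.cov(b.T) * (len(b) - 1)
    w = np.linalg.solve(sw + 1e-9 * np.eye(sw.shape[0]), ma - mb)
    return w, (ma + mb) / 2

def classify(x, w, midpoint):
    """Assign to group A when the projection lands on A's side."""
    return (x - midpoint) @ w > 0
```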

  9. Real-time monitoring of the budding index in Saccharomyces cerevisiae batch cultivations with in situ microscopy.

    PubMed

    Marbà-Ardébol, Anna-Maria; Emmerich, Jörn; Muthig, Michael; Neubauer, Peter; Junne, Stefan

    2018-05-15

    The morphology of yeast cells changes during budding, depending on the growth rate and cultivation conditions. A photo-optical microscope was adapted and used to observe such morphological changes of individual cells directly in the cell suspension. In order to obtain statistically representative samples of the population without the influence of sampling, in situ microscopy (ISM) was applied in the different phases of a Saccharomyces cerevisiae batch cultivation. The real-time measurement was performed by coupling a photo-optical probe to automated image analysis based on a neural network approach. Automatic cell recognition and classification of budding and non-budding cells was conducted successfully, and deviations between automated and manual counting were considerably small. Differentiation of growth activity across all process stages of a batch cultivation in complex media became feasible. An increased homogeneity among the population during the growth phase was clearly observable. At growth retardation, the portion of smaller cells increased due to reduced bud formation. The maturation state of the cells was monitored by determining the budding index as the ratio between the number of cells detected with buds and the total number of cells. A linear correlation between the budding index as monitored with ISM and the growth rate was found. It is shown that ISM is a meaningful analytical tool, as the budding index can provide valuable information about the growth activity of yeast cells, e.g. in seed breeding or during any other cultivation process. The determination of single-cell size and shape distributions provided information on the morphological heterogeneity among the populations. The ability to track changes in cell morphology directly online opens new perspectives for monitoring and control, both in process development and on a production scale.
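
    The budding index above is simply the fraction of budded cells, and the paper reports a linear correlation between it and the growth rate. A sketch of the index and of calibrating such a linear relation; the calibration helper and its data points are hypothetical, not the paper's measurements.

```python
import numpy as np

def budding_index(n_budding, n_total):
    """Budding index: budded cells as a fraction of all detected cells."""
    if n_total == 0:
        raise ValueError("no cells detected")
    return n_budding / n_total

def growth_rate_from_bi(bi_series, mu_series):
    """Fit a linear relation mu ≈ a*BI + b on calibration data,
    so later budding-index readings map to growth-rate estimates."""
    a, b = np.polyfit(bi_series, mu_series, 1)
    return lambda bi: a * bi + b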

  10. Application of SAR remote sensing and crop modeling for operational rice crop monitoring in South and South East Asian Countries

    NASA Astrophysics Data System (ADS)

    Setiyono, T. D.; Holecz, F.; Khan, N. I.; Barbieri, M.; Maunahan, A. A.; Gatti, L.; Quicho, E. D.; Pazhanivelan, S.; Campos-Taberner, M.; Collivignarelli, F.; Haro, J. G.; Intrman, A.; Phuong, D.; Boschetti, M.; Prasadini, P.; Busetto, L.; Minh, V. Q.; Tuan, V. Q.

    2017-12-01

    This study uses multi-temporal SAR imagery, automated image processing, rule-based classification and field observations to classify rice in multiple locations in South and South Asian countries and assimilate the information into ORYZA Crop Growth Simulation Model (CGSM) to monitor rice yield. The study demonstrates examples of operational application of this rice monitoring system in: (1) detecting drought impact on rice planting in Central Thailand and Tamil Nadu, India, (2) mapping heat stress impact on rice yield in Andhra Pradesh, India, and (3) generating historical rice yield data for districts in Red River Delta, Vietnam.

  11. Improving medical stores management through automation and effective communication.

    PubMed

    Kumar, Ashok; Cariappa, M P; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Medical stores management in hospitals is a tedious and time consuming chore with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is being challenged by the increasing inventory loads and escalating budgets for procurement of drugs. We carried out an indepth case study at the Medical Stores of a tertiary care health care facility. An iterative six step Quality Improvement (QI) process was implemented based on the Plan-Do-Study-Act (PDSA) cycle. The QI process was modified as per requirement to fit the medical stores management model. The results were evaluated after six months. After the implementation of QI process, 55 drugs of the medical store inventory which had expired since 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to depletion of stocks. The monthly expense summary of drugs was now being done within ten days of the closing month. Improving communication systems within the hospital with vendor database management and reaching out to clinicians is important. Automation of inventory management requires to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of stores. Staff training and standardized documentation protocols are the other keystones for optimal medical store management.

  12. First On-Site Data Analysis System for Subaru/Suprime-Cam

    NASA Astrophysics Data System (ADS)

    Furusawa, Hisanori; Okura, Yuki; Mineo, Sogo; Takata, Tadafumi; Nakata, Fumiaki; Tanaka, Manobu; Katayama, Nobuhiko; Itoh, Ryosuke; Yasuda, Naoki; Miyazaki, Satoshi; Komiyama, Yutaka; Utsumi, Yousuke; Uchida, Tomohisa; Aihara, Hiroaki

    2011-03-01

    We developed an automated on-site quick analysis system for mosaic CCD data of Suprime-Cam, which is a wide-field camera mounted at the prime focus of the Subaru Telescope, Mauna Kea, Hawaii. The first version of the data-analysis system was constructed, and started to operate in general observations. This system is a new function of observing support at the Subaru Telescope to provide the Subaru user community with an automated on-site data evaluation, aiming at improvements of observers' productivity, especially in large imaging surveys. The new system assists the data evaluation tasks in observations by the continuous monitoring of the characteristics of every data frame during observations. The evaluation results and data frames processed by this system are also useful for reducing the data-processing time in a full analysis after an observation. The primary analysis functions implemented in the data-analysis system are composed of automated realtime analysis for data evaluation and on-demand analysis, which is executed upon request, including mosaicing analysis and flat making analysis. In data evaluation, which is controlled by the organizing software, the database keeps track of the analysis histories, as well as the evaluated values of data frames, including seeing and sky background levels; it also helps in the selection of frames for mosaicing and flat making analysis. We examined the system performance and confirmed an improvement in the data-processing time by a factor of 9 with the aid of distributed parallel data processing and on-memory data processing, which makes the automated data evaluation effective.

  13. Remote voice training: A case study on space shuttle applications, appendix C

    NASA Technical Reports Server (NTRS)

    Mollakarimi, Cindy; Hamid, Tamin

    1990-01-01

    The Tile Automation System includes applications of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. An integrated set of rapid prototyping testbeds was developed which include speech recognition and synthesis, laser imaging systems, distributed Ada programming environments, distributed relational data base architectures, distributed computer network architectures, multi-media workbenches, and human factors considerations. Remote voice training in the Tile Automation System is discussed. The user is prompted over a headset by synthesized speech for the training sequences. The voice recognition units and the voice output units are remote from the user and are connected by Ethernet to the main computer system. A supervisory channel is used to monitor the training sequences. Discussions include the training approaches as well as the human factors problems and solutions for this system utilizing remote training techniques.

  14. Advances in in situ inspection of automated fiber placement systems

    NASA Astrophysics Data System (ADS)

    Juarez, Peter D.; Cramer, K. Elliott; Seebo, Jeffrey P.

    2016-05-01

    Automated Fiber Placement (AFP) systems have been developed to help take advantage of the tailorability of composite structures in aerospace applications. AFP systems allow the repeatable placement of uncured, spool fed, preimpregnated carbon fiber tape (tows) onto substrates in desired thicknesses and orientations. This automated process can incur defects, such as overlapping tow lines, which can severely undermine the structural integrity of the part. Current defect detection and abatement methods are very labor intensive, and still mostly rely on human manual inspection. Proposed is a thermographic in situ inspection technique which monitors tow placement with an on board thermal camera using the preheated substrate as a through transmission heat source. An investigation of the concept is conducted, and preliminary laboratory results are presented. Also included will be a brief overview of other emerging technologies that tackle the same issue.

  15. Utility of an automated thermal-based approach for monitoring evapotranspiration

    USDA-ARS?s Scientific Manuscript database

    A very simple remote sensing-based model for water use monitoring is presented. The model acronym DATTUTDUT, (Deriving Atmosphere Turbulent Transport Useful To Dummies Using Temperature) is a Dutch word which loosely translates as “It’s unbelievable that it works”. DATTUTDUT is fully automated and o...

  16. Automated Iodine Monitoring System Development (AIMS). [shuttle prototype

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The operating principle of the automated iodine monitoring/controller system (AIMS) is described along with several design modifications. The iodine addition system is also discussed along with test setups and calibration; a facsimile of the optical/mechanical portion of the iodine monitor was fabricated and tested. The appendices include information on shuttle prototype AIMS, preliminary prime item development specifications, preliminary failure modes and effects analysis, and preliminary operating and maintenance instructions.

  17. Automated data selection method to improve robustness of diffuse optical tomography for breast cancer imaging

    PubMed Central

    Vavadi, Hamed; Zhu, Quing

    2016-01-01

    Imaging-guided near infrared diffuse optical tomography (DOT) has demonstrated a great potential as an adjunct modality for differentiation of malignant and benign breast lesions and for monitoring treatment response of breast cancers. However, diffused light measurements are sensitive to artifacts caused by outliers and errors in measurements due to probe-tissue coupling, patient and probe motions, and tissue heterogeneity. In general, pre-processing of the measurements is needed by experienced users to manually remove these outliers and therefore reduce imaging artifacts. An automated method of outlier removal, data selection, and filtering for diffuse optical tomography is introduced in this manuscript. This method consists of multiple steps to first combine several data sets collected from the same patient at contralateral normal breast and form a single robust reference data set using statistical tests and linear fitting of the measurements. The second step improves the perturbation measurements by filtering out outliers from the lesion site measurements using model based analysis. The results of 20 malignant and benign cases show similar performance between manual data processing and automated processing and improvement in tissue characterization of malignant to benign ratio by about 27%. PMID:27867711

  18. A prototype of an automated high resolution InSAR volcano-monitoring system in the MED-SUV project

    NASA Astrophysics Data System (ADS)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas

    2016-04-01

    Volcanic processes which produce a variety of geological and hydrological hazards are difficult to predict and capable of triggering natural disasters on regional to global scales. Therefore it is important to monitor volcano continuously and with a high spatial and temporal sampling rate. The monitoring of active volcanoes requires the reliable measurement of surface deformation before, during and after volcanic activities and it helps for the better understanding and modelling of the involved geophysical processes. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and small baseline subset algorithm (SBAS) provide a powerful tool for observing the eruptive activities and measuring the surface changes of millimetre accuracy. All the mentioned techniques with deformation time series extraction address the challenges by exploiting medium to large SAR image stacks. The process of selecting, ordering, downloading, storing, logging, extracting and preparing the data for processing is very time consuming has to be done manually for every single data-stack. In many cases it is even an iterative process which has to be done regularly and continuously. Therefore, data processing becomes slow which causes significant delays in data delivery. The SAR Satellite based High Resolution Data Acquisition System, which will be developed at DLR, will automate this entire time consuming tasks and allows an operational volcano monitoring system. Every 24 hours the system runs for searching new acquired scene over the volcanoes and keeps track of the data orders, log the status and download the provided data via ftp-transfer including E-Mail alert. Furthermore, the system will deliver specified reports and maps to a database for review and use by specialists. The user interaction will be minimized and iterative processes will be totally avoided. 
In this presentation, a prototype of SAR Satellite based High Resolution Data Acquisition System, which is developed and operated by DLR, will be described in detail. The workflow of the developed system is described which allow a meaningful contribution of SAR for monitoring volcanic eruptive activities. A more robust and efficient InSAR data processing in IWAP processor will be introduced in the framework of a remote sensing task of MED-SUV project. An application of the developed prototype system to a historic eruption of Mount Etna and Piton de la Fournaise will be depicted in the last part of the presentation.

  19. Automation study for space station subsystems and mission ground support

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An automation concept for the autonomous operation of space station subsystems, i.e., electric power, thermal control, and communications and tracking are discussed. To assure that functions essential for autonomous operations are not neglected, an operations function (systems monitoring and control) is included in the discussion. It is recommended that automated speech recognition and synthesis be considered a basic mode of man/machine interaction for space station command and control, and that the data management system (DMS) and other systems on the space station be designed to accommodate fully automated fault detection, isolation, and recovery within the system monitoring function of the DMS.

  20. Automated iodine monitor system. [for aqueous solutions

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The feasibility of a direct spectrophotometric measurement of iodine in water was established. An iodine colorimeter, was built to demonstrate the practicality of this technique. The specificity of this method was verified when applied to an on-line system where a reference solution cannot be used, and a preliminary design is presented for an automated iodine measuring and controlling system meeting the desired specifications. An Automated iodine monitor/controller system based on this preliminary design was built, tested, and delivered to the Johnson Space Center.

  1. Technicians monitor USMP-4 experiments being prepared for flight on STS-87 in the SSPF

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Technicians are monitoring experiments on the United States Microgravity Payload-4 (USMP-4) in preparation for its scheduled launch aboard STS-87 on Nov. 19 from Kennedy Space Center (KSC). USMP-4 experiments are prepared in the Space Station Processing Facility at KSC. The large white vertical cylinder in the center of the photo is the Advanced Automated Directional Solidification Furnace (AADSF), which is a sophisticated materials science facility used for studying a common method of processing semiconductor crystals called directional solidification. The white horizontal tube to the right is the Isothermal Dendritic Growth Experiment (IDGE), which will be used to study the dendritic solidification of molten materials in the microgravity environment.

  2. Technicians monitor USMP-4 experiments being prepared for flight on STS-87 in the SSPF

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Technicians are monitoring experiments on the United States Microgravity Payload-4 (USMP-4) in preparation for its scheduled launch aboard STS-87 on Nov. 19 from Kennedy Space Center (KSC). USMP-4 experiments are prepared in the Space Station Processing Facility at KSC. The large white vertical cylinder at the right of the photo is the Advanced Automated Directional Solidification Furnace (AADSF ), which is a sophisticated materials science facility used for studying a common method of processing semiconductor crystals called directional solidification. The technician in the middle of the photo is leaning over MEPHISTO, a cooperative American-French investigation of the fundamentals of crystal growth.

  3. Two-dimensional thermal video analysis of offshore bird and bat flight

    DOE PAGES

    Matzner, Shari; Cullinan, Valerie I.; Duberstein, Corey A.

    2015-09-11

    Thermal infrared video can provide essential information about bird and bat presence and activity for risk assessment studies, but the analysis of recorded video can be time-consuming and may not extract all of the available information. Automated processing makes continuous monitoring over extended periods of time feasible, and maximizes the information provided by video. This is especially important for collecting data in remote locations that are difficult for human observers to access, such as proposed offshore wind turbine sites. We present guidelines for selecting an appropriate thermal camera based on environmental conditions and the physical characteristics of the target animals.more » We developed new video image processing algorithms that automate the extraction of bird and bat flight tracks from thermal video, and that characterize the extracted tracks to support animal identification and behavior inference. The algorithms use a video peak store process followed by background masking and perceptual grouping to extract flight tracks. The extracted tracks are automatically quantified in terms that could then be used to infer animal type and possibly behavior. The developed automated processing generates results that are reproducible and verifiable, and reduces the total amount of video data that must be retained and reviewed by human experts. Finally, we suggest models for interpreting thermal imaging information.« less

  4. Two-dimensional thermal video analysis of offshore bird and bat flight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Cullinan, Valerie I.; Duberstein, Corey A.

    Thermal infrared video can provide essential information about bird and bat presence and activity for risk assessment studies, but the analysis of recorded video can be time-consuming and may not extract all of the available information. Automated processing makes continuous monitoring over extended periods of time feasible, and maximizes the information provided by video. This is especially important for collecting data in remote locations that are difficult for human observers to access, such as proposed offshore wind turbine sites. We present guidelines for selecting an appropriate thermal camera based on environmental conditions and the physical characteristics of the target animals. We developed new video image processing algorithms that automate the extraction of bird and bat flight tracks from thermal video, and that characterize the extracted tracks to support animal identification and behavior inference. The algorithms use a video peak store process followed by background masking and perceptual grouping to extract flight tracks. The extracted tracks are automatically quantified in terms that could then be used to infer animal type and possibly behavior. The developed automated processing generates results that are reproducible and verifiable, and reduces the total amount of video data that must be retained and reviewed by human experts. Finally, we suggest models for interpreting thermal imaging information.

  5. A wireless smart sensor network for automated monitoring of cable tension

    NASA Astrophysics Data System (ADS)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jong-Woong; Cho, Soojin; Spencer, Billie F., Jr.; Jung, Hyung-Jo

    2014-02-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea.
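
    Vibration-based tension estimation rests on the relation between a cable's natural frequencies and its tension. As a rough sketch of the underlying physics (the taut-string approximation, neglecting sag and bending stiffness, which practical estimators such as the one on this system correct for):

```python
def cable_tension(freq_hz, mode_n, length_m, mass_per_m):
    # Taut-string approximation: f_n = (n / 2L) * sqrt(T / m), so
    # T = 4 * m * (L * f_n / n)**2, where m is mass per unit length.
    # Real in-network estimators refine this for sag and bending stiffness.
    return 4.0 * mass_per_m * (length_m * freq_hz / mode_n) ** 2
```

    For example, a 100 m cable of 50 kg/m vibrating at 1.0 Hz in its first mode carries roughly 2 MN of tension under this approximation. Computing this on the sensor node itself is what lets the network transmit a single tension value instead of raw acceleration records.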

  6. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, with processes interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is light-weight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running simulations becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
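
    The "farming" behavior, restarting interrupted simulations without paging a human, can be pictured as a watchdog loop. This is a toy sketch only; SiMon's actual interface, scheduling, and restart policy differ.

```python
import subprocess
import time

def farm(commands, max_restarts=3, poll_s=1.0):
    # Crop-farming analogy: launch every simulation, then keep checking on
    # the "field"; a process that died with a nonzero exit code gets replanted
    # (restarted) up to max_restarts times instead of waiting for a human.
    procs = {cmd: subprocess.Popen(cmd, shell=True) for cmd in commands}
    restarts = {cmd: 0 for cmd in commands}
    while procs:
        time.sleep(poll_s)
        for cmd, p in list(procs.items()):
            rc = p.poll()
            if rc is None:
                continue              # still running
            del procs[cmd]            # finished (cleanly or not)
            if rc != 0 and restarts[cmd] < max_restarts:
                restarts[cmd] += 1
                procs[cmd] = subprocess.Popen(cmd, shell=True)
    return restarts
```

    A real farmer would also checkpoint and resume rather than restart from scratch, and would detect hangs, not just crashes; both are outside this sketch.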

  7. HYDRA: A Middleware-Oriented Integrated Architecture for e-Procurement in Supply Chains

    NASA Astrophysics Data System (ADS)

    Alor-Hernandez, Giner; Aguilar-Lasserre, Alberto; Juarez-Martinez, Ulises; Posada-Gomez, Ruben; Cortes-Robles, Guillermo; Garcia-Martinez, Mario Alberto; Gomez-Berbis, Juan Miguel; Rodriguez-Gonzalez, Alejandro

    The Service-Oriented Architecture (SOA) development paradigm has emerged to improve the critical issues of creating, modifying and extending solutions for business process integration, incorporating process automation and automated exchange of information between organizations. Web services technology follows the SOA's principles for developing and deploying applications. Moreover, Web services are considered the platform for SOA, for both intra- and inter-enterprise communication. However, an SOA does not incorporate information about occurring events into business processes, which are the main features of supply chain management. These events and information delivery are addressed in an Event-Driven Architecture (EDA). Taking this into account, we propose a middleware-oriented integrated architecture that offers a brokering service for the procurement of products in a Supply Chain Management (SCM) scenario. As salient contributions, our system provides a hybrid architecture combining features of both SOA and EDA and a set of mechanisms for business process pattern management, monitoring based on UML sequence diagrams, Web services-based management, event publish/subscription and reliable messaging service.

  8. Sensors of vibration and acoustic emission for monitoring of boring with skiving cutters

    NASA Astrophysics Data System (ADS)

    Shamarin, N. N.; Filippov, A. V.; Podgornyh, O. A.; Filippova, E. O.

    2017-01-01

    Diagnosing processing system conditions is a key area in automation of modern machinery production. The article presents the results of a preliminary experimental research of the boring process using conventional and skiving cutters under the conditions of the low stiffness processing system. Acoustic emission and vibration sensors are used for cutting process diagnosis. Surface roughness after machining is determined using a laser scanning microscope. As a result, it is found that the use of skiving cutters provides greater stability of the cutting process and lower surface roughness as compared with conventional cutters.

  9. Continuous welding of unidirectional fiber reinforced thermoplastic tape material

    NASA Astrophysics Data System (ADS)

    Schledjewski, Ralf

    2017-10-01

    Continuous welding techniques like thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes like autoclave consolidation, thermoforming, etc. However, several important processing issues still need to be solved before it becomes a viable economic process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition and simulation development. Today, process simulation is capable of predicting the resulting consolidation quality. The effects of material imperfections and process parameter variations are well known. But using this knowledge to control the process, based on online process monitoring and corresponding adaptation of the process parameters, is still challenging. It requires solving inverse problems and using methods for automated code generation that allow fast implementation of algorithms on target hardware. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques and how to use them for a model-based process control system are presented.

  10. Image Decoding of Photonic Crystal Beads Array in the Microfluidic Chip for Multiplex Assays

    PubMed Central

    Yuan, Junjie; Zhao, Xiangwei; Wang, Xiaoxia; Gu, Zhongze

    2014-01-01

    Along with the miniaturization and intellectualization of biomedical instruments, the increasing demand for health monitoring anywhere and at any time elevates the need for the development of point-of-care testing (POCT). Photonic crystal beads (PCBs), as a promising kind of encoded microcarrier, can be integrated with microfluidic chips in order to realize cost-effective and highly sensitive multiplex bioassays. However, automated analysis of PCBs is difficult due to their characteristics and the unique detection manner. In this paper, we propose a strategy that takes advantage of automated image processing for the color decoding of the PCBs array in the microfluidic chip for multiplex assays. By processing and alignment of two modal images, epi-fluorescence and epi-white light, every intact bead in the image is accurately extracted and decoded by its PC color, which stands for the target species. This method, which shows high robustness and accuracy under various configurations, eliminates the high hardware requirement of spectroscopy analysis and user-interaction software, and provides adequate support for the general automated analysis of POCT based on PCBs array. PMID:25341876
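
    The color-decoding step itself can be illustrated as a nearest-reference-color lookup over each segmented bead. This is a simplification with an invented palette; the paper's pipeline also handles bead extraction and alignment of the two modal images, which are omitted here.

```python
import numpy as np

def decode_bead(rgb_patch, code_colors):
    # A bead's structural (photonic crystal) color is averaged over its pixel
    # patch and compared against a reference palette; the nearest reference
    # color yields the bead's code, i.e. which target species it carries.
    mean_rgb = np.asarray(rgb_patch, dtype=float).reshape(-1, 3).mean(axis=0)
    names = list(code_colors)
    dists = [np.linalg.norm(mean_rgb - np.asarray(code_colors[n], dtype=float))
             for n in names]
    return names[int(np.argmin(dists))]
```

    In practice the fluorescence channel would carry the assay readout while the white-light channel carries the code; this sketch covers only the latter.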

  11. Timeliner: Automating Procedures on the ISS

    NASA Technical Reports Server (NTRS)

    Brown, Robert; Braunstein, E.; Brunet, Rick; Grace, R.; Vu, T.; Zimpfer, Doug; Dwyer, William K.; Robinson, Emily

    2002-01-01

    Timeliner has been developed as a tool to automate procedural tasks. These tasks may be sequential tasks that would typically be performed by a human operator, or precisely ordered sequencing tasks that allow autonomous execution of a control process. The Timeliner system includes elements for compiling and executing sequences that are defined in the Timeliner language. The Timeliner language was specifically designed to allow easy definition of scripts that provide sequencing and control of complex systems. The execution environment provides real-time monitoring and control based on the commands and conditions defined in the Timeliner language. The Timeliner sequence control may be preprogrammed, compiled from Timeliner "scripts," or it may consist of real-time, interactive inputs from system operators. In general, the Timeliner system lowers the workload for mission or process control operations. In a mission environment, scripts can be used to automate spacecraft operations including autonomous or interactive vehicle control, performance of preflight and post-flight subsystem checkouts, or handling of failure detection and recovery. Timeliner may also be used for mission payload operations, such as stepping through pre-defined procedures of a scientific experiment.

  12. Comparison of a brain-based adaptive system and a manual adaptable system for invoking automation.

    PubMed

    Bailey, Nathan R; Scerbo, Mark W; Freeman, Frederick G; Mikulka, Peter J; Scott, Lorissa A

    2006-01-01

    Two experiments are presented examining adaptive and adaptable methods for invoking automation. Empirical investigations of adaptive automation have focused on methods used to invoke automation or on automation-related performance implications. However, no research has addressed whether performance benefits associated with brain-based systems exceed those in which users have control over task allocations. Participants performed monitoring and resource management tasks as well as a tracking task that shifted between automatic and manual modes. In the first experiment, participants worked with an adaptive system that used their electroencephalographic signals to switch the tracking task between automatic and manual modes. Participants were also divided between high- and low-reliability conditions for the system-monitoring task as well as high- and low-complacency potential. For the second experiment, participants operated an adaptable system that gave them manual control over task allocations. Results indicated increased situation awareness (SA) of gauge instrument settings for individuals high in complacency potential using the adaptive system. In addition, participants who had control over automation performed more poorly on the resource management task and reported higher levels of workload. A comparison between systems also revealed enhanced SA of gauge instrument settings and decreased workload in the adaptive condition. The present results suggest that brain-based adaptive automation systems may enhance perceptual level SA while reducing mental workload relative to systems requiring user-initiated control. Potential applications include automated systems for which operator monitoring performance and high-workload conditions are of concern.

  13. Passive Seismic Monitoring for Rockfall at Yucca Mountain: Concept Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, J; Twilley, K; Murvosh, H

    2003-03-03

    For the purpose of proof-testing a system intended to remotely monitor rockfall inside a potential radioactive waste repository at Yucca Mountain, a system of seismic sub-arrays will be deployed and tested on the surface of the mountain. The goal is to identify and locate rockfall events remotely using automated data collecting and processing techniques. We install seismometers on the ground surface, generate seismic energy to simulate rockfall in underground space beneath the array, and interpret the surface response to discriminate and locate the event. Data will be analyzed using matched-field processing, a generalized beam forming method for localizing discrete signals. Software is being developed to facilitate the processing. To date, a three-component sub-array has been installed and successfully tested.
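
    Matched-field processing generalizes delay-and-sum beamforming: candidate source locations are scanned, each sensor trace is time-shifted to undo the modeled propagation delay, and the location where the traces stack most coherently wins. A bare-bones sketch under a constant-velocity, straight-ray assumption (the names, the grid search, and the velocity value are illustrative, not the project's software):

```python
import numpy as np

def locate(signals, sensor_xy, grid_xy, fs, c=2000.0):
    # For each candidate grid point, compute the travel time to every sensor,
    # shift each trace back by that delay, and score how coherently the
    # shifted traces stack; the true source location stacks best.
    best, best_power = None, -np.inf
    for g in grid_xy:
        delays = np.linalg.norm(sensor_xy - g, axis=1) / c     # seconds
        shifts = np.round(delays * fs).astype(int)             # samples
        stack = sum(np.roll(s, -k) for s, k in zip(signals, shifts))
        power = float(np.sum(stack ** 2))
        if power > best_power:
            best, best_power = g, power
    return best
```

    Real matched-field processing replaces the straight-ray delay model with a full propagation model of the medium, which is what makes it suitable for complex geology like Yucca Mountain.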

  14. Improved maintainability of space-based reusable rocket engines

    NASA Technical Reports Server (NTRS)

    Barkhoudarian, S.; Szemenyei, B.; Nelson, R. S.; Pauckert, R.; Harmon, T.

    1988-01-01

    Advanced, noninferential, noncontacting, in situ measurement technologies, combined with automated testing and expert systems, can provide continuous, automated health monitoring of critical space-based rocket engine components, requiring minimal disassembly and no manual data analysis, thus enhancing their maintainability. This paper concentrates on recent progress of noncontacting combustion chamber wall thickness condition-monitoring technologies.

  15. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimized production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. When this technology is applied to the gamma radiation process, control will be based on monitoring key parameters such as time and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  16. Network-based production quality control

    NASA Astrophysics Data System (ADS)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

    This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. Recent emphasis on product quality and reduction of waste stems from the dynamic, globalized and customer-driven market, which brings opportunities and threats to companies, depending on their response speed and production strategies. Current trends in industry also include the wide spread of distributed manufacturing systems, where design, production, and management facilities are geographically dispersed. This situation mandates not only access to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete under such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing" or "EQM" for short. By definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such a system lets designers located far from the production facility monitor, control and adjust the quality inspection processes as the production design evolves.

  17. A Sensor Data Fusion System Based on k-Nearest Neighbor Pattern Classification for Structural Health Monitoring Applications

    PubMed Central

    Vitola, Jaime; Pozo, Francesc; Tibaduiza, Diego A.; Anaya, Maribel

    2017-01-01

    Civil and military structures are susceptible and vulnerable to damage due to environmental and operational conditions. Therefore, the implementation of technology to provide robust solutions in damage identification (by using signals acquired directly from the structure) is a requirement to reduce operational and maintenance costs. In this sense, the use of sensors permanently attached to the structures has demonstrated great versatility and benefit, since the inspection system can be automated. This automation is carried out with signal processing tasks with the aim of a pattern recognition analysis. This work presents a detailed description of a structural health monitoring (SHM) system based on the use of a piezoelectric (PZT) active system. The SHM system includes: (i) the use of a piezoelectric sensor network to excite the structure and collect the measured dynamic response, in several actuation phases; (ii) data organization; (iii) advanced signal processing techniques to define the feature vectors; and finally, (iv) the nearest neighbor algorithm as a machine learning approach to classify different kinds of damage. A description of the experimental setup, the experimental validation and a discussion of the results from two different structures are included and analyzed. PMID:28230796

  18. A Sensor Data Fusion System Based on k-Nearest Neighbor Pattern Classification for Structural Health Monitoring Applications.

    PubMed

    Vitola, Jaime; Pozo, Francesc; Tibaduiza, Diego A; Anaya, Maribel

    2017-02-21

    Civil and military structures are susceptible and vulnerable to damage due to environmental and operational conditions. Therefore, the implementation of technology to provide robust solutions in damage identification (by using signals acquired directly from the structure) is a requirement to reduce operational and maintenance costs. In this sense, the use of sensors permanently attached to the structures has demonstrated great versatility and benefit, since the inspection system can be automated. This automation is carried out with signal processing tasks with the aim of a pattern recognition analysis. This work presents a detailed description of a structural health monitoring (SHM) system based on the use of a piezoelectric (PZT) active system. The SHM system includes: (i) the use of a piezoelectric sensor network to excite the structure and collect the measured dynamic response, in several actuation phases; (ii) data organization; (iii) advanced signal processing techniques to define the feature vectors; and finally, (iv) the nearest neighbor algorithm as a machine learning approach to classify different kinds of damage. A description of the experimental setup, the experimental validation and a discussion of the results from two different structures are included and analyzed.
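
    Step (iv), the nearest neighbor classifier, reduces to a distance search and a majority vote over training feature vectors. A minimal sketch with synthetic two-dimensional features; in the paper the feature vectors come from the advanced signal processing of the PZT actuation phases, not from the toy data used here.

```python
import numpy as np

def knn_classify(train_X, train_y, x, k=3):
    # k-nearest-neighbor: find the k training feature vectors closest to x
    # (Euclidean distance) and return the majority label among them.
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(d)[:k]
    votes = [train_y[i] for i in nearest]
    return max(set(votes), key=votes.count)
```

    With well-separated damage signatures in feature space, even this unweighted vote distinguishes damage classes; practical SHM systems add feature scaling and cross-validated choice of k.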

  19. Synthetic Aperture Radar (SAR)-based paddy rice monitoring system: Development and application in key rice producing areas in Tropical Asia

    NASA Astrophysics Data System (ADS)

    Setiyono, T. D.; Holecz, F.; Khan, N. I.; Barbieri, M.; Quicho, E.; Collivignarelli, F.; Maunahan, A.; Gatti, L.; Romuga, G. C.

    2017-01-01

    Reliable and regular rice information is an essential part of many countries’ national accounting process, but existing systems may not be sufficient to meet the information demand in the context of food security and policy. Synthetic Aperture Radar (SAR) imagery is highly suitable for detecting lowland paddy rice, especially in tropical regions where pervasive cloud cover in the rainy seasons limits the use of optical imagery. This study uses multi-temporal X-band and C-band SAR imagery, automated image processing, rule-based classification and field observations to classify rice in multiple locations across Tropical Asia and assimilates the information into the ORYZA Crop Growth Simulation Model (CGSM) to generate high-resolution yield maps. The resulting cultivated rice area maps had classification accuracies above 85%, and yield estimates were within 81-93% agreement with district-level reported yields. The study sites capture much of the diversity in water management, crop establishment and rice maturity durations, and the study demonstrates the feasibility of rice detection, yield monitoring, and damage assessment in the case of climate disaster at national and supra-national scales using multi-temporal SAR imagery combined with the CGSM and automated methods.
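
    Rule-based SAR rice classification keys on the temporal backscatter signature of flooded paddy: a deep minimum during flooding (specular reflection off smooth water pushes backscatter down) followed by a steep rise as the canopy grows. A toy version with illustrative thresholds, not the study's calibrated rules:

```python
def is_paddy_rice(sigma0_db):
    # sigma0_db: multi-temporal backscatter (dB) for one pixel or field.
    # Rule 1: a flooding minimum well below typical land-cover backscatter.
    # Rule 2: a large subsequent dynamic range as the rice canopy develops.
    # Both thresholds here are placeholders for the calibrated values a real
    # rule set would derive from field observations.
    lo = min(sigma0_db)
    rise = max(sigma0_db) - lo
    return lo < -15.0 and rise > 8.0
```

    Stable targets such as forest or built-up areas show neither the flooding dip nor the rapid rise, which is what lets a simple temporal rule separate paddy from other classes before yields are simulated.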

  20. Automation and Process Improvement Enables a Small Team to Operate a Low Thrust Mission in Orbit Around the Asteroid Vesta

    NASA Technical Reports Server (NTRS)

    Weise, Timothy M

    2012-01-01

    NASA's Dawn mission to the asteroid Vesta and dwarf planet Ceres launched September 27, 2007 and arrived at Vesta in July of 2011. This mission uses ion propulsion to achieve the necessary delta-V to reach and maneuver at Vesta and Ceres. This paper shows how the evolution of ground system automation and process improvement allowed a relatively small engineering team to transition from cruise operations to asteroid operations while maintaining robust processes. The cruise-to-Vesta phase lasted almost 4 years and consisted of activities that were built with software tools, but each tool was open loop and required engineers to review the output to ensure consistency. The same time period was characterized by the evolution from manually retrieved and reviewed data products to automatically generated data products and data value checking. Furthermore, the team originally took about three to four weeks to design and build about four weeks of spacecraft activities, with spacecraft contacts only once a week. Operations around the asteroid Vesta increased the tempo dramatically by transitioning from one contact a week, to three or four contacts a week, to fourteen contacts a week (every 12 hours). This was accompanied by a similar increase in activity complexity as well as very fast turnaround activity design and build cycles. The design process became more automated and the tools became closed loop, allowing the team to build more activities without sacrificing rigor. Additionally, these activities were dependent on the results of flight system performance, so more automation was added to analyze the flight data and provide results in a timely fashion to feed the design cycle. All of this automation and process improvement enabled the engineers to focus on other aspects of spacecraft operations, including spacecraft health monitoring and anomaly resolution.

  1. Feasibility of the Scalable, Automated, Semipermanent Seismic Array (SASSA) to Monitor Possible Carbon Dioxide Migration

    NASA Astrophysics Data System (ADS)

    Livers, A. J.; Burnison, S. A.; Salako, O.; Barajas-Olalde, C.; Hamling, J. A.; Gorecki, C. D.

    2016-12-01

    The feasibility of monitoring potential carbon dioxide (CO2) migration in a reservoir using a sparse seismic array is being evaluated by the Energy & Environmental Research Center (EERC) at the Denbury Onshore LLC-operated Bell Creek oil field in Montana, which is undergoing commercial CO2 enhanced oil recovery (EOR). This new method may provide an economical means of continuously monitoring the CO2 plume edge and the CO2 reservoir boundaries and/or to interpret vertical or lateral out-of-reservoir CO2 migration. A 96-station scalable, automated, semipermanent seismic array (SASSA) was deployed in October 2015 to detect and track CO2 plume migration not by imaging, but by monitoring discrete source-receiver midpoints. Midpoints were strategically located within and around four injector-producer patterns covering approximately one square mile. Three-dimensional (3-D) geophysical ray tracing was used to determine surface receiver locations. Receivers used were FairfieldNodal Zland three-component, autonomous, battery-powered nodes. A GISCO ESS850 accelerated weight drop source located in a secure structure was remotely fired on a weekly basis for one calendar year, including a two-month period prior to initiation of CO2 injection to establish a baseline. Fifty shots were fired one day each week to facilitate increased signal-to-noise through novel receiver domain processing and vertical stacking. Receiver domain processing allowed for individualization of processing parameters to maximize signal enhancement and noise attenuation. Reflection events in the processed SASSA data correlate well to 3-D surface survey data collected in the field. Preliminary time-lapse data results for several individual SASSA receivers show a phase shift in the reflection events below the reservoir after injection, suggesting possible migration of the CO2 in the reservoir to the corresponding midpoint locations. This work is supported by the U.S. 
Department of Energy National Energy Technology Laboratory under Award No. FE0012665.
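
    The signal-enhancement step mentioned above relies on the statistics of stacking: averaging N repeated weight-drop shots leaves the coherent reflection untouched while incoherent noise shrinks by roughly a factor of sqrt(N). A minimal sketch of vertical stacking (the receiver-domain parameter individualization described in the abstract is not modeled here):

```python
import numpy as np

def vertical_stack(shots):
    # Average repeated shots recorded at the same receiver: the reflection
    # arrives identically each time and survives the mean, while random
    # noise averages toward zero, for an SNR gain of about sqrt(N).
    return np.mean(np.asarray(shots), axis=0)
```

    With the 50 shots per week described above, the expected noise reduction is about a factor of 7, which is what makes a small accelerated weight drop usable as a repeatable source.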

  2. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years. They provide useful and important information to detect ongoing changes of the environment; but for end-users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project to provide simple access, analysis and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Archive Center (LP DAAC) and Google Earth Engine as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces to the EOM. Users can either use the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTiff). Besides providing data access tools, users can also run further time series analyses like trend calculations, breakpoint detections or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting for the time series data integrated by the user is provided by an OGC Sensor Observation Service with a coupled OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., precipitation value higher than x millimeter per day, occurrence of a MODIS Fire point, detection of a time series anomaly).
Datasets integrated in the SOS service are updated in near-realtime based on the linked data providers mentioned above. An alert is automatically pushed to the user if the new data meets the conditions of the registered filter expression. This monitoring service is available on the web portal with alerting by email and within the mobile app with alerting by email and push notification.
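
    The filter-based alerting can be pictured as predicates registered per dataset and evaluated against each newly integrated observation. This is a schematic only; the EOM's actual SOS/WNS interfaces, field names, and notification transports (email, push) are not modeled.

```python
def check_alerts(observations, subscriptions):
    # Each subscription pairs a dataset name with a filter predicate
    # (e.g. precipitation above a user-chosen threshold). Every new
    # observation that matches a subscription yields an alert tuple of
    # (subscribed user, triggering observation).
    alerts = []
    for obs in observations:
        for sub in subscriptions:
            if obs["dataset"] == sub["dataset"] and sub["predicate"](obs["value"]):
                alerts.append((sub["user"], obs))
    return alerts
```

    In the deployed system the same check runs each time the near-realtime update arrives from a linked data provider, and matching alerts are dispatched to the web portal and mobile app.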

  3. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    PubMed

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. 
In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years before these drugs were withdrawn from the market. Copyright © 2014 John Wiley & Sons, Ltd.
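The sequential alerting step described above can be illustrated with a minimal sketch. The function below is a hypothetical, simplified stand-in for the paper's alerting algorithm: at each cumulative monitoring look it compares the exposed-group event count against the count expected under a null rate ratio using a Poisson log-likelihood ratio, and signals when that statistic crosses a fixed threshold (the actual system uses formal sequential methods with properly adjusted critical values).

```python
import math

def sequential_alert(looks, rr_null=1.0, threshold=2.0):
    """Illustrative sequential alerting over cumulative monitoring looks.

    looks: list of cumulative tuples
        (exposed_events, exposed_persontime,
         comparator_events, comparator_persontime).
    Signals at the first look where the exposed arm's Poisson
    log-likelihood ratio against rr_null exceeds `threshold` and the
    observed rate ratio is elevated. Returns the 1-based look index of
    the alert, or None if no alert is generated.
    """
    for i, (a, pt1, b, pt0) in enumerate(looks, start=1):
        if a == 0 or b == 0:
            continue
        rate1, rate0 = a / pt1, b / pt0
        rr = rate1 / rate0
        # Events expected in the exposed arm under the null rate ratio.
        expected = rr_null * rate0 * pt1
        # Poisson log-likelihood ratio: observed vs. expected count.
        llr = a * math.log(a / expected) - (a - expected)
        if rr > rr_null and llr > threshold:
            return i
    return None
```

In practice the threshold would be calibrated to control the overall type I error across repeated looks, which is what distinguishes a formal sequential design from this sketch.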

  4. Multi-method automated diagnostics of rotating machines

    NASA Astrophysics Data System (ADS)

    Kostyukov, A. V.; Boychenko, S. N.; Shchelkanov, A. V.; Burda, E. A.

    2017-08-01

The automated machinery diagnostics and monitoring systems utilized within petrochemical plants are an integral part of the measures taken to ensure the safety and, as a consequence, the efficiency of these industrial facilities. Such systems are often limited in their functionality due to the specifics of the diagnostic techniques adopted. Because the diagnostic techniques applied in each system are limited, and machinery defects can have a different physical nature, it becomes necessary to combine several diagnostics and monitoring systems to control the various machinery components. Such an approach is inconvenient, since it requires additional measures to bring the diagnostic results into a single view of the technical condition of production assets. Here, a production facility means a combined complex of a process unit, a drive, a power source and lines; a failure of any of these components will cause an outage of the production asset, which is unacceptable. The purpose of the study is to test the combined use of vibration diagnostics and partial discharge techniques within the diagnostic systems of enterprises for automated control of the technical condition of rotating machinery during maintenance and at production facilities. The described solutions allow control of the condition of both the mechanical and electrical components of rotating machines. It is shown that the functionality of the diagnostic systems can be expanded with minimal changes to the technological chains of repair and operation of rotating machinery. Automation of such systems reduces the influence of the human factor on the quality of repair and diagnostics of the machinery.
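Bringing several diagnostic techniques into a single view of technical condition, as the paper advocates, can be sketched as a worst-case fusion of per-technique indicators. The indicator names and limits below are hypothetical:

```python
def overall_condition(indicators):
    """Fuse per-technique condition indicators into one asset status.

    indicators: dict mapping an indicator name (e.g. vibration RMS,
    partial-discharge apparent charge) to a tuple
    (value, warning_limit, alarm_limit).
    Returns "OK", "WARNING", or "ALARM" -- the worst case across all
    indicators, so a defect visible to only one technique still flags
    the machine.
    """
    status = "OK"
    for name, (value, warn, alarm) in indicators.items():
        if value >= alarm:
            return "ALARM"
        if value >= warn:
            status = "WARNING"
    return status
```

A real system would additionally trend these indicators over time; the worst-case rule only captures the snapshot fusion the abstract describes.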

  5. Automated plasma control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

Plasma etching and desmear processes for printed wiring board (PWB) manufacture are difficult to predict and control. The non-uniformity of most plasma processes and their sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight-loss coupons or post-plasma destructive testing must be used. These techniques are not real-time methods, however, and do not allow for immediate diagnosis and process correction. These tests often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these tests verify a successful cycle with post-plasma diagnostics, poor test results often mean that a batch is substandard and the resulting parts unusable. These tests are a costly part of the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process anomalies should be detected and corrected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored, along with applications of this technique to process control, failure analysis and endpoint determination in PWB manufacture.
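Endpoint determination from an optical emission trace can be sketched as follows. This is a hypothetical illustration, not the paper's method: it tracks the running plateau of a monitored emission line and declares the endpoint once the signal stays below a fraction of that plateau for several consecutive samples.

```python
def detect_endpoint(intensities, frac=0.5, settle=5):
    """Endpoint detection on an optical emission line trace (sketch).

    intensities: sampled line intensities over the plasma cycle.
    Declares endpoint when the signal falls below `frac` of its running
    plateau for `settle` consecutive samples, which rejects brief dips.
    Returns the sample index of the endpoint, or None.
    """
    if not intensities:
        return None
    plateau = max(intensities[:settle]) if len(intensities) >= settle else max(intensities)
    below = 0
    for i, v in enumerate(intensities):
        plateau = max(plateau, v)          # track the process plateau
        below = below + 1 if v < frac * plateau else 0
        if below >= settle:
            return i
    return None
```

The `frac` and `settle` parameters are illustrative tuning knobs; a production system would pick the emission line and thresholds from the chemistry of the specific desmear process.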

  6. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data created by the system. This paper reports on the progress of the creation of cyberinfrastructure tools which hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental conditions (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties, etc.). Sample data interrogation clients include those for the detection of faulty sensors and for automated modal parameter extraction.
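A faulty-sensor detection client of the kind mentioned above can be sketched with two simple heuristics, out-of-range readings and flatlined channels. The channel names and limits below are hypothetical:

```python
def faulty_sensor_flags(readings, lo, hi, flatline_eps=1e-6):
    """Flag channels as faulty by two heuristics (illustrative sketch).

    readings: dict mapping a channel name to its recent sample list.
    A channel is flagged if any reading leaves the physical range
    [lo, hi], or if the channel has flatlined (negligible variation,
    typically a disconnected or saturated sensor).
    """
    flags = {}
    for channel, values in readings.items():
        out_of_range = any(v < lo or v > hi for v in values)
        flatline = (max(values) - min(values)) < flatline_eps
        flags[channel] = out_of_range or flatline
    return flags
```

In the described architecture such a check would run as a database client over each upload window, writing its verdicts back as sensor-health metadata.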

  7. Localization-based super-resolution imaging meets high-content screening.

    PubMed

    Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste

    2017-12-01

    Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data like fluorophore photophysics, protein clustering or dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM and DNA-PAINT based super-resolution microscopy as well as single-particle tracking.

  8. Mission operations technology

    NASA Astrophysics Data System (ADS)

    Varsi, Giulio

    In the last decade, the operation of a spacecraft after launch has emerged as a major component of the total cost of the mission. This trend is sustained by the increasing complexity, flexibility, and data gathering capability of the space assets and by their greater reliability and consequent longevity. The trend can, however, be moderated by the progressive transfer of selected functions from the ground to the spacecraft and by application, on the ground, of new technology. Advances in ground operations derive from the introduction in the mission operations environment of advanced microprocessor-based workstations in the class of a few million instructions per second and from the selective application of artificial intelligence technology. In the last few years a number of these applications have been developed, tested in operational settings and successfully demonstrated to users. Some are now being integrated in mission operations facilities. An analysis of mission operations indicates that the key areas are: concurrent control of multiple missions; automated/interactive production of command sequences of high integrity at low cost; automated monitoring of spacecraft health and automated aides for fault diagnosis; automated allocation of resources; automated processing of science data; and high-fidelity, high-speed spacecraft simulation. Examples of major advances in selected areas are described.

  9. Interdisciplinary Investigations in Support of Project DI-MOD

    NASA Technical Reports Server (NTRS)

    Starks, Scott A. (Principal Investigator)

    1996-01-01

Various concepts from time series analysis are used as the basis for the development of algorithms to assist in the analysis and interpretation of remotely sensed imagery. An approach to trend detection that is based upon the fractal analysis of power spectrum estimates is presented. Additionally, research was conducted toward the development of a software architecture to support processing tasks associated with databases housing a variety of data. An algorithmic approach which provides for the automation of the state monitoring process is presented.

  10. Matrix Failure Modes and Effects Analysis as a Knowledge Base for a Real Time Automated Diagnosis Expert System

    NASA Technical Reports Server (NTRS)

    Herrin, Stephanie; Iverson, David; Spukovska, Lilly; Souza, Kenneth A. (Technical Monitor)

    1994-01-01

Failure modes and effects analyses contain a wealth of information that can be used to create the knowledge base required for building automated diagnostic expert systems. A real-time monitoring and diagnosis expert system based on an actual NASA project's matrix failure modes and effects analysis was developed at NASA Ames Research Center. This system was first used as a case study to monitor the Research Animal Holding Facility (RAHF), a Space Shuttle payload used to house and monitor animals in orbit so the effects of space flight and microgravity can be studied. The techniques developed for the RAHF monitoring and diagnosis expert system are general enough to be used for monitoring and diagnosis of a variety of other systems that undergo a matrix FMEA. This automated diagnosis system was successfully used on-line and validated on Space Shuttle flight STS-58, mission SLS-2, in October 1993.

  11. Analytical interference in the therapeutic drug monitoring of methotrexate.

    PubMed

    Oudart, Jean-Baptiste; Marquet, Benjamin; Feliu, Catherine; Gozalo, Claire; Djerada, Zoubir; Millart, Hervé

    2016-06-01

High-dose methotrexate chemotherapy is used in the treatment of some tumors. It presents several side effects that require therapeutic drug monitoring, which is commonly performed 24, 48 and 72 h after the beginning of the methotrexate infusion. Treatment of overexposure to methotrexate is based on injection of carboxypeptidase G2, which specifically degrades methotrexate into an inactive metabolite, DAMPA. The FPIA immunoassay on the TDx automated analyzer (Abbott™) was used for therapeutic drug monitoring of methotrexate. This immunoassay presented a significant cross-reactivity between methotrexate and DAMPA, which widely overestimated the residual concentration compared with the gold standard HPLC/MS. The TDx automated analyzer was replaced by a new immunoassay on the Architect automated analyzer (Abbott™). However, this immunoassay shows the same cross-reactivity, so care is needed when monitoring methotrexate after an injection of carboxypeptidase G2. In order to determine the most suitable assay for the therapeutic drug monitoring of methotrexate, knowledge of any injection of carboxypeptidase G2 remains essential.

  12. Improving medical stores management through automation and effective communication

    PubMed Central

    Kumar, Ashok; Cariappa, M.P.; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

Background Medical stores management in hospitals is a tedious and time-consuming chore, with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is challenged by increasing inventory loads and escalating budgets for the procurement of drugs. Methods We carried out an in-depth case study at the medical stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan–Do–Study–Act (PDSA) cycle. The QI process was modified as required to fit the medical stores management model. The results were evaluated after six months. Results After the implementation of the QI process, 55 drugs in the medical store inventory that had expired from 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to the depletion of stocks. The monthly expense summary of drugs was now being completed within ten days of the closing month. Conclusion Improving communication systems within the hospital, with vendor database management and reaching out to clinicians, is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of the stores. Staff training and standardized documentation protocols are the other keystones of optimal medical store management. PMID:26900225
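The automated checks described in the results (disposal orders four months before expiry, demands two months before stock depletion) can be sketched as a daily rule run over the inventory database. The field names and lead times here are illustrative assumptions, not the facility's actual schema:

```python
import datetime as dt

def stock_actions(today, items, disposal_lead_days=120, demand_lead_days=60):
    """Flag drugs for disposal orders ~4 months before expiry and for
    fresh demands ~2 months before stocks run out (illustrative sketch).

    items: list of dicts with hypothetical keys 'name', 'expiry'
    (datetime.date), 'stock' (units on hand), 'daily_consumption'.
    Returns a list of (drug name, action) pairs.
    """
    actions = []
    for it in items:
        # Disposal order when expiry is within the disposal lead time.
        if (it["expiry"] - today).days <= disposal_lead_days:
            actions.append((it["name"], "initiate disposal order"))
        # Demand when projected days of stock fall below the demand lead time.
        days_left = it["stock"] / max(it["daily_consumption"], 1e-9)
        if days_left <= demand_lead_days:
            actions.append((it["name"], "raise demand"))
    return actions
```

Such a rule would run against the upgraded database the authors describe, with the resulting actions routed to the communication workflow rather than executed silently.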

  13. Project TwEATs: a feasibility study testing the use of automated text messaging to monitor appetite ratings in a free-living population

    PubMed Central

    Schembre, Susan M.; Yuen, Jessica

    2011-01-01

There are no standardized methods for monitoring appetite in free-living populations. Fifteen participants tested a computer-automated text-messaging system designed to track hunger ratings over seven days. Participants were sent text messages (SMS) hourly and instructed to reply during waking hours with their current hunger rating. Of 168 SMS, 0.6-7.1% were undelivered, varying by mobile service provider. On average, 12 SMS responses were received daily, with minor variations by observation day or day of the week. Compliance was over 74%, and 93% of the ratings were received within 30 minutes. Automated text messaging is a feasible method to monitor appetite ratings in this population. PMID:21251941

  14. Automation is an Effective Way to Improve Quality of Verification (Calibration) of Measuring Instruments

    NASA Astrophysics Data System (ADS)

    Golobokov, M.; Danilevich, S.

    2018-04-01

In order to assess calibration reliability and automate such assessment, procedures for data collection and a simulation study of a thermal imager calibration procedure have been elaborated. The existing calibration techniques do not always provide high reliability. A new method for analyzing existing calibration techniques and developing new, efficient ones has been suggested and tested. A type of software has been studied that allows generating instrument calibration reports automatically, monitoring their proper configuration, processing measurement results and assessing instrument validity. The use of such software reduces the man-hours spent on finalization of calibration data by a factor of 2 to 5 and eliminates a whole set of typical operator errors.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coble, Jamie; Orton, Christopher; Schwantes, Jon

The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in used nuclear fuel reprocessing facilities to support process verification and validation. The MIP Monitor applies multivariate analysis to gamma spectroscopy of reprocessing streams in order to detect small changes in the gamma spectrum, which may indicate changes in process conditions. This research extends the MIP Monitor by characterizing a used fuel sample after initial dissolution according to the type of reactor of origin (pressurized or boiling water reactor), initial enrichment, burnup, and cooling time. Simulated gamma spectra were used to develop and test three fuel characterization algorithms. The classification and estimation models employed are based on the partial least squares regression (PLS) algorithm. A PLS discriminant analysis model was developed which perfectly classified reactor type. Locally weighted PLS models were fitted on-the-fly to estimate continuous fuel characteristics. Burnup was predicted within 0.1% root mean squared percent error (RMSPE) and both cooling time and initial enrichment within approximately 2% RMSPE. This automated fuel characterization can be used to independently verify operator declarations of used fuel characteristics and inform the MIP Monitor anomaly detection routines at later stages of the fuel reprocessing stream to improve sensitivity to changes in operational parameters and material diversions.
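The PLS regression at the core of these models can be sketched with a minimal NIPALS PLS1 implementation in NumPy. This is an illustrative reimplementation of the standard algorithm, not the authors' code, and it omits the locally weighted fitting they describe:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1 regression (illustrative sketch).

    X: (n_samples, n_features) spectra; y: (n_samples,) fuel property.
    Returns a coefficient vector b so that predictions are
    (X - X.mean(axis=0)) @ b + y.mean().
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                 # weight vector from X-y covariance
        w /= np.linalg.norm(w)
        t = Xc @ w                    # scores
        tt = t @ t
        p = Xc.T @ t / tt             # X loadings
        q = (yc @ t) / tt             # y loading
        Xc = Xc - np.outer(t, p)      # deflate X
        yc = yc - q * t               # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)
```

A discriminant-analysis variant, as used for reactor-type classification, would regress a coded class label (e.g. ±1) on the spectra and threshold the prediction.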

  16. A Fully Redundant On-Line Mass Spectrometer System Used to Monitor Cryogenic Fuel Leaks on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Griffin, Timothy P.; Naylor, Guy R.; Haskell, William D.; Breznik, Greg S.; Mizell, Carolyn A.; Helms, William R.; Voska, N. (Technical Monitor)

    2002-01-01

An on-line gas monitoring system was developed to replace the older systems used to monitor for cryogenic leaks on the Space Shuttles before launch. The system uses a mass spectrometer to monitor multiple locations in the process, which allows the system to monitor all gas constituents of interest in a nearly simultaneous manner. The system is fully redundant and meets all requirements for ground support equipment (GSE). This includes ruggedness to withstand launch on the Mobile Launcher Platform (MLP), ease of operation, and minimal operator intervention. The system can be fully automated so that an operator is notified when an unusual situation or fault is detected. User inputs are made through a personal computer using mouse and keyboard commands via a graphical user interface. In addition to detecting cryogenic leaks, many other gas constituents can be monitored using the Hazardous Gas Detection System (HGDS) 2000.

  17. Development of monitoring and control system for a mine main fan based on frequency converter

    NASA Astrophysics Data System (ADS)

    Zhang, Y. C.; Zhang, R. W.; Kong, X. Z.; Y Gong, J.; Chen, Q. G.

    2013-12-01

In the process of mine exploitation, the required air flow rate often changes. The traditional fan control procedure is complex, and it is hard to meet the worksite air requirements. The system described here is based on a personal computer (PC) monitoring system and a high-performance PLC control system. In this system, a frequency converter is used to adjust the fan speed, so that the airflow at the worksite can be regulated steplessly. The functions of the monitoring and control system include on-line monitoring and centralized control. The system can monitor the parameters of the fan in real time and control the operation of the frequency converter, the fan, and its auxiliary equipment. The system is also highly automated: the field equipment can be monitored and controlled automatically. The system is thus an important safeguard for mine production.
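Stepless regulation via the frequency converter can be sketched as a proportional control step executed by the PLC on each scan cycle. The gain and the drive's frequency limits below are hypothetical values, not from the paper:

```python
def update_fan_frequency(freq_hz, flow_measured, flow_setpoint,
                         kp=0.05, f_min=15.0, f_max=50.0):
    """One step of a proportional airflow controller (illustrative).

    The frequency-converter setpoint is nudged in proportion to the
    airflow error and clamped to the drive's safe operating range.
    Units are arbitrary for the flow; frequency is in Hz.
    """
    error = flow_setpoint - flow_measured
    new_freq = freq_hz + kp * error
    return min(f_max, max(f_min, new_freq))
```

A deployed controller would typically add an integral term and ramp-rate limiting to protect the fan, but the clamped proportional step captures the core of stepless speed regulation.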

  18. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, X; Li, S; Zheng, D

Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using the data logging of an electrometer. Methods: The beams to be measured were created in the record-and-verify (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the spreadsheet data to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including an open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes using the manual approach. The automation avoided the redundant Linac status checks between fields needed in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data logging, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6 MV photons with a 60-degree wedge. Conclusion: Automated output factor measurements can save time by 40% compared with the conventional manual approach. This work laid the groundwork for further automation of Linac commissioning.
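The data-logging analysis can be sketched in two steps: segment the continuously logged electrometer trace into beam-on intervals, then normalize each field's integrated charge to a reference field. This Python sketch mirrors the role of the authors' Matlab program under assumed units and a hypothetical beam-on threshold:

```python
def segment_beams(samples, threshold, dt=0.5):
    """Split a continuously logged current trace into beam-on segments
    and integrate each segment into a charge reading (sketch).

    samples: current readings taken every dt seconds (here 0.5 s, as
    logged by the electrometer). Returns one integrated charge per
    delivered field, in (current unit) * seconds.
    """
    charges, acc, beam_on = [], 0.0, False
    for s in samples:
        if s > threshold:           # beam on: accumulate charge
            acc += s * dt
            beam_on = True
        elif beam_on:               # beam just turned off: close segment
            charges.append(acc)
            acc, beam_on = 0.0, False
    if beam_on:                     # trace ended during a beam
        charges.append(acc)
    return charges

def output_factors(charges, ref_index):
    """Normalize each field's charge to the reference field's charge
    (conventionally the 10x10 cm^2 field)."""
    ref = charges[ref_index]
    return [c / ref for c in charges]
```

Matching each segment to its field size would use the known delivery order from the R&V beam list, which is what makes the continuous-delivery configuration analyzable offline.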

  19. An OSEE Based Portable Surface Contamination Monitor

    NASA Technical Reports Server (NTRS)

    Perey, Daniel F.

    1997-01-01

Many industrial and aerospace processes involving the joining of materials require sufficient surface cleanliness to ensure proper bonding. Processes as diverse as painting, welding, or the soldering of electronic circuits will be compromised if prior inspection and removal of surface contaminants is inadequate. As process requirements become more stringent and the number of different materials and identified contaminants increases, various instruments and techniques have been developed for improved inspection. One such technique, based on the principle of Optically Stimulated Electron Emission (OSEE), has been explored for a number of years as a tool for surface contamination monitoring. Among the benefits of OSEE are that it is non-contacting, requires little operator training, and has very high contamination sensitivity. This paper describes the development of a portable OSEE-based surface contamination monitor. The instrument is suitable for both hand-held and robotic inspections with either manual or automated control of instrument operation. In addition, instrument output data is visually displayed to the operator and may be output to an external computer for archiving or analysis.

  20. Automatic Measuring System for Oil Stream Paraffin Deposits Parameters

    NASA Astrophysics Data System (ADS)

    Kopteva, A. V.; Koptev, V. Yu

    2018-03-01

This paper describes a new method for monitoring oil pipelines, as well as a highly efficient and automated paraffin deposit monitoring method. When operating oil pipelines, paraffin, resin and salt carried by the oil stream deposit on the pipeline walls. This ultimately results in frequent transportation stoppages to clean or even replace pipes and other equipment, thus shortening operating periods between repairs, creating emergency situations and increasing production expenses; oil spills also harm the environment, contaminating rivers, lakes and ground waters and killing animals and birds. Oil transportation monitoring issues are still a subject for further study. Thus, there is a need for a radically new automated process control and management system, together with the intellectualization of the measurement means. The measurement principle is based on the Lambert-Beer law, which describes the dependence of the transmitted gamma-radiation intensity on the density and linear attenuation coefficient of a substance. Using the measuring system with high accuracy (±0.2%), one can measure the thickness of paraffin deposits with an absolute accuracy of ±5 mm, which is sufficient to ensure reliable operation of the pipeline system. Safety is a key advantage of the proposed control system.
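Under the Lambert-Beer law, I = I0·exp(−μx), the deposit thickness follows directly from the measured attenuation as x = ln(I0/I)/μ. A minimal sketch, with the attenuation coefficient for paraffin left as a caller-supplied assumption:

```python
import math

def deposit_thickness_cm(i_clean, i_measured, mu_paraffin_cm1):
    """Estimate paraffin layer thickness from gamma-ray attenuation.

    Lambert-Beer law: I = I0 * exp(-mu * x)  =>  x = ln(I0 / I) / mu.

    i_clean: count rate through the clean pipe (baseline I0),
    i_measured: count rate with the deposit present (I),
    mu_paraffin_cm1: linear attenuation coefficient of paraffin (1/cm)
    at the source energy -- a value the caller must supply.
    """
    if i_measured <= 0 or i_measured > i_clean:
        raise ValueError("measured intensity must be in (0, i_clean]")
    return math.log(i_clean / i_measured) / mu_paraffin_cm1
```

In practice the baseline and coefficient would be calibrated against the pipe wall itself and the oil, since both also attenuate the beam.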

  1. Automated turn pike using PLC and SCADA

    NASA Astrophysics Data System (ADS)

    Silpa Sreedhar, P.; Aiswarya, P.; Kathirvelan, J.

    2017-11-01

We propose a smart turnpike based on a Programmable Logic Controller (PLC) and Supervisory Control and Data Acquisition (SCADA) system in this paper. The basic idea of this work is to measure the weight of vehicles and divert them to the appropriate lanes according to their weight. It is difficult for turnpike staff to monitor the whole process all the time, so this PLC-based diversion system can be implemented at turnpikes to reduce the difficulties. The method works using piezo-resistive weight sensors whose output is fed to a PLC, which controls the vehicle diversion. Using SCADA software, the whole process can be monitored from a remote location. The algorithm developed in this work was successfully installed in a real-time system.

  2. An IDEA of What's in the Air

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The Automatic Particle Fallout Monitor (APFM) is an automated instrument that assesses real-time particle contamination levels in a facility by directly imaging, sizing, and counting contamination particles. It allows personnel to respond to particle contamination before it becomes a major problem. For NASA, the APFM improves the ability to mitigate, avoid, and explain mission-compromising incidents of contamination occurring during payload processing, launch vehicle ground processing, and potentially, during flight operations. Commercial applications are in semiconductor processing and electronics fabrication, as well as aerospace, aeronautical, and medical industries. The product could also be used to measure the air quality of hotels, apartment complexes, and corporate buildings. IDEA sold and delivered its first four units to the United Space Alliance for the Space Shuttle Program at Kennedy. NASA used the APFM in the Kennedy Space Station Processing Facility to monitor contamination levels during the assembly of International Space Station components.

  3. LORAN-C data reduction at the US Naval Observatory

    NASA Technical Reports Server (NTRS)

    Chadsey, Harold

    1992-01-01

As part of its mission and in cooperation with the U.S. Coast Guard, the U.S. Naval Observatory (USNO) monitors and reports the timing of the LORAN-C chains. The procedures for monitoring and processing the reported values have evolved with advances in monitoring equipment, computer interfaces and PCs. This paper discusses the current standardized procedures used by USNO to sort the raw data according to Group Repetition Interval (GRI) rate, to fit and smooth the data points, and, for chains remotely monitored, to tie the values to the USNO Master Clock. The results of these procedures are the LORAN time of transmission values, referenced to UTC(USNO) (Coordinated Universal Time), for all LORAN chains. This information is available to users via USNO publications and the USNO Automated Data Service (ADS).
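The sort-and-smooth step can be sketched as grouping raw readings by GRI rate and applying a centered moving average. The actual USNO procedure fits the data more carefully, so this is only an illustrative stand-in:

```python
from collections import defaultdict

def smooth_by_gri(records, window=3):
    """Group raw LORAN timing readings by GRI rate, then smooth each
    group with a centered moving average (illustrative sketch).

    records: iterable of (gri_rate, value) pairs in observation order.
    Returns {gri_rate: smoothed value list}; window ends are shortened
    rather than padded.
    """
    by_gri = defaultdict(list)
    for gri, value in records:
        by_gri[gri].append(value)
    half = window // 2
    smoothed = {}
    for gri, values in by_gri.items():
        smoothed[gri] = [
            sum(values[max(0, i - half): i + half + 1])
            / len(values[max(0, i - half): i + half + 1])
            for i in range(len(values))
        ]
    return smoothed
```

Tying a remotely monitored chain to the Master Clock would then apply a per-site offset to each smoothed series, a step omitted here.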

  4. Heuristic automation for decluttering tactical displays.

    PubMed

    St John, Mark; Smallman, Harvey S; Manes, Daniel I; Feher, Bela A; Morrison, Jeffrey G

    2005-01-01

    Tactical displays can quickly become cluttered with large numbers of symbols that can compromise effective monitoring. Here, we studied how heuristic automation can aid users by intelligently "decluttering" the display. In a realistic simulated naval air defense task, 27 experienced U.S. Navy users monitored a cluttered airspace and executed defensive responses against significant threats. An algorithm continuously evaluated aircraft for their levels of threat and decluttered the less threatening ones by dimming their symbols. Users appropriately distrusted and spot-checked the automation's assessments, and decluttering had very little effect on which aircraft were judged as significantly threatening. Nonetheless, decluttering improved the timeliness of responses to threatening aircraft by 25% as compared with a baseline display with no decluttering; it was especially beneficial for threats in more peripheral locations, and 25 of 27 participants preferred decluttering. Heuristic automation, when properly designed to guide users' attention by decluttering less important objects, may prove valuable in many cluttered monitoring situations, including air traffic management, crisis team management, and tactical situation awareness in general.
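The decluttering scheme studied here, dimming rather than deleting low-threat symbols so users can still spot-check the automation's assessments, can be sketched as follows. The threat function and threshold are hypothetical:

```python
def declutter(tracks, threat_fn, dim_threshold):
    """Heuristic declutter of a tactical display (illustrative sketch).

    tracks: list of track dicts, each with at least an "id" key.
    threat_fn: caller-supplied heuristic scoring a track's threat.
    Less threatening tracks are marked "dim" rather than removed, so
    the user retains the ability to verify the automation's judgment.
    Returns (track id, "bright" | "dim") pairs in display order.
    """
    display = []
    for track in tracks:
        level = "bright" if threat_fn(track) >= dim_threshold else "dim"
        display.append((track["id"], level))
    return display
```

A realistic threat function would weigh kinematics (closing speed, range, altitude) and identification cues, which is where the "heuristic" in heuristic automation lives.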

  5. Armored Combat Vehicles Science and Technology Plan

    DTIC Science & Technology

    1982-11-01

APPLICATION OF SENSORS: Investigate the seismic, acoustic, and electromagnetic signatures of military and intruder-type targets and the theoretical aspects...a prototype sampling system which has the capability to monitor ambient air both outside and inside vehicles and provide an early warning to the crew...and through various processing modules provide automated functions for simultaneous tracking of targets and automatic recognition.

  6. High-Speed Observer: Automated Streak Detection in SSME Plumes

    NASA Technical Reports Server (NTRS)

    Rieckoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A high frame rate digital video camera installed on test stands at Stennis Space Center has been used to capture images of Space Shuttle main engine plumes during test. These plume images are processed in real time to detect and differentiate anomalous plume events occurring during a time interval on the order of 5 msec. Such speed yields near instantaneous availability of information concerning the state of the hardware. This information can be monitored by the test conductor or by other computer systems, such as the integrated health monitoring system processors, for possible test shutdown before occurrence of a catastrophic engine failure.
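Detecting anomalous plume events against a reference frame can be sketched with a robust residual threshold. This is an illustrative stand-in for the actual streak-detection processing, with the frame format and threshold assumed:

```python
import numpy as np

def detect_anomaly(frame, background, k=5.0):
    """Flag pixels whose residual against a background frame exceeds
    k robust standard deviations (illustrative sketch).

    frame, background: 2-D intensity arrays of equal shape.
    Uses the median absolute deviation (MAD) so that a bright streak
    does not inflate its own detection threshold. Returns a boolean
    mask of anomalous pixels.
    """
    resid = frame.astype(float) - background
    mad = np.median(np.abs(resid - np.median(resid))) + 1e-12
    sigma = 1.4826 * mad        # MAD -> std for Gaussian noise
    return np.abs(resid) > k * sigma
```

At 5 ms event timescales, the background would be a running estimate updated frame-to-frame, and any flagged region would be passed to shape analysis to separate streaks from normal plume flicker.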

  7. The Standard Autonomous File Server, a Customized, Off-the-Shelf Success Story

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide quicker, more reliable, prioritized file distribution for customers of near-real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned both during the COTS selection and integration into SAFS and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.

  8. Vacuum mechatronics

    NASA Technical Reports Server (NTRS)

    Hackwood, Susan; Belinski, Steven E.; Beni, Gerardo

    1989-01-01

    The discipline of vacuum mechatronics is defined as the design and development of vacuum-compatible computer-controlled mechanisms for manipulating, sensing and testing in a vacuum environment. The importance of vacuum mechatronics is growing with an increased application of vacuum in space studies and in manufacturing for material processing, medicine, microelectronics, emission studies, lyophilisation (freeze drying) and packaging. The quickly developing field of vacuum mechatronics will also be the driving force for the realization of an advanced era of totally enclosed clean manufacturing cells. High technology manufacturing has increasingly demanding requirements for precision manipulation, in situ process monitoring and contamination-free environments. To remove the contamination problems associated with human workers, the tendency in many manufacturing processes is to move towards total automation. This will become a requirement in the near future for, e.g., microelectronics manufacturing. Automation in ultra-clean manufacturing environments is evolving into the concept of self-contained and fully enclosed manufacturing. A Self Contained Automated Robotic Factory (SCARF) is being developed as a flexible research facility for totally enclosed manufacturing. The construction and successful operation of a SCARF will provide a novel, flexible, self-contained, clean, vacuum manufacturing environment. SCARF also requires very high reliability and intelligent control. The trends in vacuum mechatronics and some of the key research issues are reviewed.

  9. Spectral imaging applications: Remote sensing, environmental monitoring, medicine, military operations, factory automation and manufacturing

    NASA Technical Reports Server (NTRS)

    Gat, N.; Subramanian, S.; Barhen, J.; Toomarian, N.

    1996-01-01

    This paper reviews the activities at OKSI related to imaging spectroscopy, presenting current and future applications of the technology. The authors discuss the development of several systems including hardware, signal processing, data classification algorithms and benchmarking techniques to determine algorithm performance. Signal processing for each application is tailored by incorporating the phenomenology appropriate to the process into the algorithms. Pixel signatures are classified using techniques such as principal component analysis, generalized eigenvalue analysis and novel very fast neural network methods. The major hyperspectral imaging systems developed at OKSI include the Intelligent Missile Seeker (IMS) demonstration project for real-time target/decoy discrimination, and the Thermal InfraRed Imaging Spectrometer (TIRIS) for detection and tracking of toxic plumes and gases. In addition, systems for applications in medical photodiagnosis, manufacturing technology, and crop monitoring are also under development.
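
    As an illustration of the pixel-classification idea mentioned above, a minimal principal-component-analysis sketch with synthetic two-class spectra (not OKSI's algorithms or data) could look like:

```python
import numpy as np

def pca_project(X, k):
    """Project spectra onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    components = vecs[:, ::-1][:, :k]      # top-k directions by variance
    return Xc @ components, components, X.mean(axis=0)

# Two synthetic spectral classes in an 8-band space.
rng = np.random.default_rng(1)
grass = rng.normal(loc=1.0, scale=0.05, size=(50, 8))
water = rng.normal(loc=-1.0, scale=0.05, size=(50, 8))
X = np.vstack([grass, water])
scores, comps, mu = pca_project(X, k=2)

# Nearest-centroid classification of a new pixel in the reduced space.
c_grass, c_water = scores[:50].mean(axis=0), scores[50:].mean(axis=0)
pixel = (rng.normal(1.0, 0.05, size=8) - mu) @ comps
label = "grass" if np.linalg.norm(pixel - c_grass) < np.linalg.norm(pixel - c_water) else "water"
print(label)  # grass
```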

  10. The Effects of Automated Prompting and Self-Monitoring on Homework Completion for a Student with Attention Deficit Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Blicha, Amy; Belfiore, Phillip J.

    2013-01-01

    This study examined the effects of an intervention consisting of automated prompting and self-monitoring on the level of independent homework task completion for an elementary-age student with attention deficit hyperactivity disorder (ADHD). Instituting a single subject, within series ABAB design, the results showed a consistent increase and…

  11. Automated and continual determination of radio telescope reference points with sub-mm accuracy: results from a campaign at the Onsala Space Observatory

    NASA Astrophysics Data System (ADS)

    Lösler, Michael; Haas, Rüdiger; Eschelbach, Cornelia

    2013-08-01

    The Global Geodetic Observing System (GGOS) requires sub-mm accuracy, automated and continual determinations of the so-called local tie vectors at co-location stations. Co-location stations host instrumentation for several space geodetic techniques and the local tie surveys involve the relative geometry of the reference points of these instruments. Thus, these reference points need to be determined in a common coordinate system, which is a particular challenge for rotating equipment like radio telescopes for geodetic Very Long Baseline Interferometry. In this work we describe a concept to achieve automated and continual determinations of radio telescope reference points with sub-mm accuracy. We developed a monitoring system, including Java-based sensor communication for automated surveys, network adjustment and further data analysis. This monitoring system was tested during a monitoring campaign performed at the Onsala Space Observatory in the summer of 2012. The results obtained in this campaign show that it is possible to perform automated determination of a radio telescope reference point during normal operations of the telescope. Accuracies on the sub-mm level can be achieved, and continual determinations can be realized by repeated determinations and recursive estimation methods.
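
    The recursive estimation methods are not spelled out in the abstract; a minimal sketch of the underlying idea, folding repeated reference-point determinations into a running estimate without storing the whole series, is (coordinates hypothetical):

```python
def recursive_mean(estimate, n, measurement):
    """One step of a recursive mean: fold measurement number n+1 into
    the running estimate (n = number of measurements folded in so far)."""
    return estimate + (measurement - estimate) / (n + 1)

# Hypothetical repeated (x, y, z) determinations of the telescope
# reference point, as offsets in mm from a nominal position.
measurements = [(0.2, -0.1, 0.05), (0.4, 0.1, -0.05), (0.3, 0.0, 0.0)]
est = (0.0, 0.0, 0.0)
for n, m in enumerate(measurements):
    est = tuple(recursive_mean(e, n, v) for e, v in zip(est, m))
print(tuple(round(e, 3) for e in est))  # (0.3, 0.0, 0.0)
```

    A real implementation would weight each determination by its adjustment accuracy rather than averaging uniformly.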

  12. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    NASA Astrophysics Data System (ADS)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims to reduce manpower needs and to assure a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies coming from different disciplines: it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system.
    All components work in a loosely coupled, event-based architecture, with a message broker centralizing all communication between modules. The result is an intelligent system able to extract and compute relevant information from the flow of operational data to provide real-time feedback to human experts, who can promptly react when needed. The paper presents the design and implementation of the AAL project, together with the results of its usage as an automated monitoring assistant for the ATLAS data taking infrastructure.
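
    As a toy illustration of the event-correlation idea (not the AAL engine, which uses expert-defined CEP queries), a sliding-window rule over a message stream might look like:

```python
from collections import deque

class RateAlarm:
    """Toy complex-event-processing rule: raise an alarm when more than
    `limit` error events arrive within a sliding `window`-second span."""
    def __init__(self, window=10.0, limit=3):
        self.window, self.limit = window, limit
        self.times = deque()

    def on_event(self, t, severity):
        if severity != "ERROR":
            return False
        self.times.append(t)
        # Drop events that have fallen out of the window.
        while self.times and t - self.times[0] > self.window:
            self.times.popleft()
        return len(self.times) > self.limit

alarm = RateAlarm(window=10.0, limit=3)
stream = [(0.0, "INFO"), (1.0, "ERROR"), (2.0, "ERROR"),
          (3.0, "ERROR"), (4.0, "ERROR")]
fired = [alarm.on_event(t, s) for t, s in stream]
print(fired)  # [False, False, False, False, True]
```

    The meaningful information here is exactly the aggregated behavior over a time-line, not any single message.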

  13. Early Results of Three-Year Monitoring of Red Wood Ants’ Behavioral Changes and Their Possible Correlation with Earthquake Events

    PubMed Central

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Simple Summary For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the video streams. Based on this automated approach, a statistical analysis of the ant behavior will be carried out. Abstract Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. 
Based on this automated approach, a statistical analysis of the ants’ behavior will be carried out. In addition, other parameters (climate, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  14. A Comparison of a Brain-Based Adaptive System and a Manual Adaptable System for Invoking Automation

    NASA Technical Reports Server (NTRS)

    Bailey, Nathan R.; Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Scott, Lorissa A.

    2004-01-01

    Two experiments are presented that examine alternative methods for invoking automation. In each experiment, participants were asked to perform simultaneously a monitoring task and a resource management task as well as a tracking task that changed between automatic and manual modes. The monitoring task required participants to detect failures of an automated system to correct aberrant conditions under either high or low system reliability. Performance on each task was assessed as well as situation awareness and subjective workload. In the first experiment, half of the participants worked with a brain-based system that used their EEG signals to switch the tracking task between automatic and manual modes. The remaining participants were yoked to participants from the adaptive condition and received the same schedule of mode switches, but their EEG had no effect on the automation. Within each group, half of the participants were assigned to either the low or high reliability monitoring task. In addition, within each combination of automation invocation and system reliability, participants were separated into high and low complacency potential groups. The results revealed no significant effects of automation invocation on the performance measures; however, the high complacency individuals demonstrated better situation awareness when working with the adaptive automation system. The second experiment was the same as the first with one important exception. Automation was invoked manually. Thus, half of the participants pressed a button to invoke automation for 10 s. The remaining participants were yoked to participants from the adaptable condition and received the same schedule of mode switches, but they had no control over the automation. The results showed that participants who could invoke automation performed more poorly on the resource management task and reported higher levels of subjective workload. 
    Further, those who invoked automation more frequently performed more poorly on the tracking task and reported higher levels of subjective workload. A comparison between the adaptive condition in the first experiment and the adaptable condition in the second experiment revealed only one significant difference: subjective workload was higher in the adaptable condition. Overall, the results show that a brain-based, adaptive automation system may facilitate situation awareness for those individuals who are more complacent toward automation. By contrast, requiring operators to invoke automation manually may have some detrimental impact on performance and does appear to increase subjective workload relative to an adaptive system.

  15. HiCAT Software Infrastructure: Safe hardware control with object oriented Python

    NASA Astrophysics Data System (ADS)

    Moriarty, Christopher; Brooks, Keira; Soummer, Remi

    2018-01-01

    High contrast imaging for Complex Aperture Telescopes (HiCAT) is a testbed designed to demonstrate coronagraphy and wavefront control for segmented on-axis space telescopes such as envisioned for LUVOIR. To limit the air movements in the testbed room, software interfaces for several different hardware components were developed to completely automate operations. When developing software interfaces for many different pieces of hardware, unhandled errors are commonplace and can prevent the software from properly closing a hardware resource. Some fragile components (e.g. deformable mirrors) can be permanently damaged because of this. We present an object oriented Python-based infrastructure to safely automate hardware control and optical experiments. Specifically, it supports conducting high-contrast imaging experiments while monitoring humidity and power status, with graceful shutdown processes even for unexpected errors. Python contains a construct called a “context manager” that allows you to define code to run when a resource is opened or closed. Context managers ensure that a resource is properly closed, even when unhandled errors occur. Harnessing the context manager design, we also use Python’s multiprocessing library to monitor humidity and power status without interrupting the experiment. Upon detecting a safety problem, the master process sends an event to the child process that triggers the context managers to gracefully close any open resources. This infrastructure allows us to queue up several experiments and safely operate the testbed without a human in the loop.
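
    A simplified illustration of the design described above (not actual HiCAT code; the hardware class is a stand-in) shows how a context manager guarantees cleanup even on an unhandled error:

```python
from contextlib import contextmanager

class DeformableMirror:
    """Stand-in for a fragile hardware resource (hypothetical class)."""
    def __init__(self):
        self.open = False
    def connect(self):
        self.open = True
    def close(self):
        self.open = False

@contextmanager
def safe_mirror():
    dm = DeformableMirror()
    dm.connect()
    try:
        yield dm
    finally:
        dm.close()  # runs even when an unhandled error occurs

mirror_ref = None
try:
    with safe_mirror() as dm:
        mirror_ref = dm
        raise RuntimeError("unexpected experiment failure")
except RuntimeError:
    pass
print(mirror_ref.open)  # False: the resource was closed despite the error
```

    In the testbed design, a monitoring process would trigger this same cleanup path by signalling the experiment process.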

  16. Toward Expanding Tremor Observations in the Northern San Andreas Fault System in the 1990s

    NASA Astrophysics Data System (ADS)

    Damiao, L. G.; Dreger, D. S.; Nadeau, R. M.; Taira, T.; Guilhem, A.; Luna, B.; Zhang, H.

    2015-12-01

    The connection between tremor activity and active fault processes continues to expand our understanding of deep fault zone properties and deformation, the tectonic process, and the relationship of tremor to the occurrence of larger earthquakes. Compared to tremors in subduction zones, known tremor signals in California are ~5 to ~10 times smaller in amplitude and duration. These characteristics, in addition to scarce geographic coverage, lack of continuous data (e.g., before mid-2001 at Parkfield), and absence of instrumentation sensitive enough to monitor these events, have stifled tremor detection. The continuous monitoring of these events over a relatively short time period in limited locations may lead to a parochial view of the tremor phenomenon and its relationship to fault, tectonic, and earthquake processes. To help overcome this, we have embarked on a project to expand the geographic and temporal scope of tremor observation along the Northern SAF system using available continuous seismic recordings from a broad array of hundreds of surface seismic stations from multiple seismic networks. Available data for most of these stations also extend back into the mid-1990s. Processing and analysis of tremor signals from this large and low signal-to-noise dataset requires a heavily automated, data-science type approach and specialized techniques for identifying and extracting reliable data. We report here on the automated, envelope-based methodology we have developed. Finally, we compare our catalog results with pre-existing tremor catalogs in the Parkfield area.
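
    The envelope-based methodology is not detailed in the abstract; a minimal sketch of the general idea, rectifying and smoothing a trace and thresholding the result, on synthetic data, might be:

```python
import numpy as np

def envelope(signal, win=50):
    """Rectify the trace and smooth with a moving average to form an
    amplitude envelope."""
    kernel = np.ones(win) / win
    return np.convolve(np.abs(signal), kernel, mode="same")

# Synthetic seismogram: Gaussian noise plus a tremor-like burst.
rng = np.random.default_rng(2)
trace = rng.normal(0.0, 1.0, 2000)
trace[800:1200] += 5.0 * np.sin(np.linspace(0, 40 * np.pi, 400))

env = envelope(trace)
threshold = 3.0 * np.median(env)        # simple noise-based threshold
detected = np.flatnonzero(env > threshold)
print(700 < detected.min() < detected.max() < 1300)  # burst window flagged
```

    Real tremor detection would additionally require coincidence across stations to reject local noise.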

  17. Sequence-of-events-driven automation of the deep space network

    NASA Technical Reports Server (NTRS)

    Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.

    1996-01-01

    In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.

  18. Sequence-of-Events-Driven Automation of the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.

    1996-01-01

    In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.

  19. Novel Automated Blood Separations Validate Whole Cell Biomarkers

    PubMed Central

    Burger, Douglas E.; Wang, Limei; Ban, Liqin; Okubo, Yoshiaki; Kühtreiber, Willem M.; Leichliter, Ashley K.; Faustman, Denise L.

    2011-01-01

    Background Progress in clinical trials in infectious disease, autoimmunity, and cancer is stymied by a dearth of successful whole cell biomarkers for peripheral blood lymphocytes (PBLs). Successful biomarkers could help to track drug effects at early time points in clinical trials to prevent costly trial failures late in development. One major obstacle is the inaccuracy of Ficoll density centrifugation, the decades-old method of separating PBLs from the abundant red blood cells (RBCs) of fresh blood samples. Methods and Findings To replace the Ficoll method, we developed and studied a novel blood-based magnetic separation method. The magnetic method strikingly surpassed Ficoll in viability, purity and yield of PBLs. To reduce labor, we developed an automated platform and compared two magnet configurations for cell separations. These more accurate and labor-saving magnet configurations allowed the lymphocytes to be tested in bioassays for rare antigen-specific T cells. The automated method succeeded at identifying 79% of patients with the rare PBLs of interest as compared with Ficoll's uniform failure. We validated improved upfront blood processing and show accurate detection of rare antigen-specific lymphocytes. Conclusions Improving, automating and standardizing lymphocyte detections from whole blood may facilitate development of new cell-based biomarkers for human diseases. Improved upfront blood processes may lead to broad improvements in monitoring early trial outcome measurements in human clinical trials. PMID:21799852

  20. The Salcher landslide observatory: a new long-term monitoring site in Austria

    NASA Astrophysics Data System (ADS)

    Canli, Ekrem; Engels, Alexander; Glade, Thomas; Schweigl, Joachim; Bertagnoli, Michael

    2016-04-01

    Landslides pose a significant hazard in the federal district of Lower Austria. The Geological Survey of Lower Austria is responsible for detailed site investigations as well as the planning and installation of protective measures. The most landslide-prone area in Lower Austria is within the Rhenodanubian Flyschzone, whose materials consist of alternations of fine-grained layers (clayey shales, silty shales, marls) and sandstones. It exhibits over 6200 landslides within an area of approx. 1300 km². For many areas susceptible to landsliding, protection works are not feasible or simply too costly. Therefore, monitoring systems have been installed in the past; most of them, however, are not operated automatically and require field visits for data readouts. Thus, it is difficult to establish any relation between initiating and controlling factors to gain a comprehensive understanding of the underlying process mechanism that is essential for any early warning applications. Here, we present the design and first results of an automated landslide monitoring system in Gresten (Lower Austria). The deep-seated, slow-moving Salcher landslide extends over approx. 8000 m² and is situated adjacent to residential buildings and infrastructure. This monitoring setup is designed to run for at least a decade to allow investigations of long-term sliding dynamics and patterns. Historically, the Salcher landslide has shown shorter phases with accelerated movements followed by longer phases with barely any movement. Those periods of inactivity commonly exceed regular project durations, so it is important to cover longer periods. Similar slope dynamics can be observed in many parts of the world, so this monitoring may also improve the understanding of landslides with infrequent movement patterns. The monitoring setup consists of surface as well as subsurface installations.
    All installations are connected to permanent power supply, take their respective readings at fixed time intervals and are embedded within a WiFi network. All measured data are sent immediately to a server in Vienna, so all information is available in real time. Surface monitoring devices include a meteorological station measuring rainfall, temperature, radiation and air pressure, and a permanent long-range Terrestrial Laserscanning (pTLS) station performing a high-resolution scan of the entire landslide surface once a day. The subsurface devices include TDR probes and a fully automated geoelectrical monitoring profile for analyzing the spatial distribution of resistivity changes (attributed to changes in soil moisture) over the entire length of the landslide. Along this longitudinal profile, four piezometers are installed to monitor groundwater fluctuations. This is accompanied by an automated inclinometer chain for assessing horizontal displacements in the subsurface. The presentation will focus on the first results of the monitoring system and will highlight ongoing and future work tasks including data processing, analysis and visualization within a web-based platform. The overall goal of the described system is to enable authorized users and decision makers to utilize real-time data and analysis results to issue alarms if potentially hazardous changes are recorded.
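
    As a toy illustration of the alarm idea mentioned above (not the actual system; thresholds and readings are hypothetical), a simple rate-of-change check over inclinometer readings could look like:

```python
def displacement_alarm(readings, rate_limit=2.0):
    """Return indices of consecutive readings (mm) whose change per
    interval exceeds rate_limit, i.e. candidate acceleration events."""
    return [i for i in range(1, len(readings))
            if abs(readings[i] - readings[i - 1]) > rate_limit]

series = [0.0, 0.3, 0.5, 3.1, 3.2]  # sudden 2.6 mm jump at index 3
print(displacement_alarm(series))   # [3]
```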

  1. Using microwave Doppler radar in automated manufacturing applications

    NASA Astrophysics Data System (ADS)

    Smith, Gregory C.

    Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). 
By advancing the state of the art in manufacturing automation, the studies may help stimulate future growth in industrial productivity, which also promises to fuel economic growth and promote economic stability. The study also benefits the Department of Industrial Technology at Iowa State University and the field of Industrial Technology by contributing to the ongoing "smart" machine research program within the Department of Industrial Technology and by stimulating research into new sensor technologies within the University and within the field of Industrial Technology.

  2. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It achieved high accuracy while maximizing observer independence and reducing analysis time, and is thus well suited for clinical routine.
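
    As a small illustration of the correlation analysis reported above (the data here are synthetic, not the study's), Pearson's r between paired manual and automated measurements can be computed with NumPy:

```python
import numpy as np

# Synthetic paired measurements for ten subjects: manual hippocampal
# volume (mL) vs. an automated atrophy score (higher = more atrophy),
# constructed so the two vary inversely.
manual = np.array([3.1, 2.8, 2.5, 3.0, 2.2, 2.9, 2.0, 2.6, 2.4, 2.7])
noise = np.array([0.02, -0.01, 0.03, 0.0, -0.02, 0.01, 0.02, -0.03, 0.0, 0.01])
auto_score = -manual + noise

r = np.corrcoef(manual, auto_score)[0, 1]
print(r < -0.9)  # strong negative correlation, as expected by construction
```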

  3. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  4. On-line monitoring of methanol and methyl formate in the exhaust gas of an industrial formaldehyde production plant by a mid-IR gas sensor based on tunable Fabry-Pérot filter technology.

    PubMed

    Genner, Andreas; Gasser, Christoph; Moser, Harald; Ofner, Johannes; Schreiber, Josef; Lendl, Bernhard

    2017-01-01

    On-line monitoring of key chemicals in an industrial production plant ensures economic operation, guarantees the desired product quality, and provides additional in-depth information on the involved chemical processes. For that purpose, rapid, rugged, and flexible measurement systems at reasonable cost are required. Here, we present the application of a flexible mid-IR filtometer for industrial gas sensing. The developed prototype consists of a modulated thermal infrared source, a temperature-controlled gas cell for absorption measurements, and an integrated device consisting of a Fabry-Pérot interferometer and a pyroelectric mid-IR detector. The prototype was calibrated in the research laboratory at TU Wien for measuring methanol and methyl formate in the concentration ranges of 660 to 4390 ppmV and 747 to 4610 ppmV, respectively. Subsequently, the prototype was transferred and installed at the project partner Metadynea Austria GmbH, linked to their Process Control System via a dedicated micro-controller, and used for on-line monitoring of the process off-gas. Up to five process streams were sequentially monitored in a fully automated manner. The obtained readings for methanol and methyl formate concentrations provided useful information on the efficiency and correct functioning of the process plant. Of special interest for industry is the now added capability to monitor the start-up phase and process irregularities with high time resolution (5 s).
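
    The calibration procedure is not detailed in the abstract; a minimal sketch of a linear calibration from hypothetical absorbance/concentration pairs might look like:

```python
import numpy as np

# Hypothetical calibration pairs: detector absorbance vs. methanol ppmV
# (values invented for illustration, spanning the stated 660-4390 range).
absorbance = np.array([0.10, 0.33, 0.48, 0.66])
ppmv = np.array([660.0, 2200.0, 3200.0, 4390.0])

# Linear least-squares fit: concentration = a * absorbance + b
a, b = np.polyfit(absorbance, ppmv, 1)

def to_ppmv(reading):
    """Convert a raw absorbance reading into a concentration estimate."""
    return a * reading + b

print(round(to_ppmv(0.48)))  # within a few ppmV of the 3200 calibration point
```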

  5. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. 
It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
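    The closed-loop matching the toolkit performs can be sketched roughly as follows; the event names, one-way light time and tolerance are hypothetical, not actual SOE fields:

```python
from datetime import datetime, timedelta

# Toy closed-loop matcher: pair each predicted uplink event with the
# telemetry point expected to confirm it, offset by the one-way light
# time (OWLT), without real-time intervention by the user.
OWLT = timedelta(minutes=20)          # assumed one-way light time
TOLERANCE = timedelta(seconds=30)

def close_the_loop(soe_events, telemetry):
    """soe_events: [(exec_time, command, verify_channel)]
    telemetry:   [(earth_receive_time, channel, value)]
    Returns each command paired with its confirming telemetry value
    (or None if no sample arrived in the expected window)."""
    matched = []
    for exec_time, command, channel in soe_events:
        expect = exec_time + OWLT     # expected confirming downlink
        value = next(
            (v for t, ch, v in telemetry
             if ch == channel
             and expect - TOLERANCE <= t <= expect + TOLERANCE),
            None,
        )
        matched.append((command, value))
    return matched

t0 = datetime(1994, 1, 1, 12, 0, 0)
soe = [(t0, "HTR_ON", "HTR_STATE")]     # hypothetical command/channel
tlm = [(t0 + OWLT, "HTR_STATE", "ON")]
print(close_the_loop(soe, tlm))  # [('HTR_ON', 'ON')]
```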

  6. Simplifying operations with an uplink/downlink integration toolkit

    NASA Astrophysics Data System (ADS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-11-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. 
It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.

  7. Digitise This! A Quick and Easy Remote Sensing Method to Monitor the Daily Extent of Dredge Plumes

    PubMed Central

    Evans, Richard D.; Murray, Kathy L.; Field, Stuart N.; Moore, James A. Y.; Shedrawi, George; Huntley, Barton G.; Fearns, Peter; Broomhall, Mark; McKinna, Lachlan I. W.; Marrable, Daniel

    2012-01-01

    Technological advancements in remote sensing and GIS have improved natural resource managers’ abilities to monitor large-scale disturbances. In an era when many processes are heading towards automation, this study regresses to simple techniques to bridge a gap left by the advancement of technology. The near-daily monitoring of dredge plume extent is common practice using Moderate Resolution Imaging Spectroradiometer (MODIS) imagery and associated algorithms to predict the total suspended solids (TSS) concentration in the surface waters originating from floods and dredge plumes. Unfortunately, these methods cannot determine the difference between dredge plume and benthic features in shallow, clear water. This case study at Barrow Island, Western Australia, uses hand digitising to demonstrate the ability of human interpretation to determine this difference with a level of confidence, and compares the method to contemporary TSS methods. Hand digitising was quick, cheap and required very little training of staff to complete. Results of ANOSIM R statistics show that remote-sensing-derived TSS provided similar spatial results when thresholded to at least 3 mg L⁻¹. However, remote-sensing-derived TSS consistently provided false-positive readings of shallow benthic features as plume at thresholds up to 6 mg L⁻¹, and began providing false negatives (excluding actual plume) at thresholds as low as 4 mg L⁻¹. Semi-automated processes that estimate plume concentration and distinguish between plumes and shallow benthic features without the arbitrary nature of human interpretation would be preferred as a plume monitoring method. However, at this stage, the hand digitising method is very useful, is more accurate at determining plume boundaries over shallow benthic features, and is accessible to all levels of management with basic training. PMID:23240055
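    The thresholding comparison described above can be illustrated with a toy one-dimensional transect; the TSS values and plume labels below are invented, with one shallow benthic pixel producing the kind of false positive the study reports:

```python
def classify_plume(tss_map, threshold):
    """Label each pixel as plume (True) when predicted TSS meets the
    threshold, mimicking a thresholded remote-sensing product."""
    return [tss >= threshold for tss in tss_map]

def false_rates(predicted, truth):
    """False-positive and false-negative fractions relative to a
    hand-digitised 'truth' mask."""
    fp = sum(p and not t for p, t in zip(predicted, truth)) / len(truth)
    fn = sum(t and not p for p, t in zip(predicted, truth)) / len(truth)
    return fp, fn

# Toy transect: predicted TSS (mg/L) and hand-digitised plume labels.
# The third pixel (TSS 5.0, shallow benthic feature) is not plume.
tss   = [1.0, 2.5, 5.0, 6.5, 8.0]
truth = [False, False, False, True, True]
print(false_rates(classify_plume(tss, 4.0), truth))  # (0.2, 0.0)
```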

  8. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.
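    A minimal stand-in for such a distance measure, assuming a simple per-sensor deviation score (the paper's actual measure is not reproduced here), shows how anomaly detection and attention focusing fit together:

```python
import math

def zscores(history, current):
    """Deviation of each sensor's current reading from its nominal
    history, in units of standard deviation (a toy distance measure)."""
    scores = {}
    for sensor, values in history.items():
        mean = sum(values) / len(values)
        sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
        scores[sensor] = abs(current[sensor] - mean) / sd if sd else 0.0
    return scores

def focus_attention(scores, threshold=3.0):
    """Flag anomalous sensors and rank them, most deviant first, so a
    human operator sees the strongest signal instead of every
    partially redundant indication."""
    flagged = [s for s, z in scores.items() if z > threshold]
    return sorted(flagged, key=lambda s: -scores[s])

# Hypothetical sensor histories and a current snapshot
history = {"temp": [20, 21, 19, 20], "pressure": [100, 101, 99, 100]}
current = {"temp": 20.5, "pressure": 130}
print(focus_attention(zscores(history, current)))  # ['pressure']
```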

  9. Development of a fully automated network system for long-term health-care monitoring at home.

    PubMed

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  10. Cognitive Support During High-Consequence Episodes of Care in Cardiovascular Surgery.

    PubMed

    Conboy, Heather M; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Christov, Stefan C; Goldman, Julian M; Yule, Steven J; Zenati, Marco A

    2017-03-01

    Despite significant efforts to reduce preventable adverse events in medical processes, such events continue to occur at unacceptable rates. This paper describes a computer science approach that uses formal process modeling to provide situationally aware monitoring and management support to medical professionals performing complex processes. These process models represent both normative and non-normative situations, and are validated by rigorous automated techniques such as model checking and fault tree analysis, in addition to careful review by experts. Context-aware Smart Checklists are then generated from the models, providing cognitive support during high-consequence surgical episodes. The approach is illustrated with a case study in cardiovascular surgery.

  11. A method for the automated long-term monitoring of three-spined stickleback Gasterosteus aculeatus shoal dynamics.

    PubMed

    Kleinhappel, T K; Al-Zoubi, A; Al-Diri, B; Burman, O; Dickinson, P; John, L; Wilkinson, A; Pike, T W

    2014-04-01

    This paper describes and evaluates a flexible, non-invasive tagging system for the automated identification and long-term monitoring of individual three-spined sticklebacks Gasterosteus aculeatus. The system is based on barcoded tags, which can be reliably and robustly detected and decoded to provide information on an individual's identity and location. Because large numbers of fish can be individually tagged, it can be used to monitor individual- and group-level dynamics within fish shoals. © 2014 The Fisheries Society of the British Isles.

  12. Image processing for identification and quantification of filamentous bacteria in in situ acquired images.

    PubMed

    Dias, Philipe A; Dunkel, Thiemo; Fajado, Diego A S; Gallegos, Erika de León; Denecke, Martin; Wiedemann, Philipp; Schneider, Fabio K; Suhr, Hajo

    2016-06-11

    In the activated sludge process, problems of filamentous bulking and foaming can occur due to overgrowth of certain filamentous bacteria. Nowadays, these microorganisms are typically monitored by means of light microscopy, commonly combined with staining techniques. These methods, however, are susceptible to human error and subjectivity, and are limited by the use of discontinuous microscopy. The in situ microscope appears to be a suitable tool for continuous monitoring of filamentous bacteria, providing real-time examination and automated analysis while eliminating sampling, preparation and transport of samples. In this context, a proper image processing algorithm is proposed for automated recognition and measurement of filamentous objects. This work introduces a method for real-time evaluation of images without any staining, phase-contrast or dilution techniques, unlike studies reported in the literature. Moreover, we introduce an algorithm which estimates the total extended filament length based on geodesic distance calculation. For a period of twelve months, samples from an industrial activated sludge plant were collected weekly and imaged without any prior conditioning, replicating real environment conditions. Trends of filament growth rate, the most important parameter for decision making, are correctly identified. For reference images whose filaments were marked by specialists, the algorithm correctly recognized 72% of the filament pixels, with a false positive rate of at most 14%. An average execution time of 0.7 s per image was achieved. Experiments have shown that the designed algorithm provided a suitable quantification of filaments when compared with human perception and standard methods. The algorithm's average execution time proved its suitability for being optimally mapped into a computational architecture to provide real-time monitoring.
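    The geodesic-distance idea for estimating extended filament length can be sketched on a toy skeleton; this is a simplified reading of the approach (geodesic diameter of a skeletonised filament), not the published algorithm:

```python
import heapq
import math

def geodesic_length(pixels):
    """Approximate the extended length of one skeletonised filament as
    the largest geodesic (within-object) distance between any two of
    its pixels; 8-connected steps cost 1 or sqrt(2)."""
    pset = set(pixels)

    def dists(start):
        # Dijkstra over the pixel graph restricted to the filament
        d = {start: 0.0}
        heap = [(0.0, start)]
        while heap:
            dist, (r, c) = heapq.heappop(heap)
            if dist > d[(r, c)]:
                continue
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == dc == 0:
                        continue
                    nb = (r + dr, c + dc)
                    if nb in pset:
                        nd = dist + (math.sqrt(2) if dr and dc else 1.0)
                        if nd < d.get(nb, float("inf")):
                            d[nb] = nd
                            heapq.heappush(heap, (nd, nb))
        return d

    far = dists(pixels[0])        # two-sweep diameter heuristic
    a = max(far, key=far.get)
    return max(dists(a).values())

# A bent filament: a horizontal run followed by a diagonal tail.
filament = [(0, 0), (0, 1), (0, 2), (1, 3), (2, 4)]
print(round(geodesic_length(filament), 2))  # 4.83
```

    Summing this over all connected skeleton components would give a total extended filament length in the spirit of the paper's parameter.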

  13. Science Goal Monitor: Science Goal Driven Automation for NASA Missions

    NASA Technical Reports Server (NTRS)

    Koratkar, Anuradha; Grosvenor, Sandy; Jung, John; Pell, Melissa; Matusow, David; Bailyn, Charles

    2004-01-01

    Infusion of automation technologies into NASA's future missions will be essential because of the need to: (1) effectively handle an exponentially increasing volume of scientific data, (2) successfully meet dynamic, opportunistic scientific goals and objectives, and (3) substantially reduce mission operations staff and costs. While much effort has gone into automating routine spacecraft operations to reduce human workload and hence costs, applying intelligent automation to the science side, i.e., science data acquisition, data analysis and reactions to that data analysis in a timely and still scientifically valid manner, has been relatively under-emphasized. In order to introduce science-driven automation in missions, we must be able to: capture and interpret the science goals of observing programs; represent those goals in machine-interpretable language; and allow spacecraft onboard systems to autonomously react to the scientist's goals. In short, we must teach our platforms to dynamically understand, recognize, and react to the scientists' goals. The Science Goal Monitor (SGM) project at NASA Goddard Space Flight Center is a prototype software tool being developed to determine the best strategies for implementing science goal driven automation in missions. The tools being developed in SGM improve the ability to monitor and react to the changing status of scientific events. The SGM system enables scientists to specify what to look for and how to react in descriptive rather than technical terms. The system monitors streams of science data to identify occurrences of key events previously specified by the scientist. When an event occurs, the system autonomously coordinates the execution of the scientist's desired reactions. Through SGM, we will improve our understanding of the capabilities needed onboard for success, develop metrics to understand the potential increase in science returns, and develop an operational prototype so that the perceived risks associated with increased use of automation can be reduced.
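    The monitor-and-react pattern described above might be sketched as a small rule engine; the goal name, threshold and target below are invented for illustration and do not reflect SGM's actual interfaces:

```python
# Toy sketch of goal-driven monitoring in the SGM spirit: a scientist
# states a condition and a reaction in descriptive terms; the monitor
# watches a data stream and fires the reaction autonomously.
class GoalMonitor:
    def __init__(self):
        self.rules = []          # (name, predicate, reaction)
        self.log = []            # names of events that fired

    def watch_for(self, name, predicate, reaction):
        self.rules.append((name, predicate, reaction))

    def ingest(self, sample):
        for name, predicate, reaction in self.rules:
            if predicate(sample):
                self.log.append(name)
                reaction(sample)

monitor = GoalMonitor()
monitor.watch_for(
    "brightening",                                   # hypothetical goal
    lambda s: s["flux"] > 2.0 * s["baseline_flux"],
    lambda s: print(f"re-observe target {s['target']}"),
)
monitor.ingest({"target": "SRC-1", "flux": 5.0, "baseline_flux": 2.0})
print(monitor.log)  # ['brightening']
```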

  14. An automated image processing routine for segmentation of cell cytoplasms in high-resolution autofluorescence images

    NASA Astrophysics Data System (ADS)

    Walsh, Alex J.; Skala, Melissa C.

    2014-02-01

    The heterogeneity of genotypes and phenotypes within cancers is correlated with disease progression and drug-resistant cellular sub-populations. Therefore, robust techniques capable of probing majority and minority cell populations are important both for cancer diagnostics and therapy monitoring. Herein, we present a modified CellProfiler routine to isolate cytoplasmic fluorescence signal on a single cell level from high resolution auto-fluorescence microscopic images.

  15. An Analysis of the Navy Regional Data Automation Center (NARDAC) chargeback System

    DTIC Science & Technology

    1986-09-01

    addition, operational control is concerned with performing predefined activities whereas management control relates to the organization's goals and...In effect, the management control system monitors the progress of operations and alerts the "appropriate management level" when performance as measured...architecture, the financial control processes, and the audit function (Brandon, 1978; Anderson, 1983). In an operating DP environment, however, non-financial

  16. Satellite temperature monitoring and prediction system

    NASA Technical Reports Server (NTRS)

    Barnett, U. R.; Martsolf, J. D.; Crosby, F. L.

    1980-01-01

    The paper describes the Florida Satellite Freeze Forecast System (SFFS) in its current state. All data collection options have been demonstrated, and data collected over a three year period have been stored for future analysis. Presently, specific minimum temperature forecasts are issued routinely from November through March. The procedures for issuing these forecasts are discussed. The automated data acquisition and processing system is described, and the physical and statistical models employed are examined.

  17. In-flight control and communication architecture of the GLORIA imaging limb sounder on atmospheric research aircraft

    NASA Astrophysics Data System (ADS)

    Kretschmer, E.; Bachner, M.; Blank, J.; Dapp, R.; Ebersoldt, A.; Friedl-Vallon, F.; Guggenmoser, T.; Gulde, T.; Hartmann, V.; Lutz, R.; Maucher, G.; Neubert, T.; Oelhaf, H.; Preusse, P.; Schardt, G.; Schmitt, C.; Schönfeld, A.; Tan, V.

    2015-06-01

    The Gimballed Limb Observer for Radiance Imaging of the Atmosphere (GLORIA), a Fourier-transform-spectrometer-based limb spectral imager, operates on high-altitude research aircraft to study the transit region between the troposphere and the stratosphere. It is one of the most sophisticated systems to be flown on research aircraft in Europe, requiring constant monitoring and human intervention in addition to an automation system. To ensure proper functionality and interoperability on multiple platforms, a flexible control and communication system was laid out. The architectures of the communication system as well as the protocols used are reviewed. The integration of this architecture in the automation process as well as the scientific campaign flight application context are discussed.

  18. In-flight control and communication architecture of the GLORIA imaging limb-sounder on atmospheric research aircraft

    NASA Astrophysics Data System (ADS)

    Kretschmer, E.; Bachner, M.; Blank, J.; Dapp, R.; Ebersoldt, A.; Friedl-Vallon, F.; Guggenmoser, T.; Gulde, T.; Hartmann, V.; Lutz, R.; Maucher, G.; Neubert, T.; Oelhaf, H.; Preusse, P.; Schardt, G.; Schmitt, C.; Schönfeld, A.; Tan, V.

    2015-02-01

    The Gimballed Limb Observer for Radiance Imaging of the Atmosphere (GLORIA), a Fourier-transform-spectrometer-based limb spectral imager, operates on high-altitude research aircraft to study the transit region between the troposphere and the stratosphere. It is one of the most sophisticated systems to be flown on research aircraft in Europe, requiring constant monitoring and human intervention in addition to an automation system. To ensure proper functionality and interoperability on multiple platforms, a flexible control and communication system was laid out. The architectures of the communication system as well as the protocols used are reviewed. The integration of this architecture in the automation process as well as the scientific campaign flight application context are discussed.

  19. An automated calibration laboratory - Requirements and design approach

    NASA Technical Reports Server (NTRS)

    O'Neil-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden), operates a diverse fleet of research aircraft which are heavily instrumented to provide both real time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  20. Fault management for the Space Station Freedom control center

    NASA Technical Reports Server (NTRS)

    Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet

    1992-01-01

    This paper describes model based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
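    The digraph failure-model idea can be sketched as follows, assuming a toy cause-to-effect graph (this is an illustration of the general technique, not the NASA digraph evaluation tool):

```python
# Edges point from cause to effect; a failure seen at a node may
# originate at any upstream ancestor. With several fault indications,
# the candidate set is the intersection of their ancestor sets.
def ancestors(digraph, node):
    """All nodes with a directed path to `node`, plus the node itself."""
    seen, stack = set(), [node]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(src for src, dsts in digraph.items() if n in dsts)
    return seen

def isolate(digraph, indications):
    """Failure sources consistent with every fault indication."""
    sets = [ancestors(digraph, i) for i in indications]
    return set.intersection(*sets)

# Hypothetical subsystem: the pump feeds both sensors, the valve only one.
digraph = {
    "pump": {"flow_sensor", "pressure_sensor"},
    "valve": {"flow_sensor"},
}
print(isolate(digraph, ["flow_sensor", "pressure_sensor"]))  # {'pump'}
```

    With a single indication on the flow sensor alone, both the pump and the valve would remain candidates, which is why combining multiple fault indications narrows the isolation.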

  1. Meteorological data for selected sites along the Colorado River Corridor, Arizona, 2011-13

    USGS Publications Warehouse

    Caster, Joshua J.; Dealy, Timothy P.; Andrews, Timothy; Fairley, Helen C.; East, Amy E.; Sankey, Joel B.

    2014-01-01

    This report presents data from 14 automated weather stations collected as part of an ongoing monitoring program within the Grand Canyon National Park and Glen Canyon Recreation Area along the Colorado River Corridor in Arizona. Weather data presented in this document include precipitation, wind speed, maximum wind gusts, wind direction, barometric pressure, relative humidity, and air temperature collected by the Grand Canyon Monitoring and Research Center at 4-minute intervals between January 1, 2011, and December 31, 2013, using automated weather stations consisting of a data logger and a weather transmitter equipped with a piezoelectric sensor, ultrasonic transducers, and capacitive thermal and pressure sensors. Data collection was discontinuous because of station additions, station removals, changes in permits, and equipment failure. A large volume of data was collected for each station. These data are part of a larger research effort focused on physical processes affecting landscapes and archaeological-site stability in the Colorado River Corridor—both natural processes (including meteorological events) and those related to the Glen Canyon Dam operations. Meteorological conditions during the study interval were warmer and drier than is typical, due to ongoing drought conditions during the time period studied. The El Niño/Southern Oscillation was primarily in a neutral state during the reporting period.

  2. Bridge scour monitoring methods at three sites in Wisconsin

    USGS Publications Warehouse

    Walker, John F.; Hughes, Peter E.

    2005-01-01

    Of the nearly 11,500 bridges in Wisconsin, 89 have been assessed with critical scour conditions. The U.S. Geological Survey, in cooperation with the Wisconsin Department of Transportation, the Marathon County Highway Department, and the Jefferson County Highway Department, performed routine monitoring of streambed elevations for three bridges. Two monitoring approaches were employed: (1) manual monitoring using moderately simple equipment, and (2) automated monitoring, using moderately sophisticated electronic equipment. The results from all three sites demonstrate that both techniques can produce reasonable measurements of streambed elevation. The manual technique has a lower annual operating cost, and is useful for cases where documentation of long-term trends is desired. The automated technique has a higher annual operating cost and is useful for real-time monitoring of episodic events with short time durations. 

  3. Automated measurement of office, home and ambulatory blood pressure in atrial fibrillation.

    PubMed

    Kollias, Anastasios; Stergiou, George S

    2014-01-01

    1. Hypertension and atrial fibrillation (AF) often coexist and are strong risk factors for stroke. Current guidelines for blood pressure (BP) measurement in AF recommend repeated measurements using the auscultatory method, whereas the accuracy of the automated devices is regarded as questionable. This review presents the current evidence on the feasibility and accuracy of automated BP measurement in the presence of AF and the potential for automated detection of undiagnosed AF during such measurements. 2. Studies evaluating the use of automated BP monitors in AF are limited and have significant heterogeneity in methodology and protocols. Overall, the oscillometric method is feasible for static (office or home) and ambulatory use and appears to be more accurate for systolic than diastolic BP measurement. 3. Given that systolic hypertension is particularly common and important in the elderly, the automated BP measurement method may be acceptable for self-home and ambulatory monitoring, but not for professional office or clinic measurement. 4. An embedded algorithm for the detection of asymptomatic AF during routine automated BP measurement with high diagnostic accuracy has been developed and appears to be a useful screening tool for elderly hypertensives. © 2013 Wiley Publishing Asia Pty Ltd.

  4. Laboratory Automation and Intra-Laboratory Turnaround Time: Experience at the University Hospital Campus Bio-Medico of Rome.

    PubMed

    Angeletti, Silvia; De Cesaris, Marina; Hart, Jonathan George; Urbano, Michele; Vitali, Massimiliano Andrea; Fragliasso, Fulvio; Dicuonzo, Giordano

    2015-12-01

    Intra-laboratory turnaround time (TAT) is a key indicator of laboratory performance. Improving TAT is a complex task requiring staff education, equipment acquisition, and adequate TAT monitoring. The aim of the present study was to evaluate the intra-laboratory TAT after laboratory automation implementation (June 2013-June 2014) and to compare it to that in the preautomation period (July 2012-May 2013). Intra-laboratory TAT was evaluated both as the mean TAT registered and as the percentage of outlier (OP) exams. The mean TAT was 36, 38, and 34 min during the study periods, respectively. These values met the TAT goal established at 45 min. The OP, calculated at 45 min as well as at 60 min, decreased from 26 to 21 and from 11 to 5, respectively. A focused analysis of the blood cell count, troponin I, and prothrombin (PT) tests showed that the TAT improvement was more evident for tests requiring a longer preanalytical process. The follow-up of TAT from June 2013 to June 2014 revealed a reduction of both the mean TAT and the OP exams after automation implementation, and showed that automation more strongly affects tests whose preanalytical phase includes centrifugation of the sample, such as troponin I and PT. © 2015 Society for Laboratory Automation and Screening.
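    The two indicators used in the study, mean TAT and outlier percentage, can be computed as sketched below; the sample times are invented:

```python
def tat_indicators(tat_minutes, goal):
    """Mean intra-laboratory TAT and the outlier percentage (OP):
    the share of exams exceeding the goal TAT, in percent."""
    mean_tat = sum(tat_minutes) / len(tat_minutes)
    op = 100.0 * sum(t > goal for t in tat_minutes) / len(tat_minutes)
    return mean_tat, op

# Hypothetical per-exam TAT values (minutes)
samples = [30, 32, 35, 40, 44, 50, 61, 28, 33, 47]
mean_tat, op_45 = tat_indicators(samples, goal=45)
_, op_60 = tat_indicators(samples, goal=60)
print(round(mean_tat), op_45, op_60)  # 40 30.0 10.0
```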

  5. Automated monitoring of recovered water quality

    NASA Technical Reports Server (NTRS)

    Misselhorn, J. E.; Hartung, W. H.; Witz, S. W.

    1974-01-01

    A laboratory prototype water quality monitoring system provides automatic, online monitoring of the chemical, physical, and bacteriological properties of recovered water, and signals malfunctions in the water recovery system. Wherever possible, the monitor incorporates commercially available sensors, suitably modified.

  6. Real-Time Human Ambulation, Activity, and Physiological Monitoring: Taxonomy of Issues, Techniques, Applications, Challenges and Limitations

    PubMed Central

    Khusainov, Rinat; Azzi, Djamel; Achumba, Ifeyinwa E.; Bersch, Sebastian D.

    2013-01-01

    Automated methods of real-time, unobtrusive, human ambulation, activity, and wellness monitoring and data analysis using various algorithmic techniques have been subjects of intense research. The general aim is to devise effective means of addressing the demands of assisted living, rehabilitation, and clinical observation and assessment through sensor-based monitoring. The research studies have resulted in a large amount of literature. This paper presents a holistic articulation of the research studies and offers comprehensive insights along four main axes: distribution of existing studies; monitoring device framework and sensor types; data collection, processing and analysis; and applications, limitations and challenges. The aim is to present a systematic and most complete study of literature in the area in order to identify research gaps and prioritize future research directions. PMID:24072027

  7. The influence of drilling process automation on improvement of blasting works quality in open pit mining

    NASA Astrophysics Data System (ADS)

    Bodlak, Maciej; Dmytryk, Dominik; Mertuszka, Piotr; Szumny, Marcin; Tomkiewicz, Grzegorz

    2018-01-01

    The article describes a monitoring system for the blasthole drilling process called HNS (Hole Navigation System), which was used in blasting works performed by Maxam Poland Ltd. Developed by Atlas Copco, the HNS system uses satellite data to allow very accurate mapping of the designed grid of blastholes. The article presents the results of several measurements of ground vibrations triggered by blasting designed and performed both with traditional technology and with the HNS system, and offers first observations in this matter.

  8. Computer software configuration description, 241-AY and 241-AZ tank farm MICON automation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkelman, W.D.

    This document describes the configuration process, the choices and conventions used during configuration activities, and the issues involved in making changes to the configuration. It includes the master listings of the tag definitions, which must be revised to authorize any changes. Revision 2 incorporates minor changes to ensure the documented setpoints accurately reflect the limits (including an exhaust stack flow of 800 scfm) established in OSD-T-151-00019. The MICON DCS software controls and monitors the instrumentation and equipment associated with plant systems and processes.

  9. Biopharmaceutical production: Applications of surface plasmon resonance biosensors.

    PubMed

    Thillaivinayagalingam, Pranavan; Gommeaux, Julien; McLoughlin, Michael; Collins, David; Newcombe, Anthony R

    2010-01-15

    Surface plasmon resonance (SPR) permits the quantitative analysis of therapeutic antibody concentrations and of impurities including bacteria, Protein A, Protein G and small-molecule ligands leached from chromatography media. SPR has gained popularity within the biopharmaceutical industry because of the automated, label-free, real-time interaction analysis it enables. Its application areas in assessing protein interactions and developing analytical methods for biopharmaceutical downstream process development, quality control, and in-process monitoring are reviewed. 2009 Elsevier B.V. All rights reserved.

  10. Fluorescence Spectroscopy and Chemometric Modeling for Bioprocess Monitoring

    PubMed Central

    Faassen, Saskia M.; Hitzmann, Bernd

    2015-01-01

    On-line sensors for the detection of crucial process parameters are desirable for the monitoring, control and automation of processes in the biotechnology, food and pharma industries. Fluorescence spectroscopy is a highly developed and non-invasive technique that enables on-line measurements of substrate and product concentrations or the identification of characteristic process states. During a cultivation process, significant changes occur in the fluorescence spectra. By means of chemometric modeling, prediction models can be calculated and applied for process supervision and control to increase the quality and productivity of bioprocesses. A range of applications for different microorganisms and analytes has been proposed in recent years. This contribution provides an overview of different analysis methods for the measured fluorescence spectra and the model-building chemometric methods used for various microbial cultivations. Most of these processes are observed using the BioView® Sensor, thanks to its robustness and insensitivity to adverse process conditions. Beyond that, the PLS method is the most frequently used chemometric method for the calculation of process models and prediction of process variables. PMID:25942644

  11. Communication Security for Control Systems in Smart Grid

    NASA Astrophysics Data System (ADS)

    Robles, Rosslin John; Kim, Tai-Hoon

    As an example of a control system, a Supervisory Control and Data Acquisition (SCADA) system can be relatively simple, such as one that monitors the environmental conditions of a small office building, or incredibly complex, such as one that monitors all the activity in a nuclear power plant or the activity of a municipal water system. SCADA systems are basically process control systems, designed to automate systems such as traffic control, power grid management, waste processing, etc. Connecting SCADA to the Internet provides many advantages in terms of control, data viewing and report generation. SCADA infrastructures such as the electricity grid can also be part of a Smart Grid. However, connecting SCADA to a public network raises many security issues. To address them, a SCADA communication security solution is proposed.

  12. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-01-01

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  13. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-12-31

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  14. Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing

    NASA Astrophysics Data System (ADS)

    Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander

    2005-09-01

    The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. 
We will also explore the tool's capability for characterizing multiple geometric parameters of the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.

  15. Environment Monitor

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Viking landers touched down on Mars equipped with a variety of systems to conduct automated research, each carrying a compact but highly sophisticated instrument for analyzing Martian soil and atmosphere. The instrument, a gas chromatograph/mass spectrometer (GC/MS), had to be small, lightweight, shock resistant, highly automated and extremely sensitive, yet require minimal electrical power. Viking Instruments Corporation commercialized this technology and targeted environmental monitoring as its primary market, especially toxic and hazardous waste site monitoring. Waste sites often contain chemicals in complex mixtures, and the conventional method of site characterization, taking samples on-site and sending them to a laboratory for analysis, is time consuming and expensive. Other terrestrial applications include explosive detection in airports, drug detection, industrial air monitoring, medical metabolic monitoring and, for the military, detection of chemical warfare agents.

  16. Multi-static Serial LiDAR for Surveillance and Identification of Marine Life at MHK Installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alsenas, Gabriel; Dalgleish, Fraser; Ouyang, Bing

    Final Report for project DE-EE0006787: Multi-static Serial LiDAR for Surveillance and Identification of Marine Life at MHK Installations. This project developed and tested an optical monitoring system prototype that will be suitable for marine and hydrokinetic (MHK) full project lifecycle observation (baseline, commissioning, and decommissioning), with automated real-time classification of marine animals. This system can be deployed to collect pre-installation baseline species observations at a proposed deployment site with minimal post-processing overhead. To satisfy deployed MHK project species of concern (e.g. Endangered Species Act-listed) monitoring requirements, the system provides automated tracking and notification of the presence of managed animals within established perimeters of MHK equipment and provides high resolution imagery of their behavior through a wide range of conditions. During a project's decommissioning stage, the system can remain installed to provide resource managers with post-installation data. Our technology, known as an Unobtrusive Multi-static Serial LiDAR Imager (UMSLI), is a technology transfer of underwater distributed LiDAR imaging technology that preserves the advantages of traditional optical and acoustic solutions while overcoming associated disadvantages for MHK environmental monitoring applications. This new approach is a purposefully-designed, reconfigurable adaptation of an existing technology that can be easily mounted on or around different classes of MHK equipment. The system uses low average power red (638 nm) laser illumination to be invisible and eye-safe to marine animals and is compact and cost effective. The equipment is designed for long term, maintenance-free operations, to inherently generate a sparse primary dataset that only includes detected anomalies (animal presence information), and to allow robust real-time automated animal classification/identification with a low data bandwidth requirement.
Advantages of the technology over others currently being used or considered for MHK monitoring include: unlike a conventional camera, the depth of field is near-infinite and limited by attenuation (approximately 5-8 m) rather than the focal properties of a lens; operation in an adaptive mode that can project a sparse grid of pulses with higher peak power for longer range detection (>10 m) and track animals within a zone of interest with high resolution imagery for identification of marine life at closer range (<5 m); a detection limit and signal-to-noise ratio superior to a camera's, due to rejection of both the backscattering component and the ambient solar background; multiple wide-angle pulsed laser illuminators and bucket detectors that can be flexibly configured to cover a 4-pi-steradian (i.e. omnidirectional) scene volume, while also retrieving 3D features of animal targets from timing information; a processing and classification framework centered around a novel active-learning and incremental classifier that enables accurate automatic identification of a variety of marine animals; a two-tiered monitoring architecture and an invisible-watermarking-based data archiving and retrieval approach that ensures significant data reduction while preserving high fidelity monitoring; and a methodology to train and optimize the classifier for target species of concern, improving site monitoring effectiveness. This technological innovation addresses a high-priority regulatory requirement to observe marine life interaction near MHK projects. Our solution improves resource manager confidence that any interactions between marine animals and equipment are observed in a cost-effective and automated manner. Without EERE funding, this novel application of multi-static LiDAR would not have been available to the MHK community for environmental monitoring.

  17. Online measurement of bead geometry in GMAW-based additive manufacturing using passive vision

    NASA Astrophysics Data System (ADS)

    Xiong, Jun; Zhang, Guangjun

    2013-11-01

    Additive manufacturing based on gas metal arc welding is an advanced technique for depositing fully dense components at low cost. Despite this, techniques to achieve accurate control and automation of the process have not yet been fully developed. Online measurement of the deposited bead geometry is a key problem for reliable control. In this work, a passive vision-sensing system comprising two cameras and composite filtering techniques was proposed for real-time detection of the bead height and width during deposition of thin walls. The nozzle-to-top-surface distance was monitored to eliminate accumulated height errors during the multi-layer deposition process. Various image processing algorithms for extracting the feature parameters were applied and discussed. A calibration procedure for the monitoring system was presented. Validation experiments confirmed the effectiveness of the online measurement system for bead geometry in layered additive manufacturing.
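
    The width measurement itself can be illustrated with a minimal sketch: thresholding one image row and taking the span of bright pixels. This is a toy stand-in, not the authors' composite-filtering algorithm, and the 0.05 mm-per-pixel calibration factor is hypothetical.

```python
import numpy as np

def bead_width_px(row: np.ndarray, threshold: int = 128) -> int:
    """Width of the bright bead region in one image row, in pixels."""
    bright = np.flatnonzero(row > threshold)   # indices of above-threshold pixels
    if bright.size == 0:
        return 0
    return int(bright[-1] - bright[0] + 1)     # span between outermost bright pixels

# Synthetic camera row: dark background with a bright bead 40 px wide at column 30
row = np.full(200, 20, dtype=np.uint8)
row[30:70] = 220
print(bead_width_px(row))                      # 40

# Pixels convert to millimetres via a camera calibration (assumed value here)
width_mm = bead_width_px(row) * 0.05
```

    A real system would apply this per row along the wall and filter outliers caused by arc light, which is what the cameras' composite filters address.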

  18. Heterogeneous access and processing of EO-Data on a Cloud based Infrastructure delivering operational Products

    NASA Astrophysics Data System (ADS)

    Niggemann, F.; Appel, F.; Bach, H.; de la Mar, J.; Schirpke, B.; Dutting, K.; Rucker, G.; Leimbach, D.

    2015-04-01

    To address the challenges of effective data handling faced by small and medium-sized enterprises (SMEs), a cloud-based infrastructure for accessing and processing Earth Observation (EO) data has been developed within the project APPS4GMES (www.apps4gmes.de). To provide homogeneous multi-mission data access, an Input Data Portal (IDP) has been implemented on this infrastructure. The IDP consists of an Open Geospatial Consortium (OGC) conformant catalogue, a consolidation module for format conversion, and an OGC-conformant ordering framework. Metadata from various EO sources with different standards is harvested, transferred to the OGC-conformant Earth Observation Product standard, and inserted into the catalogue by a Metadata Harvester. The IDP can be accessed for search and ordering of the harvested datasets by the services implemented on the cloud infrastructure. Different land-surface services have been realised by the project partners using the implemented IDP and cloud infrastructure. Their results are customer-ready products as well as pre-products (e.g. atmospherically corrected EO data) that serve as a basis for other services. Within the IDP, automated access to ESA's Sentinel-1 Scientific Data Hub has been implemented, so that searching for and downloading SAR data can be performed automatically. With the Sentinel-1 Toolbox and the project's own software, processing of the datasets for further use, for example for Vista's snow monitoring, which delivers input to the flood forecast services, can also be performed automatically. For performance tests of the cloud environment, a sophisticated model-based atmospheric correction and pre-classification service has been implemented. The tests comprised automated, synchronised processing of one entire Landsat 8 (LS-8) coverage of Germany and performance comparisons with standard desktop systems. The results of these tests, showing a performance improvement by a factor of six, proved the high flexibility and computing power of the cloud environment. To make full use of the cloud's capabilities, automated upscaling of the hardware resources has been implemented. Together with the IDP infrastructure, fast and automated processing of various satellite sources into market-ready products can be realised, so that growing customer needs and numbers can be satisfied without loss of accuracy and quality.

  19. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics

    PubMed Central

    Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582

  20. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics.

    PubMed

    Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.
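
    A minimal sketch of the classification step described above, using synthetic stand-ins for the 24 morphological and dynamic features and an off-the-shelf random forest rather than StemCellQC's actual classifier; the group shift of 0.8 is an invented effect size, not from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in for StemCellQC output: 24 features per colony, with the
# unhealthy/dying group shifted in features linked to growth, motility and death.
n_per_group, n_features = 60, 24
healthy = rng.normal(0.0, 1.0, (n_per_group, n_features))
unhealthy = rng.normal(0.8, 1.0, (n_per_group, n_features))
X = np.vstack([healthy, unhealthy])
y = np.array([0] * n_per_group + [1] * n_per_group)   # 0 = healthy, 1 = unhealthy

# Cross-validated accuracy of a generic feature-based classifier
clf = RandomForestClassifier(n_estimators=100, random_state=0)
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

    With features extracted from real time-lapse videos, feature importances from such a model would also indicate which morphological or dynamic measures act as early biomarkers.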

  1. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    NASA Astrophysics Data System (ADS)

    Schovancová, J.; Campana, S.; Di Girolamo, A.; Jézéquel, S.; Ueda, I.; Wenaus, T.; Atlas Collaboration

    2014-06-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in standardizing the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer from the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed with the visual identity of the provided graphical elements in mind, and with re-usability of the visualization components across the different tools. A rich family of filtering and searching options enhancing the available user interfaces comes naturally with this separation of data and visualization layers. With a variety of reliable monitoring data accessible through standardized interfaces, automating actions under well-defined conditions that correlate multiple data sources has become feasible. In this contribution we also discuss the automated exclusion of degraded resources and their automated recovery in various activities.

  2. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  3. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Because data and processing resources in disaster management are distributed, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides efficient, flexible and reliable implementations for encountering different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing on an SOA platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow for acquiring and processing remotely sensed data, detecting fire and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented, using web-based processing of remote sensing imagery from MODIS data. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as a composition engine. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigation of the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.

  4. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique for quantifying protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method that fully automates absolute quantification and substitutes for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. The results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improved Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
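
    The external calibration curve method mentioned above can be sketched as follows: fit a line to the known-amount dilution series, then invert it to map a measured peak area back to an absolute abundance. The dilution-series values below are made up for illustration and do not come from the paper's data sets.

```python
import numpy as np

# Hypothetical dilution series: known amounts (fmol) and measured SRM peak areas
known_fmol = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])
peak_area = np.array([13.0, 24.5, 61.0, 119.0, 242.0, 598.0])  # roughly linear response

# External calibration curve: area = slope * amount + intercept
slope, intercept = np.polyfit(known_fmol, peak_area, deg=1)

def quantify(area: float) -> float:
    """Absolute abundance (fmol) from a measured peak area via the calibration line."""
    return (area - intercept) / slope

# A sample with a peak area near 120 should map back to roughly 1 fmol
estimate = quantify(120.0)
print(f"{estimate:.2f} fmol")
```

    Checking the residuals of this fit is what "ensuring linearity over the measured dynamic range" amounts to in practice: a curved residual pattern would mean the response has left the linear range.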

  5. Human Papillomavirus (HPV) Genotyping: Automation and Application in Routine Laboratory Testing

    PubMed Central

    Torres, M; Fraile, L; Echevarria, JM; Hernandez Novoa, B; Ortiz, M

    2012-01-01

    A large number of assays designed for genotyping human papillomaviruses (HPV) have been developed in recent years. They perform within a wide range of analytical sensitivity and specificity values for the different viral types, and are used for diagnosis, epidemiological studies, evaluation of vaccines, and implementation and monitoring of vaccination programs. Methods for specific genotyping of HPV-16 and HPV-18 are also useful for the prevention of cervical cancer in screening programs. Some commercial tests are, in addition, fully or partially automated. Automation of HPV genotyping presents advantages such as the simplicity of the testing procedure for the operator, the ability to process a large number of samples in a short time, and the reduction of human error from manual operations, allowing better quality assurance and a reduction in cost. The present review collects information about current HPV genotyping tests, with special attention to practical aspects influencing their use in clinical laboratories. PMID:23248734

  6. SYRIAC: The systematic review information automated collection system a data warehouse for facilitating automated biomedical text classification.

    PubMed

    Yang, Jianji J; Cohen, Aaron M; McDonagh, Marian S

    2008-11-06

    Automatic document classification can be valuable in increasing the efficiency of updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and the source data format is inconsistent. To approach this problem, we built an automated system to streamline the required steps, from initial notification of an update in the source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation data sets for SR text mining research.

  7. SYRIAC: The SYstematic Review Information Automated Collection System A Data Warehouse for Facilitating Automated Biomedical Text Classification

    PubMed Central

    Yang, Jianji J.; Cohen, Aaron M.; McDonagh, Marian S.

    2008-01-01

    Automatic document classification can be valuable in increasing the efficiency of updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and the source data format is inconsistent. To approach this problem, we built an automated system to streamline the required steps, from initial notification of an update in the source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation datasets for SR text mining research. PMID:18999194

  8. Satellite freeze forecast system: Executive summary

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D. (Principal Investigator)

    1983-01-01

    A satellite-based temperature monitoring and prediction system, consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations called by that computer, was developed and transferred to the National Weather Service. This satellite freeze forecast system (SFFS) acquires satellite data from either of two sources and surface data from 10 sites, displays the observed data as color-coded thermal maps and tables of automated weather station temperatures, computes predicted thermal maps on request and displays such maps either automatically or manually, archives the acquired data, and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models: a physical energy budget of the soil-atmosphere interface, and a statistical relationship between the sites at which the physical model predicts temperatures and each pixel of the satellite thermal map.
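
    The second (statistical) model can be illustrated with a least-squares fit relating the ten station temperatures to one satellite pixel. All data here are synthetic and the linear form is an assumption about the published relationship, not the SFFS implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical archive: temperatures at the 10 automated stations vs one
# satellite pixel over 300 nights (all values synthetic).
n_nights, n_sites = 300, 10
stations = rng.normal(0.0, 5.0, (n_nights, n_sites))        # deg C at the stations
true_w = rng.uniform(0.0, 1.0, n_sites)
true_w /= true_w.sum()                                      # pixel responds most to nearby stations
pixel = stations @ true_w + rng.normal(0.0, 0.3, n_nights)  # observed pixel temperature

# Least-squares fit of the station-to-pixel relationship (with an intercept column)
A = np.hstack([stations, np.ones((n_nights, 1))])
coef, *_ = np.linalg.lstsq(A, pixel, rcond=None)

# Predict the pixel's temperature from physical-model forecasts for the 10 sites
forecast_sites = rng.normal(-2.0, 3.0, n_sites)
predicted_pixel = np.append(forecast_sites, 1.0) @ coef
print(f"predicted pixel temperature: {predicted_pixel:.1f} deg C")
```

    Repeating this fit per pixel turns the ten point forecasts from the physical model into a full predicted thermal map.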

  9. A scale-up field experiment for the monitoring of a burning process using chemical, audio, and video sensors.

    PubMed

    Stavrakakis, P; Agapiou, A; Mikedi, K; Karma, S; Statheropoulos, M; Pallis, G C; Pappa, A

    2014-01-01

    Fires are becoming more violent and frequent, resulting in major economic losses and long-lasting effects on communities and ecosystems; thus, efficient fire monitoring is becoming a necessity. A novel triple multi-sensor approach was developed for monitoring and studying the burning of dry forest fuel in a scheduled open-field experiment; chemical, optical, and acoustical sensors were combined to record the fire spread. The results of this integrated field campaign for real-time monitoring of the fire event are presented and discussed. Chemical analysis, despite its limitations, corresponded to the burning process with a minor time delay. Nevertheless, the evolution profiles of CO2, CO, NO, and O2 were detected and monitored. The chemical monitoring of smoke components enabled observation of the different fire phases (flaming, smoldering) based on the emissions identified in each phase. The analysis of fire acoustical signals gave an accurate and timely response to the fire event. In the same context, the use of a thermographic camera for monitoring the biomass burning was also effective (both profiles of the intensities of the average gray and red components were greater than 230) and showed promise similar to the audio results. Further work is needed towards integrating sensor signals for automation purposes, leading to potential applications in real situations.

  10. AUTOMATED TECHNIQUE FOR FLOW MEASUREMENTS FROM MARIOTTE RESERVOIRS.

    USGS Publications Warehouse

    Constantz, Jim; Murphy, Fred

    1987-01-01

    The Mariotte reservoir supplies water at a constant hydraulic pressure by self-regulation of its internal gas pressure. Automated outflow measurements from Mariotte reservoirs are generally difficult because of this self-regulation mechanism. This paper describes an automated flow meter specifically designed for use with Mariotte reservoirs. The flow meter monitors changes in the Mariotte reservoir's gas pressure during outflow to determine changes in the reservoir's water level. The flow measurement is performed by attaching a pressure transducer to the top of a Mariotte reservoir and monitoring gas pressure changes during outflow with a programmable data logger. The advantages of the new automated flow measurement technique include: (i) the ability to rapidly record a large range of fluxes without restricting outflow, and (ii) the ability to accurately average the pulsing flow that commonly occurs during outflow from a Mariotte reservoir.
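
The pressure-to-level conversion described above follows from hydrostatics: a gas-pressure change of Δp corresponds to a water-level change of Δp/(ρg). A minimal sketch of that conversion, assuming SI units; the reservoir geometry, sample values, and function names are illustrative, not from the paper:

```python
# Sketch: infer outflow from a Mariotte reservoir using gas-pressure readings.
# As the water level drops during outflow, the internal gas pressure changes by
# rho*g*dh, so a logged pressure series maps directly to a water-level series.
# Geometry and sample values are illustrative assumptions, not from the paper.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def level_drop(delta_p_pa):
    """Water-level change (m) implied by a gas-pressure change (Pa)."""
    return delta_p_pa / (RHO * G)

def outflow_rates(pressures_pa, dt_s, area_m2):
    """Average volumetric outflow (m^3/s) between successive pressure samples
    taken dt_s seconds apart, for a reservoir of cross-section area_m2."""
    rates = []
    for p0, p1 in zip(pressures_pa, pressures_pa[1:]):
        dh = level_drop(p1 - p0)           # level change over the interval
        rates.append(area_m2 * dh / dt_s)  # volume change per unit time
    return rates
```

Averaging over the logger's sampling interval is what smooths the pulsing flow the abstract mentions.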

  11. Publicly Available Online Tool Facilitates Real-Time Monitoring Of Vaccine Conversations And Sentiments.

    PubMed

    Bahk, Chi Y; Cumming, Melissa; Paushter, Louisa; Madoff, Lawrence C; Thomson, Angus; Brownstein, John S

    2016-02-01

    Real-time monitoring of mainstream and social media can inform public health practitioners and policy makers about vaccine sentiment and hesitancy. We describe a publicly available platform for monitoring vaccination-related content, called the Vaccine Sentimeter. With automated data collection from 100,000 mainstream media sources and Twitter, natural-language processing for automated filtering, and manual curation to ensure accuracy, the Vaccine Sentimeter offers a global real-time view of vaccination conversations online. To assess the system's utility, we followed two events: polio vaccination in Pakistan after a news story about a Central Intelligence Agency vaccination ruse and subsequent attacks on health care workers, and a controversial episode in a television program about adverse events following human papillomavirus vaccination. For both events, increased online activity was detected and characterized. For the first event, Twitter response to the attacks on health care workers decreased drastically after the first attack, in contrast to mainstream media coverage. For the second event, the mainstream and social media response was largely positive about the HPV vaccine, but antivaccine conversations persisted longer than the provaccine reaction. Using the Vaccine Sentimeter could enable public health professionals to detect increased online activity or sudden shifts in sentiment that could affect vaccination uptake. Project HOPE—The People-to-People Health Foundation, Inc.

  12. Piezoelectric MEMS resonators for monitoring grape must fermentation

    NASA Astrophysics Data System (ADS)

    Toledo, J.; Jiménez-Márquez, F.; Úbeda, J.; Ruiz-Díez, V.; Pfusterschmied, G.; Schmid, U.; Sánchez-Rojas, J. L.

    2016-10-01

    The traditional procedure followed by winemakers for monitoring grape must fermentation is not automated, lacks sufficient accuracy, or has only been tested on discrete must samples. In order to contribute to the automation and improvement of the wine fermentation process, we have designed an AlN-based piezoelectric microresonator serving as a density sensor, excited in the 4th-order roof tile-shaped vibration mode. Furthermore, conditioning circuits were designed to convert the one-port impedance of the resonator into a resonant two-port transfer function. This allowed us to design a Phase Locked Loop-based oscillator circuit, implemented with a commercial lock-in amplifier, with an oscillation frequency determined by the vibrating mode. We were able to measure the fermentation kinetics by tracking both the resonance frequency and the quality factor of the microresonator. Moreover, the resonator was calibrated with an artificial model solution of grape must and then applied to the monitoring of real grape must fermentation. Our results demonstrate the high potential of MEMS resonators to detect the decrease in sugar and the increase in ethanol concentrations during grape must fermentation, with a resolution of 100 μg/ml and a sensitivity of 0.16 Hz/μg/ml as upper limits.
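
Given the sensitivity quoted above, a resonance-frequency shift maps linearly to a concentration change. A minimal sketch of that conversion, assuming the 0.16 Hz per μg/ml figure applies over the range of interest; the calibration reference and function names are illustrative:

```python
# Sketch: convert a resonance-frequency shift of a MEMS density sensor into a
# concentration change, using the sensitivity figure from the abstract
# (0.16 Hz per ug/ml). The reference frequency here is an illustrative
# calibration point, not a value from the paper.

SENSITIVITY_HZ_PER_UG_ML = 0.16  # upper-limit sensitivity from the abstract

def concentration_change(f_ref_hz, f_meas_hz):
    """Signed concentration change (ug/ml) implied by a shift from f_ref_hz.
    A drop in frequency (denser liquid) yields a positive change here."""
    return (f_ref_hz - f_meas_hz) / SENSITIVITY_HZ_PER_UG_ML
```

With the stated 100 μg/ml resolution, shifts smaller than 16 Hz would be below the sensor's resolving power.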

  13. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments

    PubMed Central

    Chung, Lisa M.; Colangelo, Christopher M.; Zhao, Hongyu

    2014-01-01

    Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially in data pre-processing, where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre-processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets. PMID:24905083
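
Two of the pre-processing steps named above, identification of inaccurate transitions and normalization, can be sketched as follows; the CV cutoff, data layout, and function names are illustrative assumptions, not the paper's actual procedures:

```python
# Sketch of two generic MRM pre-processing steps: flagging transitions whose
# intensities vary too much across replicate samples (high coefficient of
# variation, CV), and median-centering each sample. The cutoff and data
# layout are illustrative assumptions, not the paper's actual methods.
from statistics import mean, median, pstdev

def flag_noisy_transitions(intensity, cv_cutoff=0.2):
    """intensity: {transition_id: [replicate intensities]}.
    Return the set of transition ids whose CV exceeds cv_cutoff."""
    flagged = set()
    for tid, values in intensity.items():
        m = mean(values)
        if m == 0 or pstdev(values) / m > cv_cutoff:
            flagged.add(tid)
    return flagged

def median_normalize(sample):
    """Divide every intensity in one sample by that sample's median,
    removing sample-level scale differences before comparison."""
    m = median(sample.values())
    return {tid: v / m for tid, v in sample.items()}
```

In a real pipeline these checks would run per peptide and feed the outlier-detection and quality-assessment steps the abstract describes.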

  14. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments.

    PubMed

    Chung, Lisa M; Colangelo, Christopher M; Zhao, Hongyu

    2014-06-05

    Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially in data pre-processing, where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre-processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets.

  15. A Portable Surface Contamination Monitor Based on the Principle of Optically Stimulated Electron Emission (OSEE)

    NASA Technical Reports Server (NTRS)

    Perey, D. F.

    1996-01-01

    Many industrial and aerospace processes involving the joining of materials require sufficient surface cleanliness to ensure proper bonding. Processes as diverse as painting, welding, or the soldering of electronic circuits will be compromised if prior inspection and removal of surface contaminants are inadequate. As process requirements become more stringent and the number of different materials and identified contaminants increases, various instruments and techniques have been developed for improved inspection. One such technique, based on the principle of Optically Stimulated Electron Emission (OSEE), has been explored for a number of years as a tool for surface contamination monitoring. Among the benefits of OSEE are that it is non-contacting, requires little operator training, and has very high contamination sensitivity. This paper describes the development of a portable OSEE-based surface contamination monitor. The instrument is suitable for both hand-held and robotic inspections, with either manual or automated control of instrument operation. In addition, instrument output data are visually displayed to the operator and may be sent to an external computer for archiving or analysis.

  16. 2006 Pathogen and Toxin Concentration Systems for Water Monitoring

    DTIC Science & Technology

    2012-07-24

    design and construct a compact, portable automated device enabling the simultaneous concentration of protozoa, bacteria, bacterial spores, algae, and viruses from large volumes of various...

  17. The Automated Logistics Element Planning System (ALEPS)

    NASA Technical Reports Server (NTRS)

    Schwaab, Douglas G.

    1991-01-01

    The design and functions of ALEPS (Automated Logistics Element Planning System), a computer system that will automate planning and decision support for Space Station Freedom Logistical Elements (LEs) resupply and return operations, are described. ALEPS provides data management, planning, analysis, monitoring, interfacing, and flight certification in support of LE flight load planning activities. The prototype ALEPS algorithm development is also described.

  18. Monitoring-Based Model for Personalizing the Clinical Process of Crohn’s Disease

    PubMed Central

    de Ramón-Fernández, Alberto; Ruiz-Fernández, Daniel; Vives-Boix, Víctor

    2017-01-01

    Crohn’s disease is a chronic pathology belonging to the group of inflammatory bowel diseases. Patients suffering from Crohn’s disease must be supervised by a medical specialist for the rest of their lives; furthermore, each patient has his or her own characteristics and is affected by the disease in a different way, so health recommendations and treatments cannot be generalized and should be individualized for a specific patient. To achieve this personalization in a cost-effective way using technology, we propose a model based on different information flows: control, personalization, and monitoring. As a result of the model, and to perform a functional validation, an architecture based on services and a prototype of the system have been defined. In this prototype, a set of different devices and technologies to monitor variables from patients and their environment has been integrated. Artificial intelligence algorithms are also included to reduce the workload related to the review and analysis of the information gathered. Due to the continuous and automated monitoring of the Crohn’s patient, this proposal can help in the personalization of the Crohn’s disease clinical process. PMID:28678162

  19. Monitoring WLCG with lambda-architecture: a new scalable data store and analytics platform for monitoring at petabyte scale.

    NASA Astrophysics Data System (ADS)

    Magnoni, L.; Suthakar, U.; Cordeiro, C.; Georgiou, M.; Andreeva, J.; Khan, A.; Smith, D. R.

    2015-12-01

    Monitoring the WLCG infrastructure requires the gathering and analysis of a high volume of heterogeneous data (e.g. data transfers, job monitoring, site tests) coming from different services and experiment-specific frameworks, to provide a uniform and flexible interface for scientists and sites. The current architecture, in which relational database systems are used to store, process, and serve monitoring data, has limitations in coping with the foreseen increase in the volume (e.g. higher LHC luminosity) and the variety (e.g. new data-transfer protocols and new resource types, such as cloud computing) of WLCG monitoring events. This paper presents a new scalable data store and analytics platform designed by the Support for Distributed Computing (SDC) group at the CERN IT department, which uses a variety of technologies, each targeting specific aspects of large-scale distributed data processing (commonly referred to as the lambda-architecture approach). Results of data processing on Hadoop for WLCG data activity monitoring are presented, showing how the new architecture can easily analyze hundreds of millions of transfer logs in a few minutes. Moreover, a comparison of data partitioning, compression and file formats (e.g. CSV, Avro) is presented, with particular attention given to how the file structure impacts the overall MapReduce performance. In conclusion, the evolution of the current implementation, which focuses on data storage and batch processing, towards a complete lambda architecture is discussed, with consideration of candidate technology for the serving layer (e.g. Elasticsearch) and a description of a proof-of-concept implementation, based on Apache Spark and Esper, for the real-time part, which compensates for batch-processing latency and automates the detection of problems and failures.
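
The batch-layer idea, recomputing aggregates over the full log history, can be illustrated with a small map/reduce-style sketch; the record fields and names are illustrative assumptions, and the real WLCG log schema is far richer:

```python
# Sketch of a lambda-architecture batch-layer computation: a map/reduce-style
# aggregation of transfer logs, of the kind the paper runs on Hadoop.
# The record fields (src, dst, bytes) are illustrative assumptions.
from collections import defaultdict

def aggregate_transfers(logs):
    """Sum bytes and count transfers per (source, destination) site pair."""
    totals = defaultdict(lambda: {"bytes": 0, "count": 0})
    for rec in logs:                        # "map": emit one key per record
        key = (rec["src"], rec["dst"])
        totals[key]["bytes"] += rec["bytes"]   # "reduce": merge by key
        totals[key]["count"] += 1
    return dict(totals)
```

In the lambda approach, a speed layer (here, the Spark/Esper proof of concept) would maintain the same aggregates incrementally for recent data while the batch layer periodically recomputes them from scratch.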

  20. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-30

    ...information that is returned from the tools to the human user, and the forms in which these outputs are presented... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM, INTRODUCTION: This document and the Automated Software Tool Monitoring Program (Appendix 1) are... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types...

  1. Controlling Air Traffic (Simulated) in the Presence of Automation (CATS PAu) 1995: A Study of Measurement Techniques for Situation Awareness in Air Traffic Control

    NASA Technical Reports Server (NTRS)

    French, Jennifer R.

    1995-01-01

    As automated systems proliferate in aviation systems, human operators are taking on less and less of an active role in the jobs they once performed, often reducing what should be important jobs to tasks barely more complex than monitoring machines. When operators are forced into these roles, they risk slipping into hazardous states of awareness, which can lead to reduced skills, lack of vigilance, and the inability to react quickly and competently when there is a machine failure. Using Air Traffic Control (ATC) as a model, the present study developed tools for conducting tests focusing on levels of automation as they relate to situation awareness. Subjects participated in a two-and-a-half hour experiment that consisted of a training period followed by a simulation of air traffic control similar to the system presently used by the FAA, then an additional simulation employing automated assistance. Through an iterative design process utilizing numerous revisions and three experimental sessions, several measures for situational awareness in a simulated Air Traffic Control System were developed and are prepared for use in future experiments.

  2. Automated data collection equipment for monitoring highway condition.

    DOT National Transportation Integrated Search

    2005-06-01

    This study was conducted to evaluate automated vehicle-mounted equipment to collect data on the needs of Oregon's highway inventory. Four vendors accepted invitations to evaluate their equipment. Although ODOT had conducted a similar evaluation...

  3. Automated validation of a computer operating system

    NASA Technical Reports Server (NTRS)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.

  4. The feasibility of automated online flow cytometry for in-situ monitoring of microbial dynamics in aquatic ecosystems

    PubMed Central

    Besmer, Michael D.; Weissbrodt, David G.; Kratochvil, Bradley E.; Sigrist, Jürg A.; Weyland, Mathias S.; Hammes, Frederik

    2014-01-01

    Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high-throughput and fully automated sampling, staining, measurement, and data analysis. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As a proof of concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12–14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mischaracterized) by infrequent sampling. In addition, the online FCM data from the river water were combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough toward the eventual establishment of fully automated online microbiological monitoring technologies. PMID:24917858
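
One simple way to recognize "specific events" against baseline data in such a high-frequency series is to compare each measurement to a trailing median; this is a minimal sketch with window size and threshold as illustrative assumptions, not the study's method:

```python
# Sketch: flag events in a high-frequency cell-count time series (e.g. one
# point per 15-min automated FCM measurement) by comparing each point to a
# trailing median baseline. Window and threshold are illustrative assumptions.
from statistics import median

def detect_events(counts, window=8, threshold=0.5):
    """Return indices where a count deviates from the trailing median of the
    previous `window` points by more than `threshold` (fraction of baseline)."""
    events = []
    for i in range(window, len(counts)):
        baseline = median(counts[i - window:i])
        if baseline > 0 and abs(counts[i] - baseline) / baseline > threshold:
            events.append(i)
    return events
```

A trailing median tolerates the single-point outliers common in automated instruments, while sustained shifts (e.g. after a rainfall event) would flag several consecutive points.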

  5. Development and improvement of the operating diagnostics systems of NPO CKTI works for turbine of thermal and nuclear power plants

    NASA Astrophysics Data System (ADS)

    Kovalev, I. A.; Rakovskii, V. G.; Isakov, N. Yu.; Sandovskii, A. V.

    2016-03-01

    Results of work on the development and improvement of the techniques, algorithms, and software-hardware of continuous operating diagnostics systems for rotating units and parts of turbine equipment are presented. In particular, to enable fully remote service of monitored turbine equipment using web technologies, a web version of the software of the automated system of vibration-based diagnostics (ASVD VIDAS) was developed. Experience with automated analysis of data obtained by ASVD VIDAS formed the basis of a new algorithm for early detection of such dangerous defects as rotor deflection, cracks in the rotor, and strong misalignment of supports. The program-technical complex (PTC) for monitoring and measuring the deflection of the medium-pressure rotor, which implements this algorithm, will alert power plant staff to a deflection and indicate its value, giving the opportunity to take timely measures to prevent further extension of the defect. Repeatedly recorded cases of full or partial destruction of the shrouded shelves of rotor blades of the last stages of low-pressure cylinders of steam turbines motivated the development of a version of the automated system of blade diagnostics (ASBD SKALA) for shrouded stages. The processing, analysis, presentation, and backup of data characterizing the mechanical state of the blading are carried out by a newly developed controller of the diagnostics system. As a result of this work, the set of diagnosed parameters determining the operational security of rotating elements of the equipment was expanded, and new tasks of monitoring the state of turbine units and parts were solved. All algorithmic solutions and hardware-software implementations mentioned in the article were tested on test benches and applied at several power plants.

  6. Principles of control automation of soil compacting machine operating mechanism

    NASA Astrophysics Data System (ADS)

    Tikhonov, Anatoly Fedorovich; Drozdov, Anatoly

    2018-03-01

    The relevance of high-quality compaction of soil bases in the erection of embankments and foundations in building and structure construction is discussed. The quality of the compacted gravel and sandy soils provides the bearing capability and, accordingly, the strength and durability of constructed buildings. It has been established that compaction quality depends on many external factors, such as surface roughness and soil moisture, and on the granulometry, chemical composition, and degree of elasticity of the original fill soil. Analysis of technological processes for soil base compaction in foreign and domestic information sources showed that such an important problem as continuous monitoring of the actual degree of soil compaction during machine operation can be solved only with modern means of automation. An effective vibrodynamic method of compacting gravel and sand material for building structure foundations for various applications is justified and proposed. A method of continuous monitoring of soil compaction by measuring the amplitudes and frequencies of harmonic oscillations on the compacted surface is described, which allowed the basic elements of the monitoring system for the soil compacting machine's operating and other mechanisms to be determined: an accelerometer, a bandpass filter, a vibro-harmonics unit, and an on-board microcontroller. Adjustable parameters have been established to improve the degree of soil compaction and the performance of the soil compacting machine, and the dependence of the adjustable parameters on the overall index, the degree of soil compaction, has been experimentally determined. A structural scheme of automatic control of the soil compacting machine's operating mechanism and its operation algorithm have been developed.

  7. Localized, macromolecular transport for thin, adherent, single cells via an automated, single cell electroporation biomanipulator.

    PubMed

    Sakaki, Kelly; Esmaeilsabzali, Hadi; Massah, Shabnam; Prefontaine, Gratien G; Dechev, Nikolai; Burke, Robert D; Park, Edward J

    2013-11-01

    Single cell electroporation (SCE), via microcapillary, is an effective method for molecular, transmembrane transport used to gain insight on cell processes with minimal preparation. Although possessing great potential, SCE is difficult to execute and the technology spans broad fields within cell biology and engineering. The technical complexities, the focus and expertise demanded during manual operation, and the lack of an automated SCE platform limit the widespread use of this technique, thus the potential of SCE has not been realized. In this study, an automated biomanipulator for SCE is presented. Our system is capable of delivering molecules into the cytoplasm of extremely thin cellular features of adherent cells. The intent of the system is to abstract the technical challenges and exploit the accuracy and repeatability of automated instrumentation, leaving only the focus of the experimental design to the operator. Each sequence of SCE including cell and SCE site localization, tip-membrane contact detection, and SCE has been automated. Positions of low-contrast cells are localized and "SCE sites" for microcapillary tip placement are determined using machine vision. In addition, new milestones within automated cell manipulation have been achieved. The system described herein has the capability of automated SCE of "thin" cell features less than 10 μm in thickness. Finally, SCE events are anticipated using visual feedback, while monitoring fluorescing dye entering the cytoplasm of a cell. The execution is demonstrated by inserting a combination of a fluorescing dye and a reporter gene into NIH/3T3 fibroblast cells.

  8. Standardized Automated CO2/H2O Flux Systems for Individual Research Groups and Flux Networks

    NASA Astrophysics Data System (ADS)

    Burba, George; Begashaw, Israel; Fratini, Gerardo; Griessbaum, Frank; Kathilankal, James; Xu, Liukang; Franz, Daniela; Joseph, Everette; Larmanou, Eric; Miller, Scott; Papale, Dario; Sabbatini, Simone; Sachs, Torsten; Sakai, Ricardo; McDermitt, Dayle

    2017-04-01

    In recent years, spatial and temporal flux data coverage has improved significantly on multiple scales, from single stations to continental networks, due to standardization, automation, and management of data collection, and to better handling of the extensive amounts of generated data. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are required to effectively and efficiently handle the entire process. Such tools are needed to maximize time dedicated to authoring publications and answering research questions, and to minimize time and expenses spent on data acquisition, processing, and quality control. Thus, these tools should produce standardized verifiable datasets and provide a way to cross-share the standardized data with external collaborators to leverage available funding and promote data analyses and publications. LI-COR gas analyzers are widely used in past and present flux networks such as AmeriFlux, ICOS, AsiaFlux, OzFlux, NEON, CarboEurope, and FluxNet-Canada. These analyzers have gone through several major improvements over the past 30 years. However, in 2016, a three-prong development was completed to create an automated flux system which can accept multiple sonic anemometer and datalogger models, compute final and complete fluxes on-site, merge final fluxes with supporting weather, soil, and radiation data, monitor station outputs and send automated alerts to researchers, and allow secure sharing and cross-sharing of station and data access. Two types of these research systems were developed: open-path (LI-7500RS) and enclosed-path (LI-7200RS).
Key developments included:
• Improvement of gas analyzer performance
• Standardization and automation of final flux calculations on-site, and in real time
• Seamless integration with the latest site management and data-sharing tools
In terms of gas analyzer performance, the RS analyzers are based on the established LI-7500/A and LI-7200 models, and the improvements focused on increased stability in the presence of contamination, refined temperature control and compensation, and more accurate fast gas concentration measurements. In terms of the flux calculations, improvements focused on automating the on-site flux calculations using EddyPro® software run by a weatherized, fully digital microcomputer, SmartFlux2. In terms of site management and data sharing, the development focused on web-based software, FluxSuite, which allows real-time station monitoring and data access by multiple users. The presentation will describe details of the key developments and will include results from field tests of the RS gas analyzer models in comparison with older models and control reference instruments.

  9. Managing Multi-center Flow Cytometry Data for Immune Monitoring

    PubMed Central

    White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn

    2014-01-01

    With the recent results of promising cancer vaccines and immunotherapy [1-5], immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning and information visualization [21-23], as these methods directly address the subjectivity, operator dependence, labor intensity and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in processing, analysis and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API). 
In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges. PMID:26085786

  10. Bayesian Inference for Signal-Based Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.

    2015-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http

  11. Real-time ECG monitoring and arrhythmia detection using Android-based mobile devices.

    PubMed

    Gradl, Stefan; Kugler, Patrick; Lohmuller, Clemens; Eskofier, Bjoern

    2012-01-01

    We developed an application for Android™-based mobile devices that allows real-time electrocardiogram (ECG) monitoring and automated arrhythmia detection by analyzing ECG parameters. ECG data provided by pre-recorded files or acquired live by accessing a Shimmer™ sensor node via Bluetooth™ can be processed and evaluated. The application is based on the Pan-Tompkins algorithm for QRS-detection and contains further algorithm blocks to detect abnormal heartbeats. The algorithm was validated using the MIT-BIH Arrhythmia and MIT-BIH Supraventricular Arrhythmia databases. More than 99% of all QRS complexes were detected correctly by the algorithm. Overall sensitivity for abnormal beat detection was 89.5% with a specificity of 80.6%. The application is available for download and may be used for real-time ECG-monitoring on mobile devices.
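The Pan-Tompkins stages named above (derivative, squaring, moving-window integration, thresholding) can be sketched as follows; the filter lengths, sampling rate, and fixed-fraction threshold are illustrative assumptions, not the parameters used by the application described in the abstract.

```python
# Minimal sketch of the Pan-Tompkins QRS-detection pipeline.
def detect_qrs(ecg, fs=360, window_ms=150):
    # 1. Derivative stage: emphasize the steep slopes of the QRS complex.
    deriv = [ecg[i + 1] - ecg[i - 1] for i in range(1, len(ecg) - 1)]
    # 2. Squaring stage: make all values positive and amplify large slopes.
    squared = [d * d for d in deriv]
    # 3. Moving-window integration: smooth into one bump per beat.
    n = max(1, int(fs * window_ms / 1000))
    integ = [sum(squared[max(0, i - n + 1):i + 1]) / n
             for i in range(len(squared))]
    # 4. Fixed-fraction threshold plus a 200 ms refractory period.
    thresh = 0.5 * max(integ)
    refractory = int(0.2 * fs)
    peaks, last = [], -refractory
    for i, v in enumerate(integ):
        if v > thresh and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks
```

A production detector (as validated against the MIT-BIH databases) adds bandpass filtering and adaptive, noise-tracking thresholds on top of these stages.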

  12. Automated electrohysterographic detection of uterine contractions for monitoring of pregnancy: feasibility and prospects.

    PubMed

    Muszynski, C; Happillon, T; Azudin, K; Tylcz, J-B; Istrate, D; Marque, C

    2018-05-08

Preterm birth is a major public health problem in developed countries. In this context, we have conducted research into outpatient monitoring of uterine electrical activity in women at risk of preterm delivery. The objective of this preliminary study was to perform automated detection of uterine contractions (without human intervention or a tocographic signal, TOCO) by processing the electrohysterogram (EHG) recorded on the abdomen of pregnant women. The feasibility and accuracy of uterine contraction detection based on EHG processing were tested and compared to expert decisions based on external tocodynamometry (TOCO). The study protocol was approved by local Ethics Committees under numbers ID-RCB 2016-A00663-48 for France and VSN 02-0006-V2 for Iceland. Two populations of women were included (threatened preterm birth and labour) in order to test our system of recognition of the various types of uterine contractions. EHG signal acquisition was performed according to a standardized protocol to ensure optimal reproducibility of EHG recordings. A system of 18 Ag/AgCl surface electrodes was used, with 16 recording electrodes placed between the woman's pubis and umbilicus in a 4 × 4 matrix. TOCO was recorded simultaneously with the EHG. EHG signals were analysed in real time by calculation of the nonlinear correlation coefficient H2. A curve representing the number of correlated pairs of signals according to the value of H2 calculated between bipolar signals was then plotted. High values of H2 indicated the presence of an event that may correspond to a contraction. Two tests were performed after detection of an event (fusion and elimination of certain events) in order to increase the contraction detection rate. The EHG database contained 51 recordings from pregnant women, with a total of 501 contractions previously labelled by analysis of the corresponding tocographic recording. 
The recognition rate obtained by applying the method based on coefficient H2 was 100%, with a false alarm rate of 782%. Adding the fusion and elimination tests to the previously obtained detections divided the false alarm rate by 8.5, while maintaining an excellent detection rate (96%). These preliminary results are encouraging for monitoring of uterine contractions by algorithm-based automated detection applied to the electrohysterographic signal (EHG). This compact recording system, based on surface electrodes attached to the skin, appears to be particularly suitable for outpatient monitoring of uterine contractions, possibly at home, allowing telemonitoring of pregnancies. One of the advantages of EHG processing is that useful information concerning contraction efficiency can be extracted from this signal, which is not possible with the TOCO signal.
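The post-detection steps described above can be sketched as follows: events are contiguous runs of analysis windows in which enough bipolar-signal pairs exceed the H2 correlation threshold; nearby events are then fused and very short ones eliminated to suppress false alarms. The thresholds and window semantics here are assumptions for illustration, not the study's parameters.

```python
# Illustrative detection + fusion + elimination over per-window counts of
# highly correlated signal pairs.
def detect_events(pair_counts, min_pairs=3, fuse_gap=2, min_len=2):
    # 1. Raw detection: contiguous runs with enough correlated pairs.
    events, start = [], None
    for i, c in enumerate(pair_counts + [0]):  # sentinel closes a trailing run
        if c >= min_pairs and start is None:
            start = i
        elif c < min_pairs and start is not None:
            events.append((start, i - 1))
            start = None
    # 2. Fusion: merge events separated by a short gap.
    fused = []
    for ev in events:
        if fused and ev[0] - fused[-1][1] - 1 <= fuse_gap:
            fused[-1] = (fused[-1][0], ev[1])
        else:
            fused.append(ev)
    # 3. Elimination: drop events too short to be contractions.
    return [ev for ev in fused if ev[1] - ev[0] + 1 >= min_len]
```

Fusion prevents one contraction from being counted as several detections, and elimination removes isolated spurious windows, which is how such a scheme can cut the false alarm rate while keeping the detection rate high.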

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, L. A.; Chizari, S.; Panas, R. M.

The aim of this research is to demonstrate a holographically driven photopolymerization process for joining colloidal particles to create planar microstructures fixed to a substrate, which can be monitored with real-time measurement. Holographic optical tweezers (HOT) have been used to arrange arrays of microparticles prior to this work; here we introduce a new photopolymerization process for rapidly joining simultaneously handled microspheres in a plane. Additionally, we demonstrate a new process control technique for efficiently identifying when particles have been successfully joined by measuring a sufficient reduction in the particles’ Brownian motion. Furthermore, this technique and our demonstrated joining approach enable HOT technology to take critical steps toward automated additive fabrication of microstructures.
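The process-control idea above can be sketched simply: a free, optically trapped particle's tracked position fluctuates from Brownian motion, and once it is polymerized to its neighbors the fluctuation variance drops. The baseline window length and the reduction factor below are illustrative assumptions, not values from the paper.

```python
# Detect joining as a sustained drop in positional variance relative to the
# free-particle baseline.
def joined_at(positions, baseline_n=50, window=20, factor=0.1):
    """Return the first frame index where positional variance falls below
    `factor` times the free-particle baseline, or None if it never does."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    baseline = var(positions[:baseline_n])
    for i in range(baseline_n, len(positions) - window + 1):
        if var(positions[i:i + window]) < factor * baseline:
            return i
    return None
```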

  14. Protocols for the Investigation of Information Processing in Human Assessment of Fundamental Movement Skills.

    PubMed

    Ward, Brodie J; Thornton, Ashleigh; Lay, Brendan; Rosenberg, Michael

    2017-01-01

    Fundamental movement skill (FMS) assessment remains an important tool in classifying individuals' level of FMS proficiency. The collection of FMS performances for assessment and monitoring has remained unchanged over the last few decades, but new motion capture technologies offer opportunities to automate this process. To achieve this, a greater understanding of the human process of movement skill assessment is required. The authors present the rationale and protocols of a project in which they aim to investigate the visual search patterns and information extraction employed by human assessors during FMS assessment, as well as the implementation of the Kinect system for FMS capture.

  15. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore, exploit behavioral variations among/within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.

  16. Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.

    PubMed

    Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher

    2011-01-01

    Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
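The linear and exponential feed profiles named above can be written as simple setpoint formulas, F(t) = F0 + k·t and F(t) = F0·exp(μ·t) respectively. The rate constants below are hypothetical placeholders; the actual PF-system setpoints are not given in the abstract.

```python
import math

# Illustrative feed-rate setpoints for a methanol induction phase.
def linear_feed(t_h, f0=2.0, slope=0.5):
    """Feed rate (mL/h) rising linearly from f0 by `slope` per hour."""
    return f0 + slope * t_h

def exponential_feed(t_h, f0=2.0, mu=0.05):
    """Feed rate (mL/h) growing exponentially, tracking a specific growth rate mu."""
    return f0 * math.exp(mu * t_h)

def exponential_total(t_h, f0=2.0, mu=0.05):
    """Total volume (mL) delivered by the exponential profile up to t_h,
    i.e. the closed-form integral (f0/mu)*(exp(mu*t) - 1)."""
    return f0 / mu * (math.exp(mu * t_h) - 1.0)
```

Integrating the setpoint profile, as in `exponential_total`, is how a controller can verify that the cumulative volume fed stays within a tolerance (the abstract reports within 1%) of the target.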

  17. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, and decisions and implementation, and conclusions pertaining to this experience are given. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  18. Human-rating Automated and Robotic Systems - (How HAL Can Work Safely with Astronauts)

    NASA Technical Reports Server (NTRS)

    Baroff, Lynn; Dischinger, Charlie; Fitts, David

    2009-01-01

Long duration human space missions, as planned in the Vision for Space Exploration, will not be possible without applying unprecedented levels of automation to support the human endeavors. The automated and robotic systems must carry the load of routine housekeeping for the new generation of explorers, as well as assist their exploration science and engineering work with new precision. Fortunately, the state of automated and robotic systems is sophisticated and sturdy enough to do this work - but the systems themselves have never been human-rated as all other NASA physical systems used in human space flight have. Our intent in this paper is to provide perspective on requirements and architecture for the interfaces and interactions between human beings and the astonishing array of automated systems, and on the approach we believe necessary to create human-rated systems and implement them in the space program. We will explain our proposed standard structure for automation and robotic systems, and the process by which we will develop and implement that standard as an addition to NASA's Human Rating requirements. Our work here is based on real experience with both human system and robotic system designs; for surface operations as well as for in-flight monitoring and control; and on the necessities we have discovered for human-systems integration in NASA's Constellation program. We hope this will be an invitation to dialog and to consideration of a new issue facing new generations of explorers and their outfitters.

  19. Ship-borne measurements of microbial enzymatic activity: A rapid biochemical indicator for microbial water quality monitoring

    NASA Astrophysics Data System (ADS)

    Stadler, Philipp; Loken, Luke; Crawford, John; Schramm, Paul; Sorsa, Kirsti; Kuhn, Catherine; Savio, Domenico; Striegl, Rob; Butman, David; Stanley, Emily; Farnleitner, Andreas H.; Zessner, Matthias

    2017-04-01

Contamination of aquatic ecosystems by human and animal wastes is a global concern for water quality. Disclosing the fate and transport processes of fecal indicator organisms (FIOs) in large water bodies is a major challenge due to the material-intensive and time-consuming methods used in microbiological water quality monitoring. With respect to the utilization of large surface water resources, there is a dearth of rapid microbiological methods that allow near-real-time, health-related water quality monitoring to be integrated into early warning systems. The detection of enzymatic activities has been proposed as a rapid surrogate for microbiological pollution monitoring of water and water resources (Cabral, 2010; Farnleitner et al., 2001, 2002). Methods such as the beta-D-Glucuronidase assay (GLUC), targeting FIOs such as E. coli, were established. New automated enzymatic assays have been implemented in recent years into on-site monitoring stations, ranging from ground- to surface waters (Ryzinska-Paier et al., 2014; Stadler et al., 2017, 2016). While these automated enzymatic methods cannot completely replace assays for culture-based FIO enumeration, they yielded significant information on pollution events and temporal dynamics on a catchment-specific basis, but were restricted to stationary measurements. For the first time, we conducted ship-borne, automated measurements of enzymatic GLUC activity on large fresh water bodies, including the Columbia River, the Mississippi River and Lake Mendota. Not only are automated enzymatic assays technically feasible from a mobile vessel, but they can also be used to localize point sources of potential microbial fecal contamination, such as tributaries or storm drainages. Spatial and temporal patterns of enzymatic activity were disclosed, and the habitat-specific correlation with standard microbiological assays for FIOs was determined using reference samples. 
The integration of rapid and automated enzymatic assays into well-established systems for ship-borne measurements of physico-chemical parameters, such as the FLAMe (Crawford et al., 2015), paves new ground for data interpretation and process understanding. Cabral, J.P.S., 2010. Water Microbiology. Bacterial Pathogens and Water. Int. J. Environ. Res. Public. Health 7, 3657-3703. doi:10.3390/ijerph7103657 Crawford, J.T., Loken, L.C., Casson, N.J., Smith, C., Stone, A.G., Winslow, L.A., 2015. High-speed limnology: using advanced sensors to investigate spatial variability in biogeochemistry and hydrology. Environ. Sci. Technol. 49, 442-450. doi:10.1021/es504773x Farnleitner, A.H., Hocke, L., Beiwl, C., Kavka, G.C., Zechmeister, T., Kirschner, A.K.T., Mach, R.L., 2001. Rapid enzymatic detection of Escherichia coli contamination in polluted river water. Lett. Appl. Microbiol. 33, 246-250. doi:10.1046/j.1472-765x.2001.00990.x Farnleitner, A.H., Hocke, L., Beiwl, C., Kavka, G.G., Mach, R.L., 2002. Hydrolysis of 4-methylumbelliferyl-β-d-glucuronide in differing sample fractions of river waters and its implication for the detection of fecal pollution. Water Res. 36, 975-981. doi:10.1016/S0043-1354(01)00288-3 Ryzinska-Paier, G., Lendenfeld, T., Correa, K., Stadler, P., Blaschke, A.P., Mach, R.L., Stadler, H., Kirschner, A.K.T., Farnleitner, A.H., 2014. A sensitive and robust method for automated on-line monitoring of enzymatic activities in water and water resources. Water Sci. Technol. J. Int. Assoc. Water Pollut. Res. 69, 1349-1358. doi:10.2166/wst.2014.032 Stadler, P., Blöschl, G., Vogl, W., Koschelnik, J., Epp, M., Lackner, M., Oismüller, M., Kumpan, M., Nemeth, L., Strauss, P., Sommer, R., Ryzinska-Paier, G., Farnleitner, A.H., Zessner, M., 2016. Real-time monitoring of beta-d-glucuronidase activity in sediment laden streams: A comparison of prototypes. Water Res. 101, 252-261. doi:10.1016/j.watres.2016.05.072 Stadler, P., Farnleitner, A.H., Zessner, M., 2017. Development and evaluation of a self-cleaning custom-built auto sampler controlled by a low-cost RaspberryPi microcomputer for online enzymatic activity measurements. Talanta 162, 390-397. doi:10.1016/j.talanta.2016.10.031

  20. An Advanced NSSS Integrity Monitoring System for Shin-Kori Nuclear Units 3 and 4

    NASA Astrophysics Data System (ADS)

    Oh, Yang Gyun; Galin, Scott R.; Lee, Sang Jeong

    2010-12-01

The advanced design features of the NSSS (Nuclear Steam Supply System) Integrity Monitoring System for Shin-Kori Nuclear Units 3 and 4 are summarized herein. During the overall system design and detailed component design processes, many design improvements have been made to the system. The major design changes are: 1) the application of a common software platform for all subsystems, 2) the implementation of remote access, control and monitoring capabilities, and 3) the equipment redesign and rearrangement that has simplified the system architecture. These changes affect cabinet size, the number of cables, cyber-security, graphical user interfaces, and interfaces with other monitoring systems. The system installation and operation for Shin-Kori Nuclear Units 3 and 4 will be more convenient than those for previous Korean nuclear units in view of its remote control capability, automated test functions, improved user interface functions, and greatly reduced cabling.

  1. Attention focusing and anomaly detection in systems monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. The focus of this paper is a new technique for attention focusing. The technique involves reasoning about the distance between two frequency distributions, and is used to detect both anomalous system parameters and 'broken' causal dependencies. These two forms of information together isolate the locus of anomalous behavior in the system being monitored.
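The core idea above, reasoning about the distance between two frequency distributions of sensor readings, can be sketched with histograms and a distribution distance. The binning and the Hellinger distance below are illustrative choices, not the paper's actual measure.

```python
import math

# Build a normalized histogram of sensor readings over [lo, hi).
def histogram(values, bins, lo, hi):
    counts = [0] * bins
    for v in values:
        i = min(bins - 1, max(0, int((v - lo) / (hi - lo) * bins)))
        counts[i] += 1
    total = float(len(values))
    return [c / total for c in counts]

# Hellinger distance between two discrete distributions (0 = identical, 1 = disjoint).
def hellinger(p, q):
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

# Flag a parameter as anomalous when its recent readings drift far from nominal.
def anomalous(nominal, observed, bins=10, lo=0.0, hi=1.0, thresh=0.3):
    p = histogram(nominal, bins, lo, hi)
    q = histogram(observed, bins, lo, hi)
    return hellinger(p, q) > thresh
```

Running such a comparison per sensor, and per causally linked sensor pair, is one way a monitor could isolate which parameters (or dependencies) are behaving anomalously rather than flooding the operator with redundant alarms.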

  2. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    PubMed

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and effectively used it for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real time processing the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The ability to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  3. Lightweight fuzzy processes in clinical computing.

    PubMed

    Hurdle, J F

    1997-09-01

In spite of advances in computing hardware, many hospitals still have a hard time finding extra capacity in their production clinical information system to run artificial intelligence (AI) modules, for example: to support real-time drug-drug or drug-lab interactions; to track infection trends; to monitor compliance with case-specific clinical guidelines; or to monitor/control biomedical devices like an intelligent ventilator. Historically, adding AI functionality was not a major design concern when a typical clinical system was originally specified. AI technology is usually retrofitted 'on top of the old system' or 'run off line' in tandem with the old system to ensure that the routine work load still gets done (with as little impact from the AI side as possible). To compound the burden on system performance, most institutions have witnessed a long and increasing trend for intramural and extramural reporting (e.g. the collection of data for a quality-control report in microbiology, or a meta-analysis of a suite of coronary artery bypass graft techniques), and these place an ever-growing burden on a typical computer system's performance. We discuss a promising approach to adding extra AI processing power to a heavily-used system based on the notion of 'lightweight fuzzy processing' (LFP), that is, fuzzy modules designed from the outset to impose a small computational load. A formal model for a useful subclass of fuzzy systems is defined below and is used as a framework for the automated generation of LFPs. By seeking to reduce the arithmetic complexity of the model (a hand-crafted process) and the data complexity of the model (an automated process), we show how LFPs can be generated for three sample datasets of clinical relevance.
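A flavor of what a low-cost fuzzy module might look like: triangular membership functions evaluated with a handful of arithmetic operations, combined with `min` as the fuzzy AND. The clinical rule, variable names, and breakpoints below are hypothetical illustrations, not the paper's formal LFP model.

```python
# Triangular membership: 0 at a and c, 1 at b; two comparisons and one divide.
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical drug-lab rule: risk is high when dose is high AND renal
# function is impaired (creatinine elevated). Result is a fuzzy degree in [0, 1].
def interaction_risk(dose_mg, creatinine_mg_dl):
    high_dose = tri(dose_mg, 200.0, 400.0, 600.0)
    impaired = tri(creatinine_mg_dl, 1.2, 2.5, 4.0)
    return min(high_dose, impaired)  # fuzzy AND via min: a single comparison
```

The appeal for a loaded clinical system is that each evaluation costs only a few comparisons and divisions, so hundreds of such rules can run inline without noticeably taxing the production workload.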

  4. Modelling and representation issues in automated feature extraction from aerial and satellite images

    NASA Astrophysics Data System (ADS)

    Sowmya, Arcot; Trinder, John

    New digital systems for the processing of photogrammetric and remote sensing images have led to new approaches to information extraction for mapping and Geographic Information System (GIS) applications, with the expectation that data can become more readily available at a lower cost and with greater currency. Demands for mapping and GIS data are increasing as well for environmental assessment and monitoring. Hence, researchers from the fields of photogrammetry and remote sensing, as well as computer vision and artificial intelligence, are bringing together their particular skills for automating these tasks of information extraction. The paper will review some of the approaches used in knowledge representation and modelling for machine vision, and give examples of their applications in research for image understanding of aerial and satellite imagery.

  5. Automation of extrusion of porous cable products based on a digital controller

    NASA Astrophysics Data System (ADS)

    Chostkovskii, B. K.; Mitroshin, V. N.

    2017-07-01

This paper presents a new approach to designing an automated system for monitoring and controlling the process of applying porous insulation material to a conductive cable core, based on structurally and parametrically optimized digital controllers of arbitrary order instead of typical PID controllers calculated by known methods. The digital controller is clocked by signals from the length sensor of a measuring wheel, instead of a timer signal, which makes the system robust to changes in the insulation line speed. Digital controller parameters are tuned, using a simulation model of the stochastic extrusion process, to meet the operating parameters of the manufactured cable; the tuning criterion is minimized by moving a regular simplex in the parameter space of the controller.
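An arbitrary-order digital controller of the kind described above can be sketched as a difference equation updated once per length tick from the measuring wheel (rather than per unit time, which is what decouples the loop from line speed). The coefficients below are placeholders, not tuned values from the paper.

```python
# Difference-equation controller:
#   u[k] = b0*e[k] + b1*e[k-1] + ... - a1*u[k-1] - a2*u[k-2] - ...
# clocked by length ticks, so one update corresponds to a fixed length of cable.
class DigitalController:
    def __init__(self, b, a):
        self.b, self.a = b, a
        self.e_hist = [0.0] * len(b)  # e[k], e[k-1], ...
        self.u_hist = [0.0] * len(a)  # u[k-1], u[k-2], ...

    def on_length_tick(self, setpoint, measured):
        error = setpoint - measured
        self.e_hist = [error] + self.e_hist[:-1]
        u = sum(bi * ei for bi, ei in zip(self.b, self.e_hist))
        u -= sum(ai * ui for ai, ui in zip(self.a, self.u_hist))
        self.u_hist = ([u] + self.u_hist[:-1]) if self.a else []
        return u
```

With b = [0.6, -0.4] and a = [-1.0] this reduces to an incremental PI law, while longer coefficient vectors give the higher-order controllers the simplex search would tune.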

  6. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  7. The Development of an Automated Device for Asthma Monitoring for Adolescents: Methodologic Approach and User Acceptability

    PubMed Central

    Miner, Sarah; Sterling, Mark; Halterman, Jill S; Fairbanks, Eileen

    2014-01-01

Background Many adolescents suffer serious asthma-related morbidity that can be prevented by adequate self-management of the disease. Accurate symptom monitoring by patients is the most fundamental antecedent to effective asthma management. Nonetheless, the adequacy and effectiveness of current methods of symptom self-monitoring have been challenged due to individuals’ fallible symptom perception, poor adherence, and inadequate technique. Recognition of these limitations led to the development of an innovative device that can facilitate continuous and accurate monitoring of asthma symptoms with minimal disruption of daily routines, thus increasing acceptability to adolescents. Objective The objectives of this study were to: (1) describe the development of a novel symptom monitoring device for teenagers (teens), and (2) assess their perspectives on the usability and acceptability of the device. Methods Adolescents (13-17 years old) with and without asthma participated in the evolution of an automated device for asthma monitoring (ADAM), which comprised three phases: development (Phase 1, n=37), validation/user acceptability (Phase 2, n=84), and post hoc validation (Phase 3, n=10). In Phase 1, symptom algorithms were identified based on the acoustic analysis of raw symptom sounds and programmed into a popular mobile system, the iPod. Phase 2 involved a 7 day trial of ADAM in vivo, and the evaluation of user acceptance using an acceptance survey and individual interviews. ADAM was further modified and enhanced in Phase 3. Results Through ADAM, incoming audio data were digitized and processed in two steps involving the extraction of a sequence of descriptive feature vectors, and the processing of these sequences by a hidden Markov model-based Viterbi decoder to differentiate symptom sounds from background noise. The number and times of detected symptoms were stored and displayed in the device. 
The sensitivity (true positive rate) of the updated cough algorithm was 70% (21/30), and, on average, 2 coughs per hour were identified as false positives. ADAM also kept track of the user's activity level throughout the day using the mobile system’s built-in accelerometer function. Overall, the device was well received by participants, who perceived it as attractive, convenient, and helpful. The participants recognized the potential benefits of the device in asthma care, and were eager to use it for their asthma management. Conclusions ADAM can potentially automate daily symptom monitoring with minimal intrusiveness and maximal objectivity. The users’ acceptance of the device, based on its recognized convenience, user-friendliness, and usefulness in increasing symptom awareness, underscores ADAM’s potential to overcome the issues of symptom monitoring, including poor adherence, inadequate technique, and poor symptom perception in adolescents. Further refinement of the algorithm is warranted to improve the accuracy of the device. Future study is also needed to assess the efficacy of the device in promoting self-management and asthma outcomes. PMID:25100184
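The two-step pipeline described above, feature vectors decoded by an HMM-based Viterbi decoder, can be sketched with a toy two-state model (noise vs. cough) over discretized frame symbols. All probabilities here are illustrative assumptions, not ADAM's trained model.

```python
import math

STATES = ("noise", "cough")
LOG_START = {"noise": math.log(0.9), "cough": math.log(0.1)}
LOG_TRANS = {"noise": {"noise": math.log(0.95), "cough": math.log(0.05)},
             "cough": {"noise": math.log(0.4), "cough": math.log(0.6)}}
# Emission model over two frame symbols: 0 = quiet frame, 1 = loud/transient frame.
LOG_EMIT = {"noise": {0: math.log(0.95), 1: math.log(0.05)},
            "cough": {0: math.log(0.1), 1: math.log(0.9)}}

def viterbi(symbols):
    """Most likely state sequence for a sequence of frame symbols."""
    # Initialize with start + emission log-probabilities.
    v = {s: LOG_START[s] + LOG_EMIT[s][symbols[0]] for s in STATES}
    back = []
    for sym in symbols[1:]:
        nv, ptr = {}, {}
        for s in STATES:
            prev = max(STATES, key=lambda p: v[p] + LOG_TRANS[p][s])
            nv[s] = v[prev] + LOG_TRANS[prev][s] + LOG_EMIT[s][sym]
            ptr[s] = prev
        v, back = nv, back + [ptr]
    # Backtrack from the best final state.
    best = max(STATES, key=lambda s: v[s])
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

Because the transition penalties favor staying in a state, isolated loud frames tend to be absorbed as noise while sustained bursts are decoded as cough, which is the behavior a frame-level classifier alone would not give.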

  8. Xenon International Automated Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-08-05

    The Xenon International Automated Control software monitors, displays status, and allows for manual operator control as well as fully automatic control of multiple commercial and PNNL designed hardware components to generate and transmit atmospheric radioxenon concentration measurements every six hours.

  9. Massachusetts Institute of Technology Consortium Agreement

    DTIC Science & Technology

    1999-03-01

This is the third progress report of the M.I.T. Home Automation and Healthcare Consortium-Phase Two. It covers the majority of the new findings, concepts...research projects of home automation and healthcare, ranging from human modeling, patient monitoring, and diagnosis to new sensors and actuators, physical...aids, human-machine interface and home automation infrastructure. This report contains several patentable concepts, algorithms, and designs.

  10. Automation and control of off-planet oxygen production processes

    NASA Technical Reports Server (NTRS)

    Marner, W. J.; Suitor, J. W.; Schooley, L. S.; Cellier, F. E.

    1990-01-01

    This paper addresses several aspects of the automation and control of off-planet production processes. First, a general approach to process automation and control is discussed from the viewpoint of translating human process control procedures into automated procedures. Second, the control issues for the automation and control of off-planet oxygen processes are discussed. Sensors, instruments, and components are defined and discussed in the context of off-planet applications, and the need for 'smart' components is clearly established.

  11. MessageSpace: a messaging system for health research

    NASA Astrophysics Data System (ADS)

    Escobar, Rodrigo D.; Akopian, David; Parra-Medina, Deborah; Esparza, Laura

    2013-03-01

Mobile Health (mHealth) has emerged as a promising direction for delivery of healthcare services via mobile communication devices such as cell phones. Examples include texting-based interventions for chronic disease monitoring, diabetes management, control of hypertension, smoking cessation, monitoring medication adherence, appointment keeping, and medical test result delivery; as well as improving patient-provider communication, health information communication, data collection, and access to health records. While existing messaging systems support bulk messaging and some polling applications very well, they are not designed for the data collection and processing needs of health research oriented studies. For that reason, known studies based on text-messaging campaigns have been constrained in participant numbers. In order to empower healthcare promotion and education research, this paper presents a system dedicated to healthcare research. It is designed for convenient communication with various study groups, feedback collection, and automated processing.

  12. A Camera and Multi-Sensor Automated Station Design for Polar Physical and Biological Systems Monitoring: AMIGOS

    NASA Astrophysics Data System (ADS)

    Bohlander, J. A.; Ross, R.; Scambos, T.; Haran, T. M.; Bauer, R. J.

    2012-12-01

    The Automated Meteorology - Ice/Indigenous species - Geophysics Observation System (AMIGOS) consists of a set of measurement instruments and camera(s) controlled by a single-board computer with a simplified Linux operating system and an Iridium satellite modem supporting two-way communication. Primary features of the system relevant to polar operations are low power requirements, daily data uploading, reprogramming, tolerance for low temperatures, and various approaches for automatic resets and recovery from low power or cold shut-down. Instruments include a compact weather station, C/A or dual-frequency GPS, solar flux and reflectivity sensors, sonic snow gages, simplified radio-echo-sounder, and resistance thermometer string in the firn column. In the current state of development, there are two basic designs. One is intended for in situ observations of glacier conditions. The other design supports a high-resolution camera for monitoring biological or geophysical systems from short distances (100 m to 20 km). The stations have been successfully used in several locations for operational support, monitoring rapid ice changes in response to climate change or iceberg drift, and monitoring penguin colony activity. As of June, 2012, there are 9 AMIGOS systems installed, all on the Antarctic continent. The stations are a working prototype for a planned series of upgraded stations, currently termed 'Sentinels'. These stations would carry further instrumentation, communications, and processing capability to investigate ice - ocean interaction from ice tongue, ice shelf, or fjord coastline areas.

  13. A Synopsis of Technical Issues of Concern for Monitoring Trace Elements in Highway and Urban Runoff

    USGS Publications Warehouse

    Breault, Robert F.; Granato, Gregory E.

    2000-01-01

Trace elements, which are regulated for aquatic life protection, are a primary concern in highway- and urban-runoff studies because stormwater runoff may transport these constituents from the land surface to receiving waters. Many of these trace elements are essential for biological activity and become detrimental only when geologic or anthropogenic sources exceed concentrations beyond ranges typical of the natural environment. The Federal Highway Administration and State Transportation Agencies are concerned about the potential effects of highway runoff on the watershed scale and for the management and protection of watersheds. Transportation agencies need information that is documented as valid, current, and scientifically defensible to support planning and management decisions. There are many technical issues of concern for monitoring trace elements; therefore, trace-element data commonly are considered suspect, and the responsibility to provide data-quality information to support the validity of reported results rests with the data-collection agency. Paved surfaces are fundamentally different physically, hydraulically, and chemically from the natural surfaces typical of most freshwater systems that have been the focus of many trace-element-monitoring studies. Existing scientific conceptions of the behavior of trace elements in the environment are based largely upon research on natural systems, rather than on systems typical of pavement runoff. Additionally, the logistics of stormwater sampling are difficult because of the great uncertainty in the occurrence and magnitude of storm events. Therefore, trace-element monitoring programs may be enhanced if monitoring and sampling programs are automated. Automation would standardize the process and provide a continuous record of the variations in flow and water-quality characteristics.
Great care is required to collect and process samples in a manner that will minimize potential contamination or attenuation of trace elements and other sources of bias and variability in the sampling process. Trace elements have both natural and anthropogenic sources that may affect the sampling process, including the sample-collection and handling materials used in many trace-element monitoring studies. Trace elements also react with these materials within the timescales typical for collection, processing, and analysis of runoff samples. To study the characteristics and potential effects of trace elements in highway and urban runoff, investigators typically sample one or more operationally defined matrixes including: whole water, dissolved (filtered water), suspended sediment, bottom sediment, biological tissue, and contaminant sources. The sampling and analysis of each of these sample matrixes can provide specific information about the occurrence and distribution of trace elements in runoff and receiving waters. There are, however, technical concerns specific to each matrix that must be understood and addressed through use of proper collection and processing protocols. Valid protocols are designed to minimize inherent problems and to maximize the accuracy, precision, comparability, and representativeness of data collected. Documentation, including information about monitoring protocols, quality assurance and quality control efforts, and ancillary data also is necessary to establish data quality. This documentation is especially important for evaluation of historical trace-element monitoring data, because trace-element monitoring protocols and analysis methods have been constantly changing over the past 30 years.

  14. Preliminary investigation of submerged aquatic vegetation mapping using hyperspectral remote sensing.

    PubMed

    William, David J; Rybicki, Nancy B; Lombana, Alfonso V; O'Brien, Tim M; Gomez, Richard B

    2003-01-01

The use of airborne hyperspectral remote sensing imagery for automated mapping of submerged aquatic vegetation (SAV) in the tidal Potomac River was investigated for near-real-time resource assessment and monitoring. Airborne hyperspectral imagery and field spectrometer measurements were obtained in October of 2000. A spectral library database containing selected ground-based and airborne sensor spectra was developed for use in image processing. The spectral library is used to automate the processing of hyperspectral imagery for potential real-time material identification and mapping. Field-based spectra were compared to the airborne imagery using the database to identify and map two species of SAV (Myriophyllum spicatum and Vallisneria americana). Overall accuracy of the vegetation maps derived from hyperspectral imagery was determined by comparison to a product that combined aerial photography and field-based sampling at the end of the SAV growing season. The algorithms and databases developed in this study will be useful with the current and forthcoming space-based hyperspectral remote sensing systems.

  15. Progress on Platforms, Sensors and Applications with Unmanned Aerial Vehicles in soil science and geomorphology

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Suomalainen, Juha; Seeger, Manuel; Keesstra, Saskia; Bartholomeus, Harm; Paron, Paolo

    2014-05-01

    The recent increase of performance and endurance of electronically controlled flying platforms, such as multi-copters and fixed-wing airplanes, and decreasing size and weight of different sensors and batteries leads to increasing popularity of Unmanned Aerial Systems (UAS) for scientific purposes. Modern workflows that implement UAS include guided flight plan generation, 3D GPS navigation for fully automated piloting, and automated processing with new techniques such as "Structure from Motion" photogrammetry. UAS are often equipped with normal RGB cameras, multi- and hyperspectral sensors, radar, or other sensors, and provide a cheap and flexible solution for creating multi-temporal data sets. UAS revolutionized multi-temporal research allowing new applications related to change analysis and process monitoring. The EGU General Assembly 2014 is hosting a session on platforms, sensors and applications with UAS in soil science and geomorphology. This presentation briefly summarizes the outcome of this session, addressing the current state and future challenges of small-platform data acquisition in soil science and geomorphology.

  16. Automated system for acquisition and image processing for the control and monitoring boned nopal

    NASA Astrophysics Data System (ADS)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

This paper describes the design and fabrication of a system for acquisition and image processing to control the removal of thorns from the nopal vegetable (Opuntia ficus indica) in an automated machine that uses pulses of an Nd:YAG laser. The areolas, the areas where thorns grow on the bark of the nopal, are located by applying segmentation algorithms to the images obtained by a CCD. Once the position of the areolas is known, coordinates are sent to a motor system that steers the laser to interact with all areolas and remove the thorns of the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the tasks of acquisition, preprocessing, segmentation, recognition, and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
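The segmentation-and-coordinates step this record describes can be illustrated with a minimal sketch: threshold a grayscale frame, label connected bright regions, and emit each region's centroid as a coordinate for the motor system. Everything below (the threshold, the toy frame, the function name) is illustrative and not taken from the paper.

```python
# Hypothetical sketch: intensity thresholding plus connected-component
# labeling turns a grayscale frame into a table of region coordinates.

def segment_centroids(image, threshold):
    """Return (row, col) centroids of bright connected regions (4-connectivity)."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # flood-fill one region
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                ys = sum(p[0] for p in pixels) / len(pixels)
                xs = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((ys, xs))
    return centroids

frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 9, 0, 8, 0],
    [0, 0, 0, 0, 8, 0],
]
print(segment_centroids(frame, 5))  # [(1.5, 1.5), (2.5, 4.0)]
```

Each centroid in the returned table would then be mapped to laser-steering coordinates by the machine's calibration.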

  17. Advertisement recognition using mode voting acoustic fingerprint

    NASA Astrophysics Data System (ADS)

    Fahmi, Reza; Abedi Firouzjaee, Hosein; Janalizadeh Choobbasti, Ali; Mortazavi Najafabadi, S. H. E.; Safavi, Saeid

    2017-12-01

Emergence of media outlets and public relations tools such as TV, radio, and the Internet since the 20th century has provided companies with a good platform for advertising their goods and services. Advertisement recognition is an important task that can help companies measure the efficiency of their advertising campaigns in the market and makes it possible to compare their performance with competitors in order to get better business insights. Advertisement recognition is usually performed manually with the help of human labor or through automated methods that are mainly based on heuristic features; these methods usually lack scalability and the ability to be generalized and used in different situations. In this paper, we present an automated method for advertisement recognition based on audio processing that makes this process fairly simple and eliminates the human factor from the equation. This method has ultimately been used at Miras Information Technology to monitor 56 TV channels and detect all ad video clips broadcast over those networks.

  18. PRISM Software: Processing and Review Interface for Strong‐Motion Data

    USGS Publications Warehouse

    Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter

    2017-01-01

    A continually increasing number of high‐quality digital strong‐motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey, as well as data from regional seismic networks within the United States, calls for automated processing of strong‐motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong‐motion records. When used without AQMS, PRISM provides batch‐processing capabilities. The PRISM software is platform independent (coded in Java), open source, and does not depend on any closed‐source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a review tool, which is a graphical user interface for manual review, edit, and processing. To facilitate use by non‐NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand‐alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible to accommodate implementation of new processing techniques. All the computing features have been thoroughly tested.

  19. HPLC-Assisted Automated Oligosaccharide Synthesis: Implementation of the Autosampler as a Mode of the Reagent Delivery.

    PubMed

    Pistorio, Salvatore G; Nigudkar, Swati S; Stine, Keith J; Demchenko, Alexei V

    2016-10-07

    The development of a useful methodology for simple, scalable, and transformative automation of oligosaccharide synthesis that easily interfaces with existing methods is reported. The automated synthesis can now be performed using accessible equipment where the reactants and reagents are delivered by the pump or the autosampler and the reactions can be monitored by the UV detector. The HPLC-based platform for automation is easy to setup and adapt to different systems and targets.

  20. Automated electric power management and control for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Mellor, Pamela A.; Kish, James A.

    1990-01-01

    A comprehensive automation design is being developed for Space Station Freedom's electric power system. It strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. An integrated approach to the power system command and control problem is defined and used to direct technology development in: diagnosis, security monitoring and analysis, battery management, and cooperative problem-solving for resource allocation. The prototype automated power system is developed using simulations and test-beds.

  1. Automated Plasma Spray (APS) process feasibility study

    NASA Technical Reports Server (NTRS)

    Fetheroff, C. W.; Derkacs, T.; Matay, I. M.

    1981-01-01

An automated plasma spray (APS) process was developed to apply two-layer (NiCrAlY and ZrO2-12Y2O3) thermal barrier coatings to aircraft and stationary gas turbine engine blade airfoils. The APS process hardware consists of four subsystems: a mechanical positioning subsystem incorporating two interlaced six-degree-of-freedom assemblies (one for coating deposition and one for coating thickness monitoring); a noncoherent optical metrology subsystem (for in-process gaging of the coating thickness buildup at specified points on the specimen); a microprocessor-based adaptive system controller (to achieve the desired overall thickness profile on the specimen); and commercial plasma spray equipment. Over fifty JT9D first-stage aircraft turbine blade specimens, ten W501B utility turbine blade specimens, and dozens of cylindrical specimens were coated with the APS process in preliminary checkout and evaluation studies. The best of the preliminary turbine blade specimens achieved an overall coating thickness uniformity of 53 micrometers (2.1 mils), much better than is achievable manually. Comparative evaluations of coating thickness uniformity for manually sprayed and APS-coated specimens were performed. One of the preliminary turbine blade evaluation specimens was subjected to a torch test and metallographic evaluation. Some cylindrical specimens coated with the APS process survived up to 2000 cycles in subsequent burner rig testing.

  2. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    NASA Astrophysics Data System (ADS)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
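The log-ratio technique mentioned in this record derives beam displacement from the ratio of signals induced on opposing pickup electrodes. A minimal sketch, with a hypothetical sensitivity constant that in practice would come from the calibration procedure the record describes:

```python
import math

def log_ratio_position(v_a, v_b, k=10.0):
    """Estimate beam displacement from opposing electrode signals.

    v_a, v_b: induced signal amplitudes on the two opposing pickup
    electrodes; k is an illustrative sensitivity constant (e.g. mm per
    decade of signal ratio) that a real system would obtain from
    calibration, not the value used at LANSCE.
    """
    if v_a <= 0 or v_b <= 0:
        raise ValueError("electrode signals must be positive")
    return k * math.log10(v_a / v_b)

# A centered beam induces equal signals on both electrodes:
print(log_ratio_position(1.0, 1.0))  # 0.0
# A beam displaced toward electrode A gives a positive offset:
print(log_ratio_position(2.0, 1.0))
```

The appeal of the log-ratio form is that the result depends only on the signal ratio, so common-mode gain drifts cancel; the temperature-drift calibrations described above compensate for the residual channel-to-channel differences.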

  3. Canadian macromolecular crystallography facility: a suite of fully automated beamlines.

    PubMed

    Grochulski, Pawel; Fodje, Michel; Labiuk, Shaunivan; Gorin, James; Janzen, Kathryn; Berg, Russ

    2012-06-01

    The Canadian light source is a 2.9 GeV national synchrotron radiation facility located on the University of Saskatchewan campus in Saskatoon. The small-gap in-vacuum undulator illuminated beamline, 08ID-1, together with the bending magnet beamline, 08B1-1, constitute the Canadian Macromolecular Crystallography Facility (CMCF). The CMCF provides service to more than 50 Principal Investigators in Canada and the United States. Up to 25% of the beam time is devoted to commercial users and the general user program is guaranteed up to 55% of the useful beam time through a peer-review process. CMCF staff provides "Mail-In" crystallography service to users with the highest scored proposals. Both beamlines are equipped with very robust end-stations including on-axis visualization systems, Rayonix 300 CCD series detectors and Stanford-type robotic sample auto-mounters. MxDC, an in-house developed beamline control system, is integrated with a data processing module, AutoProcess, allowing full automation of data collection and data processing with minimal human intervention. Sample management and remote monitoring of experiments is enabled through interaction with a Laboratory Information Management System developed at the facility.

  4. Automated Euler and Navier-Stokes Database Generation for a Glide-Back Booster

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Mike J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejnil, Edward

    2004-01-01

    The past two decades have seen a sustained increase in the use of high fidelity Computational Fluid Dynamics (CFD) in basic research, aircraft design, and the analysis of post-design issues. As the fidelity of a CFD method increases, the number of cases that can be readily and affordably computed greatly diminishes. However, computer speeds now exceed 2 GHz, hundreds of processors are currently available and more affordable, and advances in parallel CFD algorithms scale more readily with large numbers of processors. All of these factors make it feasible to compute thousands of high fidelity cases. However, there still remains the overwhelming task of monitoring the solution process. This paper presents an approach to automate the CFD solution process. A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment, the NASA Information Power Grid (IPG), using 13 computers located at 4 different geographical sites. Process automation and web-based access to a MySql database greatly reduces the user workload, removing much of the tedium and tendency for user input errors. The AeroDB framework is shown. The user submits/deletes jobs, monitors AeroDB's progress, and retrieves data and plots via a web portal. Once a job is in the database, a job launcher uses an IPG resource broker to decide which computers are best suited to run the job. Job/code requirements, the number of CPUs free on a remote system, and queue lengths are some of the parameters the broker takes into account. The Globus software provides secure services for user authentication, remote shell execution, and secure file transfers over an open network. AeroDB automatically decides when a job is completed. 
Currently, the Cart3D unstructured flow solver is used for the Euler equations, and the Overflow structured overset flow solver is used for the Navier-Stokes equations. Other codes can be readily included into the AeroDB framework.

  5. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
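As a rough illustration of the gravimetric QC idea, accuracy can be expressed as the percent deviation of the mean dispensed mass from the target, and precision as the coefficient of variation across replicate dispenses. The function name, target, and replicate values below are illustrative assumptions, not figures from the chapter:

```python
import statistics

def gravimetric_qc(masses_mg, target_mg):
    """Return (accuracy %, precision CV %) for replicate dispense weighings.

    accuracy: percent deviation of the mean dispensed mass from target.
    precision: coefficient of variation (stdev / mean, in percent).
    """
    mean = statistics.mean(masses_mg)
    accuracy = 100.0 * (mean - target_mg) / target_mg
    cv = 100.0 * statistics.stdev(masses_mg) / mean
    return accuracy, cv

# Ten replicate dispenses targeting 100 mg (made-up example data):
masses = [99.8, 100.2, 100.1, 99.9, 100.0, 100.3, 99.7, 100.1, 99.9, 100.0]
acc, cv = gravimetric_qc(masses, 100.0)
print(f"accuracy {acc:+.2f}%, precision (CV) {cv:.2f}%")
```

Photometric QC complements this by checking per-well volumes via absorbance of a dye, but the pass/fail arithmetic takes the same accuracy-and-CV form.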

  6. AN AUTOMATED MONITORING SYSTEM FOR FISH PHYSIOLOGY AND TOXICOLOGY

    EPA Science Inventory

    This report describes a data acquisition and control (DAC) system that was constructed to manage selected physiological measurements and sample control for aquatic physiology and toxicology. Automated DAC was accomplished with a microcomputer running menu-driven software develope...

  7. Automated analysis of oxidative metabolites

    NASA Technical Reports Server (NTRS)

    Furner, R. L. (Inventor)

    1974-01-01

    An automated system for the study of drug metabolism is described. The system monitors the oxidative metabolites of aromatic amines and of compounds which produce formaldehyde on oxidative dealkylation. It includes color developing compositions suitable for detecting hyroxylated aromatic amines and formaldehyde.

  8. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability, and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data, to make the necessary decisions to meet KSC organizational goals. The advantages and disadvantages of design alternatives to the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.

  9. Expertise Development With Different Types of Automation: A Function of Different Cognitive Abilities.

    PubMed

    Jipp, Meike

    2016-02-01

    I explored whether different cognitive abilities (information-processing ability, working-memory capacity) are needed for expertise development when different types of automation (information vs. decision automation) are employed. It is well documented that expertise development and the employment of automation lead to improved performance. Here, it is argued that a learner's ability to reason about an activity may be hindered by the employment of information automation. Additional feedback needs to be processed, thus increasing the load on working memory and decelerating expertise development. By contrast, the employment of decision automation may stimulate reasoning, increase the initial load on information-processing ability, and accelerate expertise development. Authors of past research have not investigated the interrelations between automation assistance, individual differences, and expertise development. Sixty-one naive learners controlled simulated air traffic with two types of automation: information automation and decision automation. Their performance was captured across 16 trials. Well-established tests were used to assess information-processing ability and working-memory capacity. As expected, learners' performance benefited from expertise development and decision automation. Furthermore, individual differences moderated the effect of the type of automation on expertise development: The employment of only information automation increased the load on working memory during later expertise development. The employment of decision automation initially increased the need to process information. These findings highlight the importance of considering individual differences and expertise development when investigating human-automation interaction. The results are relevant for selecting automation configurations for expertise development. © 2015, Human Factors and Ergonomics Society.

  10. System for evaluating weld quality using eddy currents

    DOEpatents

    Todorov, Evgueni I.; Hay, Jacob

    2017-12-12

Electromagnetic and eddy current techniques for fast automated real-time and near real-time inspection and monitoring systems for high-production-rate joining processes. An eddy current system, array, and method for the fast examination of welds to detect anomalies such as missed seam (MS) and lack of penetration (LOP); the system, array, and methods are capable of detecting and sizing surface and slightly subsurface flaws at various orientations in connection with at least the first and second weld passes.

  11. Incorporation of Automated ISR Systems by the 75th Ranger Regiment

    DTIC Science & Technology

    2003-06-06

Intelligence, describes the intelligence challenges faced by a commander, the command estimate process, military intelligence unit capabilities... [fragmented table excerpt: interview subjects; strategic range; live-feed still capture; frequency intercept, monitoring, and jamming; voice recognition for target/combat ID]

  12. Performance of an image analysis processing system for hen tracking in an environmental preference chamber.

    PubMed

    Kashiha, Mohammad Amin; Green, Angela R; Sales, Tatiana Glogerley; Bahr, Claudia; Berckmans, Daniel; Gates, Richard S

    2014-10-01

    Image processing systems have been widely used in monitoring livestock for many applications, including identification, tracking, behavior analysis, occupancy rates, and activity calculations. The primary goal of this work was to quantify image processing performance when monitoring laying hens by comparing length of stay in each compartment as detected by the image processing system with the actual occurrences registered by human observations. In this work, an image processing system was implemented and evaluated for use in an environmental animal preference chamber to detect hen navigation between 4 compartments of the chamber. One camera was installed above each compartment to produce top-view images of the whole compartment. An ellipse-fitting model was applied to captured images to detect whether the hen was present in a compartment. During a choice-test study, mean ± SD success detection rates of 95.9 ± 2.6% were achieved when considering total duration of compartment occupancy. These results suggest that the image processing system is currently suitable for determining the response measures for assessing environmental choices. Moreover, the image processing system offered a comprehensive analysis of occupancy while substantially reducing data processing time compared with the time-intensive alternative of manual video analysis. The above technique was used to monitor ammonia aversion in the chamber. As a preliminary pilot study, different levels of ammonia were applied to different compartments while hens were allowed to navigate between compartments. Using the automated monitor tool to assess occupancy, a negative trend of compartment occupancy with ammonia level was revealed, though further examination is needed. ©2014 Poultry Science Association Inc.
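The ellipse-fitting detection idea can be sketched with a moment-based fit: compute the centroid and central second moments of the bright pixels in a top-view frame and derive an equivalent ellipse. This is a generic stand-in for the paper's model, and all names and data below are illustrative:

```python
import math

def fit_ellipse_moments(pixels):
    """Fit an equivalent ellipse to (y, x) pixel coordinates via moments.

    Returns (centroid, major_axis, minor_axis, orientation_radians).
    Generic moment-based fit, not the published hen-tracking model.
    """
    n = len(pixels)
    cy = sum(p[0] for p in pixels) / n
    cx = sum(p[1] for p in pixels) / n
    # central second moments
    myy = sum((p[0] - cy) ** 2 for p in pixels) / n
    mxx = sum((p[1] - cx) ** 2 for p in pixels) / n
    mxy = sum((p[0] - cy) * (p[1] - cx) for p in pixels) / n
    spread = math.sqrt((mxx - myy) ** 2 + 4 * mxy ** 2)
    major = 2 * math.sqrt(2 * (mxx + myy + spread))
    minor = 2 * math.sqrt(2 * (mxx + myy - spread))
    theta = 0.5 * math.atan2(2 * mxy, mxx - myy)
    return (cy, cx), major, minor, theta

# A horizontal bar of bright pixels fits an elongated, axis-aligned ellipse:
bar = [(0, x) for x in range(10)]
center, major, minor, theta = fit_ellipse_moments(bar)
print(center, round(theta, 3))  # (0.0, 4.5) 0.0
```

In a detection pipeline of this kind, a compartment would be declared occupied when a frame yields enough foreground pixels for a plausible hen-sized ellipse; the occupancy durations are then accumulated per compartment, as in the choice-test comparison above.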

  13. Design of a National Retail Data Monitor for Public Health Surveillance

    PubMed Central

    Wagner, Michael M.; Robinson, J. Michael; Tsui, Fu-Chiang; Espino, Jeremy U.; Hogan, William R.

    2003-01-01

    The National Retail Data Monitor receives data daily from 10,000 stores, including pharmacies, that sell health care products. These stores belong to national chains that process sales data centrally and utilize Universal Product Codes and scanners to collect sales information at the cash register. The high degree of retail sales data automation enables the monitor to collect information from thousands of store locations in near to real time for use in public health surveillance. The monitor provides user interfaces that display summary sales data on timelines and maps. Algorithms monitor the data automatically on a daily basis to detect unusual patterns of sales. The project provides the resulting data and analyses, free of charge, to health departments nationwide. Future plans include continued enrollment and support of health departments, developing methods to make the service financially self-supporting, and further refinement of the data collection system to reduce the time latency of data receipt and analysis. PMID:12807802
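A common way to "detect unusual patterns of sales", as this record describes, is a trailing-window z-score test on daily counts. The window length, threshold, and data below are illustrative assumptions, not the monitor's actual algorithm:

```python
import statistics

def flag_unusual(daily_counts, window=28, z_thresh=3.0):
    """Flag each day whose count exceeds the trailing-window mean
    by more than z_thresh standard deviations (one flag per day
    after the initial window)."""
    flags = []
    for i in range(window, len(daily_counts)):
        hist = daily_counts[i - window:i]
        mu = statistics.mean(hist)
        sd = statistics.stdev(hist) or 1.0  # guard against zero variance
        flags.append((daily_counts[i] - mu) / sd > z_thresh)
    return flags

baseline = [100, 102, 98, 101, 99, 103, 97] * 4  # 28 days of normal sales
flags = flag_unusual(baseline + [100, 180], window=28)
print(flags)  # [False, True] -- the 180-unit spike is flagged
```

A production detector for over-the-counter sales would additionally adjust for day-of-week and seasonal effects before thresholding, but the windowed-baseline comparison is the core of the idea.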

  14. Manned spacecraft automation and robotics

    NASA Technical Reports Server (NTRS)

    Erickson, Jon D.

    1987-01-01

    The Space Station holds promise of being a showcase user and driver of advanced automation and robotics technology. The author addresses the advances in automation and robotics from the Space Shuttle - with its high-reliability redundancy management and fault tolerance design and its remote manipulator system - to the projected knowledge-based systems for monitoring, control, fault diagnosis, planning, and scheduling, and the telerobotic systems of the future Space Station.

  15. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Roelof Versteeg; Trevor Rowe

    2006-03-01

We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  16. Tracking the NOvA Detectors' Performance

    NASA Astrophysics Data System (ADS)

    Psihas, Fernanda; NOvA Collaboration

    2016-03-01

The NOvA experiment measures long-baseline νμ --> νe oscillations in Fermilab's NuMI beam. We employ two detectors equipped with over 10,000 sets of data-taking electronics: avalanche photodiodes and front-end boards, which collect and process the scintillation signal from particle interactions within the detectors. These sets of electronics, as well as the systems which power and cool them, must be monitored and maintained at precise working conditions to ensure maximal data-taking uptime, good data quality, and a long life for our detectors. This poster describes the automated systems used on NOvA to simultaneously monitor our data quality, diagnose hardware issues, track our performance, and coordinate maintenance for the detectors.

  17. The Relationship of Self-Efficacy and Complacency in Pilot-Automation Interaction

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III

    2002-01-01

Pilot 'complacency' has been implicated as a contributing factor in numerous aviation accidents and incidents. The term has become more prominent with the increase in automation technology in modern cockpits, and research has therefore focused on understanding the factors that may mitigate its effect on pilot-automation interaction. The study examined self-efficacy in supervisory monitoring and its relationship to complacency and to pilots' strategies for using automation for workload management, under automation schedules that produce the potential for complacency. The results showed that self-efficacy can be a 'double-edged sword': it may reduce the potential for automation-induced complacency while also limiting workload management strategies and increasing other hazardous states of awareness.

  18. The Automation of Reserve Processing.

    ERIC Educational Resources Information Center

    Self, James

    1985-01-01

    Describes an automated reserve processing system developed locally at Clemons Library, University of Virginia. Discussion covers developments in the reserve operation at Clemons Library, automation of the processing and circulation functions of reserve collections, and changes in reserve operation performance and staffing needs due to automation.…

  19. [Airborne particles in a multi-wall carbon nanotube production plant: observation of particle emission and personal exposure 1: Measurement in the packing process].

    PubMed

    Takaya, Mitsutoshi; Serita, Fumio; Ono-Ogasawara, Mariko; Shinohara, Yasushi; Saito, Hiroyuki; Koda, Shigeki

    2010-01-01

In order to assess the exposure risks of multiwall carbon nanotubes (MWCNT) for packing workers, we carried out real-time monitoring in two types of MWCNT packing facilities, together with exposure measurements for the packing workers. In the real-time monitoring, a scanning mobility particle sizer (SMPS) and an optical particle counter (OPC) were used to measure nanoscale particles and sub-micron/micron-scale particles, respectively. A personal sampler with PM 4.0 was used to measure personal exposures in the packing facilities. One of the packing facilities is manually operated and the other is automated. The concentrations of airborne dust in the two facilities were nearly identical, at 0.24 mg/m³ (total dust). However, the personal exposure measurements differed markedly between the two facilities: exposure concentrations of workers in the manual and automated operations were 2.39/0.39 (total/respirable) mg/m³ and 0.29/0.08 (total/respirable) mg/m³, respectively. From the time-series study, submicron-scale particles were released into the workplace air when the CNT products were put into temporary container bags from a hopper and manually packed into shipping bags. However, no task-related nanoscale particle release was observed. The manual packing operation is one of the "hot spots" in MWCNT production facilities, and automation substantially reduces MWCNT exposure.

  20. Electronic Data Interchange in Procurement

    DTIC Science & Technology

    1990-04-01

... contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and ... automated purchasing computer and the contractor's order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer ... into the contractor's order processing or contract management system. This approach - converting automated information to paper and back to automated ...

  1. Subnanosecond polarized microfluorimetry in the time domain: An instrument for studying receptor trafficking in live cells

    NASA Astrophysics Data System (ADS)

    Martin-Fernandez, M. L.; Tobin, M. J.; Clarke, D. T.; Gregory, C. M.; Jones, G. R.

    1998-02-01

We describe an instrument designed to monitor molecular motions in multiphasic, weakly fluorescent microscopic systems. It combines synchrotron radiation, a low-irradiance polarized microfluorimeter, and an automated, multiframing, single-photon-counting data acquisition system, and is capable of continually accumulating subnanosecond-resolved anisotropy decays with a real-time resolution of about 60 s. The instrument was initially built to monitor ligand-receptor interactions in living cells, but can equally be applied to the continual measurement of any dynamic process involving fluorescent molecules that occurs over a time scale from a few minutes to several hours. As a particularly demanding demonstration of its capabilities, we have used it to monitor the environmental constraints imposed on the peptide hormone epidermal growth factor during its endocytosis and recycling to the cell surface in live cells.
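
    The quantity such an instrument accumulates is the standard polarized-fluorescence anisotropy. A minimal numeric sketch follows; the G-factor and count values are illustrative, not instrument data.

```python
# Standard fluorescence anisotropy from polarized intensity components:
#   r = (I_par - G * I_perp) / (I_par + 2 * G * I_perp)
# G corrects for the detection system's polarization bias.
def anisotropy(i_par, i_perp, g=1.0):
    return (i_par - g * i_perp) / (i_par + 2.0 * g * i_perp)

r = anisotropy(300.0, 100.0)   # 0.4, the limiting value for an immobile
                               # fluorophore with parallel dipoles
```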

  2. Configurable technology development for reusable control and monitor ground systems

    NASA Technical Reports Server (NTRS)

    Uhrlaub, David R.

    1994-01-01

    The control monitor unit (CMU) uses configurable software technology for real-time mission command and control, telemetry processing, simulation, data acquisition, data archiving, and ground operations automation. The base technology is currently planned for the following control and monitor systems: portable Space Station checkout systems; ecological life support systems; Space Station logistics carrier system; and the ground system of the Delta Clipper (SX-2) in the Single-Stage Rocket Technology program. The CMU makes extensive use of commercial technology to increase capability and reduce development and life-cycle costs. The concepts and technology are being developed by McDonnell Douglas Space and Defense Systems for the Real-Time Systems Laboratory at NASA's Kennedy Space Center under the Payload Ground Operations Contract. A second function of the Real-Time Systems Laboratory is development and utilization of advanced software development practices.

  3. Comparison Between Manual Auditing and a Natural Language Process With Machine Learning Algorithm to Evaluate Faculty Use of Standardized Reports in Radiology.

    PubMed

    Guimaraes, Carolina V; Grzeszczuk, Robert; Bisset, George S; Donnelly, Lane F

    2018-03-01

When implementing or monitoring department-sanctioned standardized radiology reports, feedback about individual faculty performance has been shown to be a useful driver of faculty compliance. Most commonly, these data are derived from manual audit, which can be both time-consuming and subject to sampling error. The purpose of this study was to evaluate whether a software program using natural language processing and machine learning could audit radiologist compliance with the use of standardized reports as accurately as manual audits. Radiology reports from a 1-month period were loaded into such a software program, and faculty compliance with use of standardized reports was calculated. For that same period, manual audits were performed (25 reports audited for each of 42 faculty members). The mean compliance rates calculated by automated auditing were then compared with the confidence interval of the mean rate by manual audit. The mean compliance rate for use of standardized reports as determined by manual audit was 91.2%, with a confidence interval between 89.3% and 92.8%. The mean compliance rate calculated by automated auditing was 92.0%, within that confidence interval. This study shows that, using natural language processing and machine learning algorithms, an automated analysis can define whether reports comply with standardized report templates and language as accurately as manual audits can. This may avoid significant labor costs related to conducting the manual auditing process. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
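
    The study's NLP/ML model is not published here; as a toy stand-in, a compliance rate can be computed by checking each report for required template markers and comparing the rate against the manual-audit confidence interval. The marker strings are assumptions for illustration.

```python
# Toy stand-in for the automated audit (not the study's model): count
# reports containing assumed template section headers.
REQUIRED = ("FINDINGS:", "IMPRESSION:")  # assumed template markers

def compliant(report):
    return all(marker in report.upper() for marker in REQUIRED)

def compliance_rate(reports):
    return 100.0 * sum(map(compliant, reports)) / len(reports)

reports = [
    "FINDINGS: Clear lungs. IMPRESSION: Normal study.",
    "Free-text dictation without standard sections.",
]
rate = compliance_rate(reports)       # 50.0
within_ci = 89.3 <= rate <= 92.8      # compare against manual-audit CI
```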

  4. Automated and model-based assembly of an anamorphic telescope

    NASA Astrophysics Data System (ADS)

    Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

Since the first use of optical glasses there has been increasing demand for optical systems highly customized for a wide field of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray tracing is presented. The process runs autonomously and covers a wide range of functionality: it first identifies an optimized assembly sequence, and then generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. The process also generates a digital twin of the optical system by mapping key performance indicators, such as the first and second moments of intensity, into the optical model. The approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring and mapping the key performance indicators into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of the approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of its different steps.
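
    The key performance indicators named above, the first and second moments of intensity, can be computed from a sampled beam profile. A minimal 1-D sketch with invented sample values:

```python
# First moment (centroid) and second central moment (variance) of a
# sampled 1-D intensity profile; values are illustrative only.
def moments(positions, intensities):
    total = sum(intensities)
    first = sum(x * i for x, i in zip(positions, intensities)) / total
    second = sum((x - first) ** 2 * i
                 for x, i in zip(positions, intensities)) / total
    return first, second

xs = [0.0, 1.0, 2.0]
ys = [1.0, 2.0, 1.0]              # symmetric profile centred at x = 1
centroid, var = moments(xs, ys)   # (1.0, 0.5)
```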

  5. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS Simulator, Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides a choice of multiple source targets for external calibration of the radiometer. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the Central Archiver PC. The Archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process, and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry, and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.

  6. Designing an autonomous environment for mission critical operation of the EUVE satellite

    NASA Technical Reports Server (NTRS)

    Abedini, Annadiana; Malina, Roger F.

    1994-01-01

Since the launch of NASA's Extreme Ultraviolet Explorer (EUVE) satellite in 1992, there have been only a handful of occurrences that have warranted manual intervention in the EUVE Science Operations Center (ESOC). So, in an effort to reduce costs, the current environment is being redesigned to utilize a combination of off-the-shelf packages and recently developed artificial intelligence (AI) software to automate the monitoring of the science payload and ground systems. The successful implementation of systemic automation would allow the ESOC to evolve from a seven day/week, three shift operation, to a seven day/week one shift operation. First, it was necessary to identify all areas considered mission critical. These were defined as follows: (1) The telemetry stream must be monitored autonomously and anomalies identified. (2) Duty personnel must be automatically paged and informed of the occurrence of an anomaly. (3) The 'basic' state of the ground system must be assessed. (4) Monitors should check that the systems and processes needed to continue in a 'healthy' operational mode are working at all times. (5) Network loads should be monitored to ensure that they stay within established limits. (6) Connectivity to Goddard Space Flight Center (GSFC) systems should be monitored as well, not just for connectivity of the network itself but also for the ability to transfer files. (7) All necessary peripheral devices should be monitored. This would include the disks, routers, tape drives, printers, tape carousel, and power supplies. (8) System daemons such as the archival daemon, the Sybase server, the payload monitoring software, and any other necessary processes should be monitored to ensure that they are operational. (9) The monitoring system needs to be redundant so that the failure of a single machine will not paralyze the monitors. (10) Notification should be done by means of looking through a table of the pager numbers for current 'on call' personnel. 
The software should be capable of dialing out to notify, sending email, and producing error logs. (11) The system should have knowledge of when real-time passes and tape recorder dumps will occur and should know that these passes and data transmissions are successful. Once the design criteria were established, the design team split into two groups: one that addressed the tracking, commanding, and health and safety of the science payload and another group that addressed the ground systems and communications aspects of the overall system.
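
    Criteria (4), (8), and (10) above amount to a small watchdog loop. The sketch below is an assumed illustration of that logic only; the daemon names echo the abstract, while the pager table, shift keys, and numbers are invented, and the actual dial-out/email transport is omitted.

```python
# Hedged watchdog sketch for criteria (4), (8) and (10): compare running
# processes against a required set and look up the on-call pager.
REQUIRED_DAEMONS = {"archival_daemon", "sybase_server", "payload_monitor"}
ON_CALL = {"weekday": "555-0101", "weekend": "555-0102"}  # invented table

def missing_processes(running):
    """Return the required daemons not found in the process list."""
    return sorted(REQUIRED_DAEMONS - set(running))

def page_target(shift):
    """Pager number for the current on-call shift."""
    return ON_CALL[shift]

down = missing_processes(["sybase_server", "payload_monitor"])
# down == ["archival_daemon"]; a real system would now dial out,
# send email, and write an error log, per the design criteria.
if down:
    pager = page_target("weekday")
```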

  7. Sensor-Augmented Insulin Pumps and Hypoglycemia Prevention in Type 1 Diabetes.

    PubMed

    Steineck, Isabelle; Ranjan, Ajenthen; Nørgaard, Kirsten; Schmidt, Signe

    2017-01-01

Hypoglycemia can lead to seizures, unconsciousness, or death. Insulin pump treatment reduces the frequency of severe hypoglycemia compared with multiple daily injections treatment. The addition of a continuous glucose monitor, so-called sensor-augmented pump (SAP) treatment, has the potential to further limit the duration and severity of hypoglycemia, as the system can detect, and in some systems act on, impending and prevailing low blood glucose levels. In this narrative review we summarize the available knowledge on SAPs with and without automated insulin suspension, in relation to hypoglycemia prevention. We present evidence from randomized trials, observational studies, and meta-analyses including nonpregnant individuals with type 1 diabetes mellitus. We also outline concerns regarding SAPs with and without automated insulin suspension. There is evidence that SAP treatment reduces episodes of moderate and severe hypoglycemia compared with multiple daily injections plus self-monitoring of blood glucose. There is some evidence that SAPs both with and without automated suspension reduce the frequency of severe hypoglycemic events compared with insulin pumps without continuous glucose monitoring.
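
    The "act on impending low glucose" behaviour can be sketched as threshold-suspend logic: suspend basal delivery when the sensor value, or a short linear projection of it, falls below a limit. The threshold and horizon below are illustrative assumptions, not clinical parameters from any device.

```python
# Hedged sketch of predictive low-glucose suspend; values illustrative
# only, not clinical advice or any vendor's algorithm.
SUSPEND_MMOL = 3.9   # assumed low-glucose threshold (mmol/L)
HORIZON_MIN = 30     # assumed linear-projection horizon (minutes)

def should_suspend(glucose, trend_per_min):
    """True if current or projected glucose is below the threshold."""
    projected = glucose + trend_per_min * HORIZON_MIN
    return glucose < SUSPEND_MMOL or projected < SUSPEND_MMOL

should_suspend(5.5, -0.08)   # falling fast: projects to ~3.1 -> True
should_suspend(5.5, 0.0)     # stable at 5.5 -> False
```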

  8. Uncertainty quantification in structural health monitoring: Applications on cultural heritage buildings

    NASA Astrophysics Data System (ADS)

    Lorenzoni, Filippo; Casarin, Filippo; Caldon, Mauro; Islami, Kleidi; Modena, Claudio

    2016-01-01

In recent decades the need for effective seismic protection and vulnerability reduction of cultural heritage buildings and sites has driven growing interest in structural health monitoring (SHM) as a knowledge-based assessment tool to quantify and reduce uncertainties regarding their structural performance. Monitoring can in some cases be successfully implemented as an alternative to interventions, or to control the medium- and long-term effectiveness of strengthening solutions already applied. The research group at the University of Padua, in collaboration with public administrations, has recently installed several SHM systems on heritage structures. The paper reports the application of monitoring strategies implemented to avoid (or at least minimize) the execution of strengthening interventions/repairs and to control the response until a clear worsening or damage process is detected. Two emblematic case studies are presented and discussed: the Roman Amphitheatre (Arena) of Verona and the Conegliano Cathedral. Both are excellent examples of ongoing monitoring activities, performed through static and dynamic approaches in combination with automated procedures to extract meaningful structural features from collected data. In parallel with the application of innovative monitoring techniques, statistical models and data processing algorithms have been developed and applied to reduce uncertainties and exploit monitoring results for effective assessment and protection of historical constructions. Processing software for SHM was implemented to perform continuous real-time treatment of static data and identification of modal parameters based on the structural response to ambient vibrations. Statistical models were also developed to filter environmental effects and thermal cycles out of the extracted features.
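
    A minimal version of the environmental filtering described above is a regression of a monitored feature (for example a natural frequency) on temperature, with the residual tracked as the temperature-corrected feature; it should stay near zero unless the structure changes. All numbers below are invented for illustration.

```python
# Least-squares line fit of a monitored feature against temperature;
# the residual is the environmentally corrected feature. Data invented.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b            # intercept, slope

temps = [5.0, 10.0, 15.0, 20.0]
freqs = [2.10, 2.08, 2.06, 2.04]     # frequency drops with temperature
a, b = fit_line(temps, freqs)        # slope b is -0.004 Hz per degree

def residual(temp, freq):
    """Temperature-corrected feature; near zero for an on-trend reading."""
    return freq - (a + b * temp)
```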

  9. Development and validation of inexpensive, automated, dynamic flux chambers

    EPA Science Inventory

    We developed and validated an automated, inexpensive, and continuous multiple-species gas-flux monitoring system that can provide data for a variety of relevant atmospheric pollutants, including O3, CO2, and NOx. Validation consisted of conducting concurrent gas-phase dry deposit...
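
    For a dynamic (flow-through) chamber, surface flux is commonly estimated from the inlet/outlet concentration difference and the sweep flow. The sketch below is the textbook mass balance under invented numbers, not the EPA system's specific algorithm.

```python
# Textbook dynamic-chamber mass balance (illustrative values):
#   flux = Q * (C_out - C_in) / A
def dynamic_flux(q_m3_s, c_in, c_out, area_m2):
    """Flux (ug m^-2 s^-1) from sweep flow Q (m^3/s) and C in ug/m^3."""
    return q_m3_s * (c_out - c_in) / area_m2

f = dynamic_flux(q_m3_s=0.001, c_in=400.0, c_out=420.0, area_m2=0.05)
# f is ~0.4 ug m^-2 s^-1 emitted from the enclosed surface
```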

  10. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

Modern business process management (BPM) is of increasing interest for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO standard Business Process Model and Notation (BPMN) 2.X, a system-independent graphical process control notation accepted across disciplines is available that supports process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for easier implementation of a business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated with the presented BPMS-LES approach.

  11. Streamlining plant sample preparation: the use of high-throughput robotics to process echinacea samples for biomarker profiling by MALDI-TOF mass spectrometry.

    PubMed

    Greene, Leasa A; Isaac, Issa; Gray, Dean E; Schwartz, Sarah A

    2007-09-01

    Several species in the genus Echinacea are beneficial herbs popularly used for many ailments. The most popular Echinacea species for cultivation, wild collection, and herbal products include E. purpurea (L.) Moench, E. pallida (Nutt.) Nutt., and E. angustifolia (DC). Product adulteration is a key concern for the natural products industry, where botanical misidentification and introduction of other botanical and nonbotanical contaminants exist throughout the formulation and production process. Therefore, rapid and cost-effective methods that can be used to monitor these materials for complex product purity and consistency are of benefit to consumers and producers. The objective of this continuing research was to develop automated, high-throughput processing methods that, teamed with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) analysis, differentiate Echinacea species by their mass profiles. Small molecules, peptide, and proteins from aerial parts (leaf/stem/flowers), seeds, and roots from E. purpurea and E. angustifolia; seeds and roots from E. pallida; and off-the-shelf Echinacea supplements were extracted and analyzed by MS using methods developed on the ProPrep liquid handling system (Genomic Solutions). Analysis of these samples highlighted key MS signal patterns from both small molecules and proteins that characterized the individual Echinacea materials analyzed. Based on analysis of pure Echinacea samples, off-the-shelf products containing Echinacea could then be evaluated in a streamlined process. Corresponding analysis of dietary supplements was used to monitor for product composition, including Echinacea species and plant materials used. These results highlight the potential for streamlined, automated approaches for agricultural species differentiation and botanical product evaluation.

  12. Advanced information processing system

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  13. Holographic optical assembly and photopolymerized joining of planar microspheres

    DOE PAGES

    Shaw, L. A.; Chizari, S.; Panas, R. M.; ...

    2016-07-27

The aim of this research is to demonstrate a holographically driven photopolymerization process for joining colloidal particles to create planar microstructures fixed to a substrate, which can be monitored with real-time measurement. Holographic optical tweezers (HOT) have been used to arrange arrays of microparticles prior to this work; here we introduce a new photopolymerization process for rapidly joining simultaneously handled microspheres in a plane. Additionally, we demonstrate a new process control technique for efficiently identifying when particles have been successfully joined by measuring a sufficient reduction in the particles' Brownian motion. Furthermore, this technique and our demonstrated joining approach enable HOT technology to take critical steps toward automated additive fabrication of microstructures.

  14. Digital video system for on-line portal verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott

    1990-07-01

    A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress the noise. Some elements of automated radiotherapy treatment verification have been introduced.

  15. Gypsy moth (Lepidoptera: Lymantriidae) flight behavior and phenology based on field-deployed automated pheromone-baited traps

    Treesearch

    Patrick C. Tobin; Kenneth T. Klein; Donna S. Leonard

    2009-01-01

Populations of the gypsy moth, Lymantria dispar (L.), are extensively monitored in the United States through the use of pheromone-baited traps. We report on the use of automated pheromone-baited traps that use a recording sensor and data logger to record the unique date-time stamp of males as they enter the trap. We deployed a total of 352 automated traps...
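
    Date-time stamps from such traps support diel-activity summaries directly: bin captures by hour of day. A small sketch with invented timestamps:

```python
# Diel flight-activity summary from trap date-time stamps (invented data).
from collections import Counter
from datetime import datetime

stamps = [
    "2009-07-01 19:05", "2009-07-01 19:40",
    "2009-07-01 20:10", "2009-07-02 19:55",
]

def captures_by_hour(stamps):
    """Count male captures per hour of day across all trap records."""
    hours = (datetime.strptime(s, "%Y-%m-%d %H:%M").hour for s in stamps)
    return Counter(hours)

c = captures_by_hour(stamps)   # c[19] == 3 evening captures, c[20] == 1
```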

  16. Quantifying, Visualizing, and Monitoring Lead Optimization.

    PubMed

    Maynard, Andrew T; Roberts, Christopher D

    2016-05-12

Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D have been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to drive the LO process more effectively, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability.

  17. Automation of Classical QEEG Trending Methods for Early Detection of Delayed Cerebral Ischemia: More Work to Do.

    PubMed

Wickering, Ellis; Gaspard, Nicolas; Zafar, Sahar; Moura, Valdery J; Biswal, Siddharth; Bechek, Sophia; O'Connor, Kathryn; Rosenthal, Eric S; Westover, M Brandon

    2016-06-01

The purpose of this study is to evaluate automated implementations of continuous EEG monitoring-based detection of delayed cerebral ischemia, using methods drawn from classical retrospective studies. We studied 95 patients with either Fisher 3 or Hunt-Hess 4 to 5 aneurysmal subarachnoid hemorrhage who were admitted to the Neurosciences ICU and underwent continuous EEG monitoring. We implemented several variations of two classical algorithms for automated detection of delayed cerebral ischemia based on decreases in the alpha-delta ratio and in relative alpha variability. Of 95 patients, 43 (45%) developed delayed cerebral ischemia. Our automated implementation of the classical alpha-delta ratio-based trending method resulted in a sensitivity and specificity (Se, Sp) of (80, 27)%, compared with the values of (100, 76)% reported in the classic study using similar methods in a nonautomated fashion. Our automated implementation of the classical relative alpha variability-based trending method yielded (Se, Sp) values of (65, 43)%, compared with (100, 46)% reported in the classic study using nonautomated analysis. Our findings suggest that improved methods to detect decreases in the alpha-delta ratio and relative alpha variability are needed before an automated EEG-based early delayed cerebral ischemia detection system is ready for clinical use.
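
    The skeleton of alpha-delta ratio (ADR) trending is to compare each epoch's ADR with a per-patient baseline and alarm on a sustained percentage decline. The 50% drop and three-epoch persistence below are illustrative assumptions, not the study's tuned parameters.

```python
# Hedged ADR-trending sketch; thresholds are assumptions for illustration.
def adr(alpha_power, delta_power):
    """Alpha-delta ratio from band powers."""
    return alpha_power / delta_power

def dci_alarm(baseline_adr, epoch_adrs, drop_frac=0.5, min_epochs=3):
    """Alarm on min_epochs consecutive epochs below (1-drop_frac)*baseline."""
    low = [r < baseline_adr * (1 - drop_frac) for r in epoch_adrs]
    run = 0
    for flag in low:
        run = run + 1 if flag else 0
        if run >= min_epochs:
            return True
    return False

base = adr(8.0, 10.0)                       # baseline ADR = 0.8
dci_alarm(base, [0.7, 0.35, 0.3, 0.3])      # 3 low epochs in a row -> True
```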

  18. Artificial intelligence in a mission operations and satellite test environment

    NASA Technical Reports Server (NTRS)

    Busse, Carl

    1988-01-01

    A Generic Mission Operations System using Expert System technology to demonstrate the potential of Artificial Intelligence (AI) automated monitor and control functions in a Mission Operations and Satellite Test environment will be developed at the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL). Expert system techniques in a real time operation environment are being studied and applied to science and engineering data processing. Advanced decommutation schemes and intelligent display technology will be examined to develop imaginative improvements in rapid interpretation and distribution of information. The Generic Payload Operations Control Center (GPOCC) will demonstrate improved data handling accuracy, flexibility, and responsiveness in a complex mission environment. The ultimate goal is to automate repetitious mission operations, instrument, and satellite test functions by the applications of expert system technology and artificial intelligence resources and to enhance the level of man-machine sophistication.

  19. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
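The 'absolute' quantification underlying an MRM-with-SIS-peptide assay reduces to comparing the chromatographic peak area of the endogenous peptide with that of its co-eluting stable isotope-labeled standard of known amount. The single-point, equal-response form below is a deliberate simplification; real workflows such as the BAK-76 use multi-point calibration curves and dedicated software (Qualis-SIS):

```python
def quantify_peptide(endogenous_area, sis_area, sis_spike_fmol):
    """Isotope-dilution quantification: the endogenous amount equals the
    endogenous/SIS peak-area ratio times the known SIS spike amount,
    assuming equal MS response for the light and heavy peptide."""
    return (endogenous_area / sis_area) * sis_spike_fmol

# Illustrative numbers: endogenous peak 2.4e6 counts, SIS peak 1.2e6 counts,
# 100 fmol of SIS spiked into the digest -> 200 fmol endogenous peptide.
amount = quantify_peptide(2.4e6, 1.2e6, 100.0)  # 200.0 fmol
```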

  20. NASA Tech Briefs, August 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics covered include: Stable, Thermally Conductive Fillers for Bolted Joints; Connecting to Thermocouples with Fewer Lead Wires; Zipper Connectors for Flexible Electronic Circuits; Safety Interlock for Angularly Misdirected Power Tool; Modular, Parallel Pulse-Shaping Filter Architectures; High-Fidelity Piezoelectric Audio Device; Photovoltaic Power Station with Ultracapacitors for Storage; Time Analyzer for Time Synchronization and Monitor of the Deep Space Network; Program for Computing Albedo; Integrated Software for Analyzing Designs of Launch Vehicles; Abstract-Reasoning Software for Coordinating Multiple Agents; Software Searches for Better Spacecraft-Navigation Models; Software for Partly Automated Recognition of Targets; Antistatic Polycarbonate/Copper Oxide Composite; Better VPS Fabrication of Crucibles and Furnace Cartridges; Burn-Resistant, Strong Metal-Matrix Composites; Self-Deployable Spring-Strip Booms; Explosion Welding for Hermetic Containerization; Improved Process for Fabricating Carbon Nanotube Probes; Automated Serial Sectioning for 3D Reconstruction; and Parallel Subconvolution Filtering Architectures.

  1. Acoustic-sensor-based detection of damage in composite aircraft structures

    NASA Astrophysics Data System (ADS)

    Foote, Peter; Martin, Tony; Read, Ian

    2004-03-01

    Acoustic emission detection is a well-established method of locating and monitoring crack development in metal structures. The technique has been adapted to test facilities for non-destructive testing applications. Deployment as an operational or on-line automated damage detection technology in vehicles is posing greater challenges. A clear requirement of potential end-users of such systems is a level of automation capable of delivering low-level diagnosis information. The output from the system is in the form of "go", "no-go" indications of structural integrity or immediate maintenance actions. This level of automation requires significant data reduction and processing. This paper describes recent trials of acoustic emission detection technology for the diagnosis of damage in composite aerospace structures. The technology comprises low-profile detection sensors using piezoelectric wafers encapsulated in polymer film, and optical sensors. Sensors are bonded to the structure's surface and enable acoustic events from the loaded structure to be located by triangulation. Instrumentation has been developed to capture and parameterise the sensor data in a form suitable for low-bandwidth storage and transmission.

  2. Automated Visual Cognitive Tasks for Recording Neural Activity Using a Floor Projection Maze

    PubMed Central

    Kent, Brendon W.; Yang, Fang-Chi; Burwell, Rebecca D.

    2014-01-01

    Neuropsychological tasks used in primates to investigate mechanisms of learning and memory are typically visually guided cognitive tasks. We have developed visual cognitive tasks for rats using the Floor Projection Maze1,2 that are optimized for visual abilities of rats permitting stronger comparisons of experimental findings with other species. In order to investigate neural correlates of learning and memory, we have integrated electrophysiological recordings into fully automated cognitive tasks on the Floor Projection Maze1,2. Behavioral software interfaced with an animal tracking system allows monitoring of the animal's behavior with precise control of image presentation and reward contingencies for better trained animals. Integration with an in vivo electrophysiological recording system enables examination of behavioral correlates of neural activity at selected epochs of a given cognitive task. We describe protocols for a model system that combines automated visual presentation of information to rodents and intracranial reward with electrophysiological approaches. Our model system offers a sophisticated set of tools as a framework for other cognitive tasks to better isolate and identify specific mechanisms contributing to particular cognitive processes. PMID:24638057

  3. NASA Systems Autonomy Demonstration Program - A step toward Space Station automation

    NASA Technical Reports Server (NTRS)

    Starks, S. A.; Rundus, D.; Erickson, W. K.; Healey, K. J.

    1987-01-01

    This paper addresses a multiyear NASA program, the Systems Autonomy Demonstration Program (SADP), whose main objectives include the development, integration, and demonstration of automation technology in Space Station flight and ground support systems. The role of automation in the Space Station is reviewed, and the main players in SADP and their roles are described. The core research and technology being promoted by SADP are discussed, and a planned 1988 milestone demonstration of the automated monitoring, operation, and control of a complete mission operations subsystem is addressed.

  4. Integration of Sensors, Controllers and Instruments Using a Novel OPC Architecture

    PubMed Central

    2017-01-01

    The interconnection between sensors, controllers and instruments through a communication network plays a vital role in the performance and effectiveness of a control system. Since its inception in the 90s, the Object Linking and Embedding for Process Control (OPC) protocol has provided open connectivity for monitoring and automation systems. It has been widely used in several environments such as industrial facilities, building and energy automation, engineering education and many others. This paper presents a novel OPC-based architecture to implement automation systems devoted to R&D and educational activities. The proposal is a novel conceptual framework, structured into four functional layers where the diverse components are categorized aiming to foster the systematic design and implementation of automation systems involving OPC communication. Due to the benefits of OPC, the proposed architecture provides features like open connectivity, reliability, scalability, and flexibility. Furthermore, four successful experimental applications of such an architecture, developed at the University of Extremadura (UEX), are reported. These cases are a proof of concept of the ability of this architecture to support interoperability for different domains. Namely, the automation of energy systems like a smart microgrid and photobioreactor facilities, the implementation of a network-accessible industrial laboratory and the development of an educational hardware-in-the-loop platform are described. All cases include a Programmable Logic Controller (PLC) to automate and control the plant behavior, which exchanges operative data (measurements and signals) with a multiplicity of sensors, instruments and supervisory systems under the structure of the novel OPC architecture. Finally, the main conclusions and open research directions are highlighted. PMID:28654002

  5. Integration of Sensors, Controllers and Instruments Using a Novel OPC Architecture.

    PubMed

    González, Isaías; Calderón, Antonio José; Barragán, Antonio Javier; Andújar, José Manuel

    2017-06-27

    The interconnection between sensors, controllers and instruments through a communication network plays a vital role in the performance and effectiveness of a control system. Since its inception in the 90s, the Object Linking and Embedding for Process Control (OPC) protocol has provided open connectivity for monitoring and automation systems. It has been widely used in several environments such as industrial facilities, building and energy automation, engineering education and many others. This paper presents a novel OPC-based architecture to implement automation systems devoted to R&D and educational activities. The proposal is a novel conceptual framework, structured into four functional layers where the diverse components are categorized aiming to foster the systematic design and implementation of automation systems involving OPC communication. Due to the benefits of OPC, the proposed architecture provides features like open connectivity, reliability, scalability, and flexibility. Furthermore, four successful experimental applications of such an architecture, developed at the University of Extremadura (UEX), are reported. These cases are a proof of concept of the ability of this architecture to support interoperability for different domains. Namely, the automation of energy systems like a smart microgrid and photobioreactor facilities, the implementation of a network-accessible industrial laboratory and the development of an educational hardware-in-the-loop platform are described. All cases include a Programmable Logic Controller (PLC) to automate and control the plant behavior, which exchanges operative data (measurements and signals) with a multiplicity of sensors, instruments and supervisory systems under the structure of the novel OPC architecture. Finally, the main conclusions and open research directions are highlighted.

  6. Monitoring small-crack growth by the replication method

    NASA Technical Reports Server (NTRS)

    Swain, Mary H.

    1992-01-01

    The suitability of the acetate replication method for monitoring the growth of small cracks is discussed. Applications of this technique are shown for cracks growing at the notch root in semicircular-edge-notch specimens of a variety of aluminum alloys and one steel. The calculated crack growth rate versus Delta K relationship for small cracks was compared to that for large cracks obtained from middle-crack-tension specimens. The primary advantage of this technique is that it provides an opportunity, at the completion of the test, to go backward in time towards the crack initiation event and 'zoom in' on areas of interest on the specimen surface with a resolution of about 0.1 micron. The primary disadvantage is the inability to automate the process. Also, for some materials, the replication process may alter the crack-tip chemistry or plastic zone, thereby affecting crack growth rates.

  7. ACQUA: Automated Cyanobacterial Quantification Algorithm for toxic filamentous genera using spline curves, pattern recognition and machine learning.

    PubMed

    Gandola, Emanuele; Antonioli, Manuela; Traficante, Alessio; Franceschini, Simone; Scardi, Michele; Congestri, Roberta

    2016-05-01

    Toxigenic cyanobacteria are one of the main health risks associated with water resources worldwide, as their toxins can affect humans and fauna exposed via drinking water, aquaculture and recreation. Microscopy monitoring of cyanobacteria in water bodies and massive growth systems is a routine operation for cell abundance and growth estimation. Here we present ACQUA (Automated Cyanobacterial Quantification Algorithm), a new fully automated image analysis method designed for filamentous genera in bright-field microscopy. A pre-processing algorithm has been developed to highlight filaments of interest against background signals due to other phytoplankton and dust. A spline-fitting algorithm has been designed to recombine interrupted and crossing filaments in order to perform accurate morphometric analysis and to extract the surface pattern information of highlighted objects. In addition, 17 specific pattern indicators have been developed and used as input data for a machine-learning algorithm dedicated to discriminating among five widespread toxic or potentially toxic filamentous genera in freshwater: Aphanizomenon, Cylindrospermopsis, Dolichospermum, Limnothrix and Planktothrix. The method was validated using freshwater samples from three Italian volcanic lakes, comparing automated vs. manual results. ACQUA proved to be a fast and accurate tool to rapidly assess freshwater quality and to characterize cyanobacterial assemblages in aquatic environments. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Study on diesel vertical migration characteristics and mechanism in water-bearing sand stratum using an automated resistivity monitoring system.

    PubMed

    Pan, Yuying; Jia, Yonggang; Wang, Yuhua; Xia, Xin; Guo, Lei

    2018-02-01

    Oil spills frequently occur on both land and sea. Petroleum in the mobile phase will cause serious pollution in the sediment and can form a secondary pollution source. Therefore, it is very important to study the migration of petroleum in sediments, ideally with a rapid and simplified approach. The release of diesel was simulated using fine beach sand to construct a model aquifer, and dynamic monitoring was carried out using an automated monitoring system including a resistivity probe originally developed by our research group. The mobile-phase migration fronts were determined accurately using a wavelet analysis method combined with the resistivity curve method. Then, a relationship between resistivity and the joint oil-water content was established. The main conclusions were as follows. The seepage velocity of the diesel with high mobility was faster at the initial stage of infiltration, followed by a period when gravity seepage was dominant, and finally a redistribution period at the later stage, which was mainly an oil-water displacement process. The resistivity trends for diesel infiltration in different water-saturated soil layers varied with depth. The resistivity in the vadose zone fluctuated significantly, increasing initially and later decreasing. The resistivity change in the capillary zone was relatively small and constant in the initial stage; then, it increased and subsequently decreased. The resistivity in the saturated zone was basically unchanged with depth, and the value became slightly larger than the background value over time. Overall, for a large volume of mobile-phase diesel leakage, the arriving migration fronts can be detected by wavelet analysis combined with resistivity curves. The thickness of the oil slick in the capillary zone can be estimated from resistivity changes. The relationships between resistivity and both the moisture content and the joint oil-water saturation agree with linear models. The research results provide basic data and a new data processing method for the monitoring of contaminated sites following major oil spills using the resistivity method.

  9. Sensory-based expert monitoring and control

    NASA Astrophysics Data System (ADS)

    Yen, Gary G.

    1999-03-01

    Field operators use their eyes, ears, and nose to detect process behavior and to trigger corrective control actions. For instance, in daily practice, the experienced operator in sulfuric acid treatment of phosphate rock may observe froth color or bubble character to control process material in-flow. Or, similarly, (s)he may use the acoustic sound of cavitation or boiling/flashing to increase or decrease material flow rates in tank levels. By contrast, process control computers continue to be limited to taking action on P, T, F, and A signals. Yet, there is sufficient evidence from the field that visual and acoustic information can be used for control and identification. Smart in-situ sensors have provided a potential mechanism for factory automation with promising industrial applicability. In response to these critical needs, a generic, structured health monitoring approach is proposed. The system assumes a given sensor suite will act as an on-line health usage monitor and at best provide real-time control autonomy. The sensor suite can incorporate various types of sensory devices, from vibration accelerometers, directional microphones, machine vision CCDs, and pressure gauges to temperature indicators. The decision can be shown on a visual on-board display or fed to the control block to invoke controller reconfiguration.

  10. Automatic Adviser on stationary devices status identification and anticipated change

    NASA Astrophysics Data System (ADS)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    A task is defined to synthesize an Automatic Adviser to identify the automation systems stationary devices status using an autoregressive model of changing their key parameters. An applied model type was rationalized and the research objects monitoring process algorithm was developed. A complex of mobile objects status operation simulation and prediction results analysis was proposed. Research results are commented using a specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.

  11. A Bioluminometric Method of DNA Sequencing

    NASA Technical Reports Server (NTRS)

    Ronaghi, Mostafa; Pourmand, Nader; Stolc, Viktor; Arnold, Jim (Technical Monitor)

    2001-01-01

    Pyrosequencing is a bioluminometric single-tube DNA sequencing method that takes advantage of co-operativity between four enzymes to monitor DNA synthesis. In this sequencing-by-synthesis method, a cascade of enzymatic reactions yields detectable light, which is proportional to incorporated nucleotides. Pyrosequencing has the advantages of accuracy, flexibility and parallel processing. It can be easily automated. Furthermore, the technique dispenses with the need for labeled primers, labeled nucleotides and gel-electrophoresis. In this chapter, the use of this technique for different applications is discussed.

  12. Towards an intelligent hospital environment: OR of the future.

    PubMed

    Sutherland, Jeffrey V; van den Heuvel, Willem-Jan; Ganous, Tim; Burton, Matthew M; Kumar, Animesh

    2005-01-01

    Patients, providers, payers, and government demand more effective and efficient healthcare services, and the healthcare industry needs innovative ways to re-invent core processes. Business process reengineering (BPR) showed that adopting new hospital information systems can leverage this transformation and that workflow management technologies can automate process management. Our research indicates workflow technologies in healthcare require real-time patient monitoring, detection of adverse events, and adaptive responses to breakdowns in normal processes. Adaptive workflow systems are rarely implemented, making current workflow implementations inappropriate for healthcare. The advent of evidence-based medicine, guideline-based practice, and better understanding of cognitive workflow, combined with novel technologies including Radio Frequency Identification (RFID), mobile/wireless technologies, internet workflow, intelligent agents, and Service Oriented Architectures (SOA), opens up new and exciting ways of automating business processes. Total situational awareness of events, timing, and location of healthcare activities can generate self-organizing change in the behaviors of humans and machines. A test bed of a novel approach towards continuous process management was designed for the new Weinburg Surgery Building at the University of Maryland Medical. Early results based on clinical process mapping and analysis of patient flow bottlenecks demonstrated 100% improvement in delivery of supplies and instruments at surgery start time. This work has been directly applied to the design of the DARPA Trauma Pod research program, where robotic surgery will be performed on wounded soldiers on the battlefield.

  13. Oscillometric Blood Pressure Estimation: Past, Present, and Future.

    PubMed

    Forouzanfar, Mohamad; Dajani, Hilmi R; Groza, Voicu Z; Bolic, Miodrag; Rajan, Sreeraman; Batkin, Izmail

    2015-01-01

    The use of automated blood pressure (BP) monitoring is growing as it does not require much expertise and can be performed by patients several times a day at home. Oscillometry is one of the most common measurement methods used in automated BP monitors. A review of the literature shows that a large variety of oscillometric algorithms have been developed for accurate estimation of BP but these algorithms are scattered in many different publications or patents. Moreover, considering that oscillometric devices dominate the home BP monitoring market, little effort has been made to survey the underlying algorithms that are used to estimate BP. In this review, a comprehensive survey of the existing oscillometric BP estimation algorithms is presented. The survey covers a broad spectrum of algorithms including the conventional maximum amplitude and derivative oscillometry as well as the recently proposed learning algorithms, model-based algorithms, and algorithms that are based on analysis of pulse morphology and pulse transit time. The aim is to classify the diverse underlying algorithms, describe each algorithm briefly, and discuss their advantages and disadvantages. This paper will also review the artifact removal techniques in oscillometry and the current standards for the automated BP monitors.
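The conventional maximum amplitude algorithm surveyed above can be sketched as follows: MAP is taken at the cuff pressure where the oscillation envelope peaks, and systolic/diastolic pressures are read where the envelope crosses fixed fractions of that maximum on either side. The characteristic ratios below (0.55 and 0.75) are commonly cited illustrative values; commercial devices use proprietary, often adaptive, ratios:

```python
import numpy as np

def maximum_amplitude_bp(cuff_pressure, osc_amplitude, ks=0.55, kd=0.75):
    """Maximum-amplitude oscillometry on a deflating cuff.

    cuff_pressure: monotonically decreasing cuff pressures (mmHg)
    osc_amplitude: oscillation envelope amplitude at each pressure
    Returns (systolic, MAP, diastolic) in mmHg.
    """
    p = np.asarray(cuff_pressure, float)
    a = np.asarray(osc_amplitude, float)
    i_map = int(np.argmax(a))
    map_, a_max = p[i_map], a[i_map]
    # Systolic: on the high-pressure (rising-envelope) side, find the
    # pressure where the envelope reaches ks * a_max.
    sbp = np.interp(ks * a_max, a[:i_map + 1], p[:i_map + 1])
    # Diastolic: on the low-pressure (falling-envelope) side, reverse the
    # arrays so the envelope is increasing for np.interp.
    dbp = np.interp(kd * a_max, a[i_map:][::-1], p[i_map:][::-1])
    return sbp, map_, dbp

# Synthetic Gaussian envelope centered at a true MAP of 95 mmHg.
p = np.arange(160, 39, -1.0)
env = np.exp(-((p - 95.0) / 25.0) ** 2)
sbp, map_, dbp = maximum_amplitude_bp(p, env)  # roughly 114 / 95 / 82 mmHg
```

The derivative oscillometry, model-based, and learning approaches covered by the review replace these fixed ratios with envelope-shape or regression-based estimators.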

  14. Automated Internet-Based Control of Spacecraft Groundstations: Beacon-Based Health Monitoring Concept

    NASA Technical Reports Server (NTRS)

    Cantwell, Brian; Twiggs, Robert; Swartwout, Michael

    1997-01-01

    This report serves as an update about the activities of Stanford University's Space Systems Development Laboratory (SSDL) in their beacon-based health monitoring experiment. Section 1 describes the goals of the project and the organization of the team. Section 2 provides an overview of the major components of the system, describing the general approach of automated health monitoring and the beacon signal relay. It also provides background about the SAPPHIRE spacecraft and ASSET operations system, which will be used for the experiment. Specific details about implementation and status of each element of the experiment are found in Section 3. Section 4 describes the experiment and future work, and references are contained in Section 5.

  15. Automation of Physiologic Data Presentation and Alarms in the Post Anesthesia Care Unit

    PubMed Central

    Aukburg, S.J.; Ketikidis, P.H.; Kitz, D.S.; Mavrides, T.G.; Matschinsky, B.B.

    1989-01-01

    The routine use of pulse oximeters, non-invasive blood pressure monitors and electrocardiogram monitors has considerably improved patient care in the post anesthesia period. Using an automated data collection system, we investigated the occurrence of several adverse events frequently revealed by these monitors. We found that the incidence of hypoxia was 35%, hypertension 12%, hypotension 8%, tachycardia 25% and bradycardia 1%. Discriminant analysis was able to correctly predict classification of about 90% of patients into normal vs. hypertensive or hypotensive groups. The system software minimizes artifact, validates data for epidemiologic studies, and is able to identify variables that predict adverse events through application of appropriate statistical and artificial intelligence techniques.
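Automated adverse-event detection of the kind reported here amounts to threshold checks over each patient's monitored vitals, followed by incidence counts across patients. The thresholds below are hypothetical illustrations; the paper does not state its exact criteria:

```python
import numpy as np

# Hypothetical event criteria, for illustration only.
THRESHOLDS = {
    "hypoxia":      lambda v: v["spo2"].min() < 90,     # SpO2 (%)
    "hypertension": lambda v: v["sys_bp"].max() > 180,  # systolic BP (mmHg)
    "hypotension":  lambda v: v["sys_bp"].min() < 90,
    "tachycardia":  lambda v: v["hr"].max() > 120,      # heart rate (bpm)
    "bradycardia":  lambda v: v["hr"].min() < 50,
}

def incidence(patients):
    """Fraction of patients with at least one occurrence of each event."""
    n = len(patients)
    return {name: sum(bool(rule(p)) for p in patients) / n
            for name, rule in THRESHOLDS.items()}

# Two synthetic patients: one hypoxic episode, one tachycardic episode.
patients = [
    {"spo2": np.array([97, 88, 95]), "sys_bp": np.array([120, 130]),
     "hr": np.array([80, 90])},
    {"spo2": np.array([98, 97]), "sys_bp": np.array([125, 135]),
     "hr": np.array([128, 95])},
]
rates = incidence(patients)  # hypoxia and tachycardia at 0.5 each
```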

  16. Advanced Biotelemetry Systems for Space Life Sciences: PH Telemetry

    NASA Technical Reports Server (NTRS)

    Hines, John W.; Somps, Chris; Ricks, Robert; Kim, Lynn; Connolly, John P. (Technical Monitor)

    1995-01-01

    The SENSORS 2000! (S2K!) program at NASA's Ames Research Center is currently developing a biotelemetry system for monitoring pH and temperature in unrestrained subjects. This activity is part of a broader-scope effort to provide an Advanced Biotelemetry System (ABTS) for use in future space life sciences research. Many anticipated research endeavors will require biomedical and biochemical sensors and related instrumentation to make continuous inflight measurements in a variable-gravity environment. Since crew time is limited, automated data acquisition, data processing, data storage, and subject health monitoring are required. An automated biochemical and physiological data acquisition system based on noninvasive or implantable biotelemetry technology will meet these requirements. The ABTS will ultimately acquire a variety of physiological measurands including temperature, biopotentials (e.g. ECG, EEG, EMG, EOG), blood pressure, flow and dimensions, as well as chemical and biological parameters including pH. Development activities are planned in evolutionary, leveraged steps. Near-term activities include 1) development of a dual-channel pH/temperature telemetry system, and 2) development of a low-bandwidth, 4-channel telemetry system that measures temperature, heart rate, pressure, and pH. This abstract describes the pH/temperature telemeter.

  17. Automated general temperature correction method for dielectric soil moisture sensors

    NASA Astrophysics Data System (ADS)

    Kapilaratne, R. G. C. Jeewantinie; Lu, Minjiao

    2017-08-01

    An effective temperature correction method for dielectric sensors is important to ensure the accuracy of soil water content (SWC) measurements in local- to regional-scale soil moisture monitoring networks. These networks make extensive use of highly temperature-sensitive dielectric sensors due to their low cost, ease of use and low power consumption. Yet there is no general temperature correction method for dielectric sensors; instead, sensor- or site-dependent correction algorithms are employed. Such methods become ineffective at soil moisture monitoring networks with different sensor setups and those that cover diverse climatic conditions and soil types. This study attempted to develop a general temperature correction method for dielectric sensors which can be used regardless of differences in sensor type, climatic conditions and soil type, without rainfall data. In this work, an automated general temperature correction method was developed by adapting previously developed temperature correction algorithms based on time domain reflectometry (TDR) measurements to ThetaProbe ML2X, Stevens Hydra Probe II and Decagon Devices EC-TM sensor measurements. The procedure for removing rainy-day effects from SWC data was automated by incorporating a statistical inference technique into the temperature correction algorithms. The temperature correction method was evaluated using 34 stations from the International Soil Moisture Monitoring Network and another nine stations from a local soil moisture monitoring network in Mongolia. The soil moisture monitoring networks used in this study cover four major climates and six major soil types. Results indicated that the automated temperature correction algorithms developed in this study can eliminate temperature effects from dielectric sensor measurements successfully, even without on-site rainfall data. Furthermore, it was found that the actual daily average SWC is altered by the temperature effects of dielectric sensors, with an error comparable to the manufacturer's stated ±1% accuracy.
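The core of such a correction can be sketched as linear de-trending of the raw reading against soil temperature. The linear sensitivity model, the rain-free constant-SWC assumption, and all numeric values below are illustrative simplifications, not the paper's algorithm:

```python
import numpy as np

def correct_swc(raw_swc, soil_temp, ref_temp=25.0, sensitivity=None):
    """Remove a linear temperature effect from dielectric-sensor SWC readings.

    If sensitivity (d SWC / d degC) is not given, estimate it by regressing
    raw SWC on temperature over a rain-free window where the true SWC is
    assumed constant.
    """
    raw_swc = np.asarray(raw_swc, float)
    soil_temp = np.asarray(soil_temp, float)
    if sensitivity is None:
        sensitivity = np.polyfit(soil_temp, raw_swc, 1)[0]  # slope term
    return raw_swc - sensitivity * (soil_temp - ref_temp)

# Synthetic rain-free day: true SWC constant at 0.20, while the diurnal
# temperature swing injects a spurious oscillation into the raw reading.
hours = np.arange(24)
temp = 20 + 10 * np.sin(2 * np.pi * hours / 24)
raw = 0.20 + 0.001 * (temp - 25)   # purely linear temperature artifact
corrected = correct_swc(raw, temp)  # flat at 0.20 after correction
```

The study's automation lies largely upstream of this step: statistically identifying the rain-free windows over which such a fit is valid without rainfall data.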

  18. Seismic array processing and computational infrastructure for improved monitoring of Alaskan and Aleutian seismicity and volcanoes

    NASA Astrophysics Data System (ADS)

    Lindquist, Kent Gordon

    We constructed a near-real-time system, called Iceworm, to automate seismic data collection, processing, storage, and distribution at the Alaska Earthquake Information Center (AEIC). Phase-picking, phase association, and interprocess communication components come from Earthworm (U.S. Geological Survey). A new generic, internal format for digital data supports unified handling of data from diverse sources. A new infrastructure for applying processing algorithms to near-real-time data streams supports automated information extraction from seismic wavefields. Integration of Datascope (U. of Colorado) provides relational database management of all automated measurements, parametric information for located hypocenters, and waveform data from Iceworm. Data from 1997 yield 329 earthquakes located by both Iceworm and the AEIC. Of these, 203 have location residuals under 22 km, sufficient for hazard response. Regionalized inversions for local magnitude in Alaska yield M_L calibration curves (log A_0) that differ from the Californian Richter magnitude. The new curve is 0.2 M_L units more attenuative than the Californian curve at 400 km for earthquakes north of the Denali fault. South of the fault, and for a region north of Cook Inlet, the difference is 0.4 M_L. A curve for deep events differs by 0.6 M_L at 650 km. We expand geographic coverage of Alaskan regional seismic monitoring to the Aleutians, the Bering Sea, and the entire Arctic by initiating the processing of four short-period Alaskan seismic arrays. To show the array stations' sensitivity, we detect and locate two microearthquakes that were missed by the AEIC. An empirical study of the location sensitivity of the arrays predicts improvements over the Alaskan regional network that are shown as map-view contour plots. We verify these predictions by detecting an M_L 3.2 event near Unimak Island with one array. The detection and location of four representative earthquakes illustrate the expansion of geographic coverage from array processing. Measurements at the arrays of systematic azimuth residuals, between 5° and 50°, from 203 Aleutian events reveal significant effects of heterogeneous structure on wavefields. Finally, algorithms to automatically detect earthquakes in continuous array data are demonstrated with the detection of an Aleutian earthquake.
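The regionalized calibration enters the magnitude formula as the distance term in M_L = log10(A) - log10(A0(r)), where -log10(A0) is the attenuation curve being inverted for. A minimal sketch, using a hypothetical -log10(A0) value at 400 km purely to illustrate the 0.2-unit offset north of the Denali fault:

```python
import math

def local_magnitude(amplitude_mm, neg_log_a0):
    """Richter-style local magnitude: M_L = log10(A) + (-log10(A0(r))),
    where -log10(A0) is the distance-dependent calibration term."""
    return math.log10(amplitude_mm) + neg_log_a0

# Hypothetical -log10(A0) at 400 km for the Californian curve; the Alaskan
# curve north of the Denali fault is 0.2 units more attenuative there.
neg_log_a0_california = 4.5
neg_log_a0_alaska = neg_log_a0_california + 0.2

# The same 1 mm amplitude at 400 km yields an M_L larger by exactly 0.2.
ml_ca = local_magnitude(1.0, neg_log_a0_california)
ml_ak = local_magnitude(1.0, neg_log_a0_alaska)
offset = ml_ak - ml_ca  # 0.2
```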

  19. Machine-vision-based roadway health monitoring and assessment : development of a shape-based pavement-crack-detection approach.

    DOT National Transportation Integrated Search

    2016-01-01

    State highway agencies (SHAs) routinely employ semi-automated and automated image-based methods for network-level : pavement-cracking data collection, and there are different types of pavement-cracking data collected by SHAs for reporting and : manag...

  20. 40 CFR 63.7740 - What are my monitoring requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Iron and Steel Foundries Continuous...) Inspect fans for wear, material buildup, and corrosion through quarterly visual inspections, vibration... more automated conveyor and pallet cooling lines and automated shakeout lines at a new iron and steel...
