Sample records for process monitoring tool

  1. Process tool monitoring and matching using interferometry technique

    NASA Astrophysics Data System (ADS)

    Anberg, Doug; Owen, David M.; Mileham, Jeffrey; Lee, Byoung-Ho; Bouche, Eric

    2016-03-01

The semiconductor industry makes dramatic device technology changes over short time periods. As the industry advances toward the 10 nm device node, precise management and control of processing tools has become a significant manufacturing challenge. Some processes require multiple tool sets, and some tools have multiple chambers for mass production. Tool and chamber matching has therefore become a critical consideration for meeting today's manufacturing requirements. Additionally, process tool and chamber conditions must be monitored to ensure uniform process performance across the tool and chamber fleet. There are many parameters for managing and monitoring tools and chambers. Particle defect monitoring is a well-known, established example, where defect inspection tools directly detect particles on the wafer surface. However, leading-edge processes are driving the need to also monitor invisible defects (e.g., stress, contamination), because some device failures cannot be directly correlated with traditional defect maps or other known sources. Some failure maps show the same signatures as stress or contamination maps, which implies a correlation with device performance or yield. In this paper we present process tool monitoring and matching using an interferometry technique. Of the many interferometry techniques used for process monitoring applications, we use a Coherent Gradient Sensing (CGS) interferometer, which is self-referencing and enables high-throughput measurements. Using this technique, we can quickly measure the topography of an entire wafer surface and obtain stress and displacement data from the topography measurement. For improved tool and chamber matching and reduced device failure, wafer stress measurements can be implemented as a regular tool or chamber monitoring test on either unpatterned or patterned wafers, providing a good criterion for improved process stability.
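    As an illustration of how stress can be derived from wafer topography, the sketch below fits a curvature to a synthetic line scan and applies Stoney's equation. All material constants, thicknesses, and the wafer bow are hypothetical example values, not parameters from the paper, and the CGS instrument's actual data reduction is not described in the abstract.

```python
import numpy as np

def stoney_stress(x, z, E_s=130e9, nu_s=0.28, h_s=775e-6, h_f=1e-6):
    """Fit a parabola to a topography line scan z(x) and convert the
    resulting curvature kappa to film stress via Stoney's equation:
        sigma = E_s * h_s**2 * kappa / (6 * (1 - nu_s) * h_f)
    Substrate modulus/thickness and film thickness are example values."""
    a, _, _ = np.polyfit(x, z, 2)    # z ~ a*x^2 + b*x + c
    kappa = 2.0 * a                  # curvature of the fitted parabola
    return E_s * h_s**2 * kappa / (6.0 * (1.0 - nu_s) * h_f)

# Synthetic bowed wafer: 300 mm scan with ~10 um of bow at the edge.
x = np.linspace(-0.15, 0.15, 501)
kappa_true = 2 * 10e-6 / 0.15**2
z = 0.5 * kappa_true * x**2
sigma = stoney_stress(x, z)          # tensile film stress, Pa
```

    On these example numbers the recovered stress is on the order of tens of MPa, the magnitude regime where whole-wafer stress monitoring is typically applied.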

  2. DAMT - DISTRIBUTED APPLICATION MONITOR TOOL (HP9000 VERSION)

    NASA Technical Reports Server (NTRS)

    Keith, B.

    1994-01-01

Typical network monitors measure status of host computers and data traffic among hosts. A monitor to collect statistics about individual processes must be unobtrusive and possess the ability to locate and monitor processes, locate and monitor circuits between processes, and report traffic back to the user through a single application program interface (API). DAMT, Distributed Application Monitor Tool, is a distributed application program that will collect network statistics and make them available to the user. This distributed application has one component (i.e., process) on each host the user wishes to monitor as well as a set of components at a centralized location. DAMT provides the first known implementation of a network monitor at the application layer of abstraction. Potential users only need to know the process names of the distributed application they wish to monitor. The tool locates the processes and the circuit between them, and reports any traffic between them at a user-defined rate. The tool operates without the cooperation of the processes it monitors. Application processes require no changes to be monitored by this tool. Neither does DAMT require the UNIX kernel to be recompiled. The tool obtains process and circuit information by accessing the operating system's existing process database. This database contains all information available about currently executing processes. Expanding the information monitored by the tool can be done by utilizing more information from the process database. Traffic on a circuit between processes is monitored by a low-level LAN analyzer that has access to the raw network data. The tool also provides features such as dynamic event reporting and virtual path routing. A reusable object approach was used in the design of DAMT. The tool has four main components: the Virtual Path Switcher, the Central Monitor Complex, the Remote Monitor, and the LAN Analyzer.
All of DAMT's components are independent, asynchronously executing processes. The independent processes communicate with each other via UNIX sockets through a Virtual Path router, or Switcher. The Switcher maintains a routing table showing the host of each component process of the tool, eliminating the need for each process to do so. The Central Monitor Complex provides the single application program interface (API) to the user and coordinates the activities of DAMT. The Central Monitor Complex is itself divided into independent objects that perform its functions. The component objects are the Central Monitor, the Process Locator, the Circuit Locator, and the Traffic Reporter. Each of these objects is an independent, asynchronously executing process. User requests to the tool are interpreted by the Central Monitor. The Process Locator identifies whether a named process is running on a monitored host, and which host that is. The circuit between any two processes in the distributed application is identified using the Circuit Locator. The Traffic Reporter handles communication with the LAN Analyzer and accumulates traffic updates until it must send a traffic report to the user. The Remote Monitor process is replicated on each monitored host. It supplies the Central Monitor Complex processes with application process information by providing access to operating system information about currently executing processes. It allows the Process Locator to find processes and the Circuit Locator to identify circuits between processes. It also provides lifetime information about currently monitored processes. The LAN Analyzer consists of two processes. Low-level monitoring is handled by the Sniffer, which analyzes the raw data on a single, physical LAN. It responds to commands from the Analyzer process, which maintains the interface to the Traffic Reporter and keeps track of which circuits to monitor.
DAMT is written in C-language for HP-9000 series computers running HP-UX and Sun 3 and 4 series computers running SunOS. DAMT requires 1Mb of disk space and 4Mb of RAM for execution. This package requires MIT's X Window System, Version 11 Revision 4, with OSF/Motif 1.1. The HP-9000 version (GSC-13589) includes sample HP-9000/375 and HP-9000/730 executables which were compiled under HP-UX, and the Sun version (GSC-13559) includes sample Sun3 and Sun4 executables compiled under SunOS. The standard distribution medium for the HP version of DAMT is a .25 inch HP pre-formatted streaming magnetic tape cartridge in UNIX tar format. It is also available on a 4mm magnetic tape in UNIX tar format. The standard distribution medium for the Sun version of DAMT is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. DAMT was developed in 1992.

  3. On-line Monitoring for Cutting Tool Wear Condition Based on the Parameters

    NASA Astrophysics Data System (ADS)

    Han, Fenghua; Xie, Feng

    2017-07-01

During cutting, it is very important to monitor the working state of the tool. Based on acceleration signals acquired at constant speed, time-domain and frequency-domain analysis of relevant indicators enables online monitoring of the tool wear condition. The analysis results show that the method can effectively judge the tool wear condition during machining and has practical application value.
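    A minimal sketch of the kind of time-domain and frequency-domain indicators such an approach computes from an acceleration signal sampled at a constant rate. The signal, sampling rate, and choice of indicators below are illustrative assumptions, not the authors' actual feature set.

```python
import numpy as np

fs = 10_000                                    # sampling rate, Hz (example)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic acceleration: a 180 Hz tooth-passing component plus noise.
accel = np.sin(2 * np.pi * 180 * t) + 0.3 * rng.standard_normal(t.size)

# Time-domain indicators: RMS level and kurtosis of the signal.
rms = np.sqrt(np.mean(accel ** 2))
kurtosis = np.mean((accel - accel.mean()) ** 4) / np.var(accel) ** 2

# Frequency-domain indicator: dominant frequency from the FFT magnitude.
spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(accel.size, d=1 / fs)
dominant_freq = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
```

    Trends in indicators like these (rising RMS, shifting dominant frequency) are what a wear-state decision would be based on.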

  4. Condition monitoring of turning process using infrared thermography technique - An experimental approach

    NASA Astrophysics Data System (ADS)

    Prasad, Balla Srinivasa; Prabha, K. Aruna; Kumar, P. V. S. Ganesh

    2017-03-01

In metal cutting, the major factors that affect cutting tool life are machine tool vibrations, tool tip/chip temperature and surface roughness, along with machining parameters such as cutting speed, feed rate, depth of cut and tool geometry, so it is important for the manufacturing industry to find suitable levels of process parameters for maintaining tool life. Heat generation in cutting has always been a central topic in machining research. Recent advances in signal processing and information technology have led to the use of multiple sensors for the development of effective tool condition monitoring systems with improved accuracy. From a process improvement point of view, it is definitely more advantageous to proactively monitor quality directly in the process instead of the product, so that the consequences of a defective part can be minimized or even eliminated. In the present work, a real-time process monitoring method using multiple sensors is explored. It focuses on the development of a test bed for monitoring tool condition in turning of AISI 316L steel using both coated and uncoated carbide inserts. The proposed tool condition monitoring (TCM) approach is evaluated in high speed turning using multiple sensors, namely a laser Doppler vibrometer and infrared thermography. The results indicate the feasibility of using the dominant frequency of the vibration signals, together with the temperature gradient, for monitoring high speed turning operations. A possible correlation is identified for both regular and irregular cutting tool wear. Cutting speed and feed rate proved to be influential parameters on the measured temperatures, while depth of cut was less influential. In general, lower heat and temperatures are generated when coated inserts are employed, and cutting temperatures increase gradually as edge wear and deformation develop.

  5. Multi-category micro-milling tool wear monitoring with continuous hidden Markov models

    NASA Astrophysics Data System (ADS)

    Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon

    2009-02-01

In-process monitoring of tool conditions is important in micro-machining due to the high precision requirement and high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous hidden Markov models (HMMs) are adapted for modeling the tool wear process in micro-milling and estimating the tool wear state from cutting force features. For noise robustness, the HMM outputs are passed through a median filter to suppress spurious tool-state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
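    The noise-suppression step the abstract describes, smoothing a noisy sequence of discrete wear-state estimates with a sliding median filter so isolated misclassifications do not trigger a spurious state change, can be sketched as below. The state sequence, window length, and helper name are illustrative, not taken from the paper.

```python
import numpy as np

def median_smooth(states, window=5):
    """Return the median-filtered discrete state sequence (odd window)."""
    half = window // 2
    padded = np.pad(states, half, mode="edge")   # repeat edge values
    return np.array([int(np.median(padded[i:i + window]))
                     for i in range(len(states))])

# Synthetic wear progression 0 -> 1 -> 2 with two isolated noisy estimates
# (the stray 2 at index 3 and the stray 0 at index 8).
raw = np.array([0, 0, 0, 2, 0, 1, 1, 1, 0, 1, 2, 2, 2, 2])
smoothed = median_smooth(raw)
```

    After filtering, the isolated outliers are absorbed into their neighbourhoods and the sequence steps monotonically through the wear states.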

  6. Tool path strategy and cutting process monitoring in intelligent machining

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Wang, Chengdong; An, Qinglong; Ming, Weiwei

    2018-06-01

Intelligent machining, characterized by high accuracy and efficiency, is a current focus in advanced manufacturing technology. One of its central technologies, online monitoring and optimization of the cutting process, is urgently needed for mass production. In this research, online monitoring and optimization of the cutting process in jet engine impeller machining, cranio-maxillofacial surgery, and hydraulic servo valve deburring are introduced as examples of intelligent machining. Results show that intelligent tool path optimization and online monitoring of the cutting process are efficient techniques for improving the efficiency, quality, and reliability of machining.

  7. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and have been thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, allowing a desired process state to be maintained and guaranteeing the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  8. On-line tool breakage monitoring of vibration tapping using spindle motor current

    NASA Astrophysics Data System (ADS)

    Li, Guangjun; Lu, Huimin; Liu, Gang

    2008-10-01

The input current of the driving motor has been successfully employed for monitoring the cutting state in manufacturing processes for more than a decade. In vibration tapping, however, on-line monitoring of the motor current has not previously been reported. In this paper, a tap failure prediction method is proposed to monitor the vibration tapping process using the current signal of the spindle motor. The vibration tapping process is first described. The relationship between the tapping torque and the motor current is then investigated by theoretical derivation and experimental measurement. Based on these results, a tool breakage monitoring method is proposed that tracks the ratio of the current amplitudes in adjacent vibration tapping periods. Finally, a low frequency vibration tapping system with motor current monitoring is built using a B-106B servo motor and its CR06 driver. The proposed method is demonstrated with experimental data from vibration tapping in titanium alloys. The experiments show that the method is feasible for tool breakage monitoring when tapping small thread holes: with an amplitude ratio threshold of 1.2 and at least two overruns among 50 adjacent periods, tool breakage can be avoided with few false alarms.
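    The decision rule quoted at the end of the abstract (amplitude-ratio threshold 1.2, at least two overruns among 50 adjacent periods) can be sketched as follows. The amplitude sequences are synthetic and the function is an illustrative reading of the rule, not the authors' implementation.

```python
import numpy as np

def breakage_alarm(amplitudes, threshold=1.2, window=50, min_overruns=2):
    """True if >= min_overruns adjacent-period current-amplitude ratios
    exceed `threshold` within any `window` consecutive periods."""
    amplitudes = np.asarray(amplitudes, dtype=float)
    ratios = amplitudes[1:] / amplitudes[:-1]   # adjacent-period ratios
    overruns = ratios > threshold
    for start in range(max(1, len(overruns) - window + 1)):
        if overruns[start:start + window].sum() >= min_overruns:
            return True
    return False

# Normal tapping: amplitudes fluctuate mildly -> no alarm.
normal = 1.0 + 0.05 * np.sin(np.linspace(0, 10, 60))
# Incipient breakage: two sharp amplitude jumps close together -> alarm.
broken = normal.copy()
broken[30] *= 1.5
broken[33] *= 1.5
```

    Requiring multiple overruns within a window, rather than a single threshold crossing, is what keeps the false-alarm rate low.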

  9. Performance Monitoring Of A Computer Numerically Controlled (CNC) Lathe Using Pattern Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Daneshmend, L. K.; Pak, H. A.

    1984-02-01

On-line monitoring of the cutting process on a CNC lathe is desirable to ensure unattended, fault-free operation in an automated environment. The state of the cutting tool is one of the most important parameters characterising the cutting process. Direct monitoring of the cutting tool or workpiece is not feasible during machining. However, several variables related to the state of the tool can be measured on-line. A novel monitoring technique is presented which uses cutting torque as the variable for on-line monitoring. A classifier is designed on the basis of the empirical relationship between cutting torque and flank wear. The empirical model required by the on-line classifier is established during an automated training cycle using machine vision for off-line direct inspection of the tool.

  10. Approach to in-process tool wear monitoring in drilling: Application of Kalman filter theory

    NASA Astrophysics Data System (ADS)

    He, Ning; Zhang, Youzhen; Pan, Liangxian

    1993-05-01

Tool wear and wear rate, two parameters often used in adaptive control, are important factors affecting machinability. In this paper, modern control theory is applied to the in-process tool wear monitoring problem: Kalman filter theory is used to monitor drill wear quantitatively. Based on experimental results, a dynamic model, a measurement model and a measurement conversion model suitable for Kalman filtering are established. It is proved that the monitoring system is completely observable but not completely controllable. A discriminant for selecting the characteristic parameters is put forward, and by this discriminant the thrust force Fz is selected as the characteristic parameter for monitoring tool wear. An in-process Kalman filter drill wear monitoring system composed of a force sensor and a microcomputer is established. The results obtained by the Kalman filter and by a common indirect measuring method are compared with the real drill wear measured by microphotography. The comparison shows that the Kalman filter has high measurement precision and satisfies the real-time requirement.
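    A compact scalar Kalman filter in the spirit of the abstract: drill wear is the hidden state, thrust force Fz the measurement, with a simple linear measurement model Fz = c * wear + noise. The model constants, noise levels, and the random-walk dynamic are hypothetical stand-ins, not the paper's identified models.

```python
import numpy as np

def kalman_wear(z, c=200.0, q=1e-5, r=25.0, w0=0.0, p0=1.0):
    """Estimate wear from thrust-force measurements z.
    State: wear w (random walk, process noise q); measurement z = c*w + v,
    v ~ N(0, r). Returns the filtered wear estimates."""
    w, p, estimates = w0, p0, []
    for zk in z:
        p = p + q                       # predict: wear drifts slowly
        k = p * c / (c * c * p + r)     # Kalman gain
        w = w + k * (zk - c * w)        # update with the innovation
        p = (1.0 - k * c) * p
        estimates.append(w)
    return np.array(estimates)

rng = np.random.default_rng(1)
true_wear = np.linspace(0.0, 0.3, 200)           # mm, grows with drilling
fz = 200.0 * true_wear + rng.normal(0, 5, 200)   # noisy thrust force, N
est = kalman_wear(fz)
```

    The filtered estimate tracks the wear ramp while averaging out the measurement noise, which is the quantitative-monitoring property the abstract claims.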

  11. FT-NIR: A Tool for Process Monitoring and More.

    PubMed

    Martoccia, Domenico; Lutz, Holger; Cohen, Yvan; Jerphagnon, Thomas; Jenelten, Urban

    2018-03-30

    With ever-increasing pressure to optimize product quality, to reduce cost and to safely increase production output from existing assets, all combined with regular changes in terms of feedstock and operational targets, process monitoring with traditional instruments reaches its limits. One promising answer to these challenges is in-line, real time process analysis with spectroscopic instruments, and above all Fourier-Transform Near Infrared spectroscopy (FT-NIR). Its potential to afford decreased batch cycle times, higher yields, reduced rework and minimized batch variance is presented and application examples in the field of fine chemicals are given. We demonstrate that FT-NIR can be an efficient tool for improved process monitoring and optimization, effective process design and advanced process control.

  12. Audio signal analysis for tool wear monitoring in sheet metal stamping

    NASA Astrophysics Data System (ADS)

    Ubhayaratne, Indivarie; Pereira, Michael P.; Xiang, Yong; Rolfe, Bernard F.

    2017-02-01

Stamping tool wear can significantly degrade product quality, and hence, online tool condition monitoring is a timely need in many manufacturing industries. Even though a large amount of research has been conducted employing different sensor signals, there is still an unmet demand for a low-cost, easy-to-set-up condition monitoring system. Audio signal analysis is a simple method that has the potential to meet this demand, but has not been previously used for stamping process monitoring. Hence, this paper studies the existence and the significance of the correlation between emitted sound signals and the wear state of sheet metal stamping tools. The corrupting sources generated by the tooling of the stamping press and surrounding machinery have higher amplitudes than the sound emitted by the stamping operation itself. Therefore, a newly developed semi-blind signal extraction technique was employed as a pre-processing step to mitigate the contribution of these corrupting sources. The spectral analysis results of the raw and extracted signals demonstrate a significant qualitative relationship between wear progression and the emitted sound signature. This study lays the basis for employing low-cost audio signal analysis in the development of a real-time industrial tool condition monitoring system.

  13. Gaussian process regression for tool wear prediction

    NASA Astrophysics Data System (ADS)

    Kong, Dongdong; Chen, Yongjie; Li, Ning

    2018-05-01

To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for accurate, real-time monitoring of the in-process tool wear parameter (flank wear width). KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for the first time for feature fusion. The GPR model provides both the tool wear predictive value and the corresponding confidence interval. GPR also outperforms artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy, since Gaussian noise can be modeled quantitatively within the GPR model. However, the presence of noise seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects, greatly compressing and smoothing the confidence interval, which is conducive to monitoring the tool wear accurately. Moreover, the kernel parameter in KPCA_IRBF can be selected from a much larger region than in the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests are conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately by the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
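    The abstract's key point, that GPR yields a predictive mean together with a confidence interval, can be illustrated with a compact numpy-only Gaussian process regression (RBF kernel). The wear curve and hyperparameters below are synthetic, and this sketch does not include the paper's KPCA_IRBF feature-fusion step.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=0.05, length=1.0):
    """GP posterior mean and standard deviation at x_test."""
    K = rbf(x_train, x_train, length) + noise ** 2 * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, length)
    Kss = rbf(x_test, x_test, length)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    v = np.linalg.solve(K, Ks.T)
    var = np.diag(Kss - Ks @ v)
    return mean, np.sqrt(np.maximum(var, 0.0))

# Synthetic wear curve: flank wear width (mm) vs. cutting time.
x = np.linspace(0, 10, 25)
y = 0.03 * x + 0.02 * np.sin(x)
mean, std = gp_predict(x, y, np.array([5.0]))
lower, upper = mean - 1.96 * std, mean + 1.96 * std   # 95% interval
```

    The width of the interval is exactly the quantity the paper's KPCA_IRBF preprocessing aims to compress and smooth.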

  14. Process auditing in long term care facilities.

    PubMed

    Hewitt, S M; LeSage, J; Roberts, K L; Ellor, J R

    1985-01-01

The ECC tool development and audit experiences indicated that there is promise in developing a process audit tool to monitor quality of care in nursing homes; moreover, the tool selected required only one hour per resident. Focusing on the care process and resident needs provided useful information for care providers at the unit level as well as for administrative personnel. Besides incorporating a more interdisciplinary focus, the revised tool needs to define support services most appropriate for nursing homes, include items related to discharge planning, and increase measurement of significant others' involvement in the care process. Future emphasis at the ECC will focus on developing intervention plans to maintain strengths and correct deficiencies identified in the audits. Various strategies to bring about desired changes in the quality of care will be evaluated through regular, periodic monitoring. Having a valid and reliable measure of quality of care as a tool will be an important step forward for LTC facilities.

  15. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

The research was conducted to determine whether the industrial processes study program at the Technological University of Chihuahua, one year after being certified by CACEI, continues to achieve the established indicators and ISO 9001:2008; implementing quality tools, monitoring of essential indicators are determined, flow charts are…

  16. In situ monitoring of cocrystals in formulation development using low-frequency Raman spectroscopy.

    PubMed

    Otaki, Takashi; Tanabe, Yuta; Kojima, Takashi; Miura, Masaru; Ikeda, Yukihiro; Koide, Tatsuo; Fukami, Toshiro

    2018-05-05

In recent years, to guarantee a quality-by-design approach to the development of pharmaceutical products, it is important to identify properties of raw materials and excipients in order to determine critical process parameters and critical quality attributes. Feedback obtained from real-time analyses using various process analytical technology (PAT) tools has been actively investigated. In this study, in situ monitoring using low-frequency (LF) Raman spectroscopy (10-200 cm⁻¹), which may have higher discriminative ability among polymorphs than near-infrared spectroscopy and conventional Raman spectroscopy (200-1800 cm⁻¹), was investigated as a possible application to PAT. This is because LF-Raman spectroscopy obtains information about intermolecular and/or lattice vibrations in the solid state. The monitoring results obtained from Furosemide/Nicotinamide cocrystal indicate that LF-Raman spectroscopy is applicable to in situ monitoring of suspension and fluidized bed granulation processes, and is an effective technique as a PAT tool to detect the conversion risk of cocrystals. LF-Raman spectroscopy is also used as a PAT tool to monitor reactions, crystallizations, and manufacturing processes of drug substances and products. In addition, a sequence of conversion behaviors of Furosemide/Nicotinamide cocrystals was determined by performing in situ monitoring for the first time. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Sentinel-2 ArcGIS Tool for Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Plesoianu, Alin; Cosmin Sandric, Ionut; Anca, Paula; Vasile, Alexandru; Calugaru, Andreea; Vasile, Cristian; Zavate, Lucian

    2017-04-01

This paper addresses one of the biggest challenges regarding Sentinel-2 data, related to the need for an efficient tool to access and process the large collection of images that are available. Consequently, developing a tool for the automation of Sentinel-2 data analysis is the most immediate need. We developed a series of tools for the automation of Sentinel-2 data download and processing for vegetation health monitoring. The tools automatically perform the following operations: downloading image tiles from ESA's Scientific Hub or other vendors (Amazon), pre-processing the images to extract the 10-m bands, creating image composites, applying a series of vegetation indices (NDVI, OSAVI, etc.) and performing change detection analyses on different temporal data sets. All of these tools run dynamically in the ArcGIS Platform, without the need to create intermediate datasets (rasters, layers), as the images are processed on-the-fly to avoid data duplication. Finally, they allow complete integration with the ArcGIS environment and workflows.
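    A minimal sketch of one of the vegetation indices mentioned (NDVI), as it would be computed from Sentinel-2 10-m bands B4 (red) and B8 (near infrared). The reflectance arrays are synthetic stand-ins for image tiles; the actual ArcGIS tooling is not shown.

```python
import numpy as np

def ndvi(red, nir, eps=1e-10):
    """NDVI = (NIR - Red) / (NIR + Red), guarded against zero denominators."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + eps)

# Dense vegetation reflects strongly in NIR and weakly in red, so its
# NDVI approaches +1; bare or stressed pixels sit near 0.
red = np.array([[0.05, 0.30], [0.10, 0.25]])   # B4 reflectance (synthetic)
nir = np.array([[0.60, 0.32], [0.55, 0.26]])   # B8 reflectance (synthetic)
index = ndvi(red, nir)
```

    Change detection on temporal stacks then amounts to differencing such index rasters between acquisition dates.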

  18. In-Line Monitoring of Fab Processing Using X-Ray Diffraction

    NASA Astrophysics Data System (ADS)

    Gittleman, Bruce; Kozaczek, Kris

    2005-09-01

As the materials shift that started with Cu continues to advance in the semiconductor industry, new issues related to materials microstructure have arisen. While x-ray diffraction (XRD) has long been used in development applications, in this paper we show that results generated in real time by a unique, high throughput, fully automated XRD metrology tool can be used to develop metrics for qualification and monitoring of critical processes in current and future manufacturing. It will be shown that these metrics provide a unique set of data that correlate to manufacturing issues. For example, ionized-sputtering is the current deposition method of choice for both the Cu seed and TaNx/Ta barrier layers. The alpha phase of Ta is widely used in production for the upper layer of the barrier stack, but complete elimination of the beta phase requires a TaNx layer with sufficient N content, though not so much as to start poisoning the target and generating particle issues. This is a well documented issue, but traditional monitoring by sheet resistance methods cannot guarantee the absence of the beta phase, whereas XRD can determine the presence of even small amounts of beta. Nickel silicide for gate metallization is another example where monitoring of phase is critical. As well as being able to qualify an anneal process that gives only the desired NiSi phase everywhere across the wafer, XRD can be used to determine whether full silicidation of the Ni has occurred and to characterize the crystallographic microstructure of the Ni to determine any effect of that microstructure on the anneal process. The post-anneal nickel silicide phase and uniformity of the silicide microstructure can all be monitored in production. Other examples of the application of XRD to process qualification and production monitoring are derived from the dependence of certain processes, some types of defect generation, and device performance on crystallographic texture.
The data presented will show that CMP dishing problems could be traced to texture of the barrier layer and mitigated by adjusting the barrier process. The density of pits developed during CMP of electrochemically deposited (ECD) Cu depends on the fraction of (111) oriented grains. It must be emphasized that the crystallographic texture is not only a key parameter for qualification of high yielding and reliable processes, but also serves as a critical parameter for monitoring tool health. The textures of Cu and W are sensitive not only to deviations in performance of the tool depositing or annealing a particular film, but also highly sensitive to the texture of the barrier underlayers and thus any performance deviations in those tools. The XRD metrology tool has been designed with production monitoring in mind and has been fully integrated into both 200 mm and 300 mm fabs. Rapid analysis is achieved by using a high intensity fixed x-ray source, coupled with a large area 2D detector. The output metrics from one point are generated while the tool is measuring a subsequent point, giving true on-the-fly analysis; no post-processing of data is necessary. Spatial resolution on the wafer surface ranging from 35 μm to 1 mm is available, making the tool suitable for monitoring of product wafers. Typical analysis times range from 10 seconds to 2 minutes per point, depending on the film thickness and spot size. Current metrics used for process qualification and production monitoring are phase, FWHM of the primary phase peaks (for mean grain size tracking), and crystallographic texture.
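    The mean grain size tracking via peak FWHM that the abstract mentions rests on the standard Scherrer relation, D = K·λ / (β·cos θ), sketched below with illustrative values (Cu K-alpha radiation, a Cu(111) peak). The tool's actual calibration and peak-fitting procedure are not described in the text.

```python
import numpy as np

def scherrer_grain_size(fwhm_deg, two_theta_deg,
                        wavelength=1.5406e-10, K=0.9):
    """Mean crystallite size (m) from a diffraction peak's FWHM and
    position, both in degrees, via the Scherrer equation."""
    beta = np.radians(fwhm_deg)             # integral breadth proxy, rad
    theta = np.radians(two_theta_deg / 2)   # Bragg angle
    return K * wavelength / (beta * np.cos(theta))

# Example: Cu(111) peak near 2-theta = 43.3 deg with 0.2 deg FWHM.
size_m = scherrer_grain_size(0.2, 43.3)
size_nm = size_m * 1e9
```

    A broadening (larger FWHM) of the primary phase peak therefore maps directly to a smaller mean grain size, which is why FWHM works as a tool-health trend metric.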

  19. Monitoring Satellite Data Ingest and Processing for the Atmosphere Science Investigator-led Processing Systems (SIPS)

    NASA Astrophysics Data System (ADS)

    Witt, J.; Gumley, L.; Braun, J.; Dutcher, S.; Flynn, B.

    2017-12-01

The Atmosphere SIPS (Science Investigator-led Processing Systems) team at the Space Science and Engineering Center (SSEC), which is funded through a NASA contract, creates Level 2 cloud and aerosol products from the VIIRS instrument aboard the S-NPP satellite. In order to monitor the ingest and processing of files, we have developed an extensive monitoring system to observe every step in the process. The status grid is used for real-time monitoring, and shows the current state of the system, including what files we have and whether or not we are meeting our latency requirements. Our snapshot tool displays the state of the system in the past. It displays which files were available at a given hour and is used for historical and backtracking purposes. In addition to these grid-like tools, we have created histograms and other statistical graphs for tracking processing and ingest metrics, such as total processing time, job queue time, and latency statistics.

  20. In-line and real-time process monitoring of a freeze drying process using Raman and NIR spectroscopy as complementary process analytical technology (PAT) tools.

    PubMed

    De Beer, T R M; Vercruysse, P; Burggraeve, A; Quinten, T; Ouyang, J; Zhang, X; Vervaet, C; Remon, J P; Baeyens, W R G

    2009-09-01

The aim of the present study was to examine the complementary properties of Raman and near infrared (NIR) spectroscopy as PAT tools for the fast, noninvasive, nondestructive and in-line monitoring of a freeze-drying process. Raman and NIR probes were therefore built into the freeze dryer chamber, allowing simultaneous process monitoring. A 5% (w/v) mannitol solution was used as a model for freeze drying. Raman and NIR spectra were collected continuously during freeze drying (one Raman and one NIR spectrum per minute) and were analyzed using principal component analysis (PCA) and multivariate curve resolution (MCR). Raman spectroscopy was able to supply information about (i) the mannitol solid state throughout the entire process, (ii) the endpoint of freezing (endpoint of mannitol crystallization), and (iii) several physical and chemical phenomena occurring during the process (onset of ice nucleation, onset of mannitol crystallization). NIR spectroscopy proved to be a more sensitive tool for monitoring the critical aspects of drying: (i) the endpoint of ice sublimation and (ii) the release of hydrate water during storage. Furthermore, NIR spectroscopy confirmed several Raman observations: the start of ice nucleation, the end of mannitol crystallization, and the solid state characteristics of the end product. When Raman and NIR monitoring were performed on the same vial, the Raman signal was saturated during the freezing step because reflected NIR light reached the Raman detector; therefore, NIR and Raman measurements were done on different vials. The importance of the probe positions (Raman probe above the vial, NIR probe at the bottom of the vial sidewall) for obtaining all required critical information is also outlined. Combining Raman and NIR spectroscopy for simultaneous monitoring covers almost all critical aspects of a freeze-drying process. The two techniques not only complement each other but also provide mutual confirmation of specific conclusions.
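The chemometric side of such monitoring can be sketched as a minimal PCA (via SVD) that projects a time series of spectra onto their leading components, so the process trajectory becomes a low-dimensional score plot. The simulated band shape and all names below are hypothetical stand-ins for real Raman or NIR data:

```python
import numpy as np

# Minimal PCA sketch: project a time series of spectra onto the first
# principal components so the process trajectory can be followed in scores.
def pca_scores(spectra, n_components=2):
    """spectra: (n_times, n_wavelengths) array, one row per spectrum."""
    centered = spectra - spectra.mean(axis=0)            # mean-center columns
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T                # score trajectory

# Simulated run: a spectral band grows as a phase change proceeds over time.
rng = np.random.default_rng(0)
wavelengths = np.linspace(0, 1, 50)
band = np.exp(-((wavelengths - 0.5) ** 2) / 0.005)
spectra = np.array([t * band + 0.01 * rng.standard_normal(50)
                    for t in np.linspace(0, 1, 30)])

scores = pca_scores(spectra)
# The first score tracks the simulated phase change (up to sign).
```

In a real application, the score trajectory (or an MCR decomposition) would be inspected for the onsets and endpoints the abstract describes.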

  1. DATA QUALITY OBJECTIVES-FOUNDATION OF A SUCCESSFUL MONITORING PROGRAM

    EPA Science Inventory

    The data quality objectives (DQO) process is a fundamental site characterization tool and the foundation of a successful monitoring program. The DQO process is a systematic planning approach based on the scientific method of inquiry. The process identifies the goals of data col...

  2. Technology Enhanced Learning for People with Intellectual Disabilities and Cerebral Paralysis: The MAS Platform

    NASA Astrophysics Data System (ADS)

    Colomo-Palacios, Ricardo; Paniagua-Martín, Fernando; García-Crespo, Ángel; Ruiz-Mezcua, Belén

Education for students with disabilities now takes place in a wide range of settings and thus involves a wider range of assistive tools. As a result, one of the most interesting application domains of technology enhanced learning is the adoption of learning technologies and designs for people with disabilities. Following this trend, this paper presents MAS, a software platform aimed at helping people with severe intellectual disabilities and cerebral paralysis in their learning processes. As a technology enhanced learning platform, MAS provides several tools that support learning and monitoring for people with special needs, including adaptive games, data processing, and monitoring tools. Installed in a special needs education institution in Madrid, Spain, MAS provides special educators with a tool that improves students' education processes.

  3. Tips and Traps: Lessons From Codesigning a Clinician E-Monitoring Tool for Computerized Cognitive Behavioral Therapy

    PubMed Central

    Hawken, Susan J; Stasiak, Karolina; Lucassen, Mathijs FG; Fleming, Theresa; Shepherd, Matthew; Greenwood, Andrea; Osborne, Raechel; Merry, Sally N

    2017-01-01

    Background Computerized cognitive behavioral therapy (cCBT) is an acceptable and promising treatment modality for adolescents with mild-to-moderate depression. Many cCBT programs are standalone packages with no way for clinicians to monitor progress or outcomes. We sought to develop an electronic monitoring (e-monitoring) tool in consultation with clinicians and adolescents to allow clinicians to monitor mood, risk, and treatment adherence of adolescents completing a cCBT program called SPARX (Smart, Positive, Active, Realistic, X-factor thoughts). Objective The objectives of our study were as follows: (1) assess clinicians’ and adolescents’ views on using an e-monitoring tool and to use this information to help shape the development of the tool and (2) assess clinician experiences with a fully developed version of the tool that was implemented in their clinical service. Methods A descriptive qualitative study using semistructured focus groups was conducted in New Zealand. In total, 7 focus groups included clinicians (n=50) who worked in primary care, and 3 separate groups included adolescents (n=29). Clinicians were general practitioners (GPs), school guidance counselors, clinical psychologists, youth workers, and nurses. Adolescents were recruited from health services and a high school. Focus groups were run to enable feedback at 3 phases that corresponded to the consultation, development, and postimplementation stages. Thematic analysis was applied to transcribed responses. Results Focus groups during the consultation and development phases revealed the need for a simple e-monitoring registration process with guides for end users. Common concerns were raised in relation to clinical burden, monitoring risk (and effects on the therapeutic relationship), alongside confidentiality or privacy and technical considerations. Adolescents did not want to use their social media login credentials for e-monitoring, as they valued their privacy. 
However, adolescents did want information on seeking help and personalized monitoring and communication arrangements. Postimplementation, clinicians who had used the tool in practice revealed no adverse impact on the therapeutic relationship, and adolescents were not concerned about being e-monitored. Clinicians did need additional time to monitor adolescents, and the e-monitoring tool was used in a different way than was originally anticipated. Also, it was suggested that the registration process could be further streamlined and integrated with existing clinical data management systems, and the use of clinician alerts could be expanded beyond the scope of simply flagging adolescents of concern. Conclusions An e-monitoring tool was developed in consultation with clinicians and adolescents. However, the study revealed the complexity of implementing the tool in clinical practice. Of salience were privacy, parallel monitoring systems, integration with existing electronic medical record systems, customization of the e-monitor, and preagreed monitoring arrangements between clinicians and adolescents. PMID:28077345

  4. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

As technology nodes shrink from 14 nm to 7 nm, reliable tool monitoring techniques become more critical for advanced semiconductor fabs to achieve high yield and quality. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to detect particles and defects and to verify tool stability. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner on a periodic basis to ensure proper tool stability. The focus measurement on YIELDSTAR, by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA), has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user processes. To further improve metrology and matching performance, Diffraction Based Focus (DBF) metrology, which enables accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined by minimizing dose crosstalk, dynamic precision error, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of setting a baseline reference. This baseline technique has been evaluated with both conventional BaseLiner low numerical aperture (NA=1.20) mode and advanced illumination high-NA (NA=1.35) mode, with consistent performance. This enhanced methodology of focus control and monitoring across multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposures for new product/layer best focus (BF) setup.

  5. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  6. Research on intelligent monitoring technology of machining process

    NASA Astrophysics Data System (ADS)

    Wang, Taiyong; Meng, Changhong; Zhao, Guoli

    1995-08-01

Based upon research on the sound and vibration characteristics of tool condition, we explore a multigrade monitoring system built around single-chip microcomputers as the core hardware. Using specially designed signal pickup devices, the system performs intelligent multigrade monitoring and forecasting more effectively and builds tool condition models adaptively. This is a key problem in FMS, CIMS, and even IMS.

  7. Compliance monitoring in business processes: Functionalities, application, and tool-support.

    PubMed

    Ly, Linh Thao; Maggi, Fabrizio Maria; Montali, Marco; Rinderle-Ma, Stefanie; van der Aalst, Wil M P

    2015-12-01

In recent years, monitoring the compliance of business processes with relevant regulations, constraints, and rules during runtime has evolved into a major concern in the literature and in practice. Monitoring not only refers to continuously observing possible compliance violations, but also includes the ability to provide fine-grained feedback and to predict possible compliance violations in the future. The body of literature on business process compliance is large, and approaches specifically addressing process monitoring are hard to identify. Moreover, proper means for the systematic comparison of these approaches are missing. Hence, it is unclear which approaches are suitable for particular scenarios. The goal of this paper is to define a framework for Compliance Monitoring Functionalities (CMF) that enables the systematic comparison of existing and new approaches for monitoring compliance rules over business processes during runtime. To define the scope of the framework, related areas are first identified and discussed. The CMFs are harvested based on a systematic literature review and five selected case studies. The appropriateness of the selection of CMFs is demonstrated in two ways: (a) a systematic comparison with pattern-based compliance approaches and (b) a classification of existing compliance monitoring approaches using the CMFs. Moreover, the application of the CMFs is showcased using three existing tools that are applied to two realistic data sets. Overall, the CMF framework provides powerful means to position existing and future compliance monitoring approaches.

  8. Compliance monitoring in business processes: Functionalities, application, and tool-support

    PubMed Central

    Ly, Linh Thao; Maggi, Fabrizio Maria; Montali, Marco; Rinderle-Ma, Stefanie; van der Aalst, Wil M.P.

    2015-01-01

In recent years, monitoring the compliance of business processes with relevant regulations, constraints, and rules during runtime has evolved into a major concern in the literature and in practice. Monitoring not only refers to continuously observing possible compliance violations, but also includes the ability to provide fine-grained feedback and to predict possible compliance violations in the future. The body of literature on business process compliance is large, and approaches specifically addressing process monitoring are hard to identify. Moreover, proper means for the systematic comparison of these approaches are missing. Hence, it is unclear which approaches are suitable for particular scenarios. The goal of this paper is to define a framework for Compliance Monitoring Functionalities (CMF) that enables the systematic comparison of existing and new approaches for monitoring compliance rules over business processes during runtime. To define the scope of the framework, related areas are first identified and discussed. The CMFs are harvested based on a systematic literature review and five selected case studies. The appropriateness of the selection of CMFs is demonstrated in two ways: (a) a systematic comparison with pattern-based compliance approaches and (b) a classification of existing compliance monitoring approaches using the CMFs. Moreover, the application of the CMFs is showcased using three existing tools that are applied to two realistic data sets. Overall, the CMF framework provides powerful means to position existing and future compliance monitoring approaches. PMID:26635430

  9. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and finally investigated this unit operation in more detail by a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used in both upstream (USP) and downstream processing (DSP), can be implemented at-line, gives results within minutes of sampling, and requires no manual intervention.

  10. Estimation of tool wear during CNC milling using neural network-based sensor fusion

    NASA Astrophysics Data System (ADS)

    Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.

    2007-01-01

Cutting tool wear degrades product quality in manufacturing processes. Monitoring tool wear online is therefore needed to prevent degradation in machining quality. Unfortunately, there is no direct way of measuring tool wear online, so one must adopt an indirect method in which tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level, have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as signal-level segmentation for temporal registration, feature space filtering, outlier removal, and estimation space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.
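The fusion step can be illustrated with a deliberately simplified stand-in: synthetic features play the role of the force, vibration, current, and sound-pressure features, and an ordinary least-squares fit replaces the paper's neural network (the mapping is linear here only for the sake of a checkable sketch):

```python
import numpy as np

# Hypothetical sensor-fusion sketch: fuse one feature per sensor channel
# (force, vibration, current, sound) into a single flank-wear estimate.
rng = np.random.default_rng(1)
n = 200
features = rng.standard_normal((n, 4))          # synthetic sensor features
true_weights = np.array([0.5, 0.3, 0.15, 0.05]) # assumed channel influence
wear = features @ true_weights + 0.01 * rng.standard_normal(n)

# Fit the fusion model (with intercept) by least squares.
X = np.column_stack([features, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, wear, rcond=None)
predicted = X @ coef
rmse = float(np.sqrt(np.mean((predicted - wear) ** 2)))
```

A neural network generalizes this to nonlinear mappings, but the input/output structure (fused features in, wear estimate out) is the same.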

  11. Novel tool wear monitoring method in milling difficult-to-machine materials using cutting chip formation

    NASA Astrophysics Data System (ADS)

    Zhang, P. P.; Guo, Y.; Wang, B.

    2017-05-01

The main problems in milling difficult-to-machine materials are high cutting temperature and rapid tool wear, yet tool wear is difficult to observe directly during machining. Tool wear and cutting chip formation are two of the most important indicators of machining efficiency and quality. The purpose of this paper is to develop a model relating tool wear to cutting chip formation (chip width and chip radian) for difficult-to-machine materials, so that tool wear can be monitored through chip formation. A milling experiment on a machining centre with three sets of cutting parameters was performed to obtain chip formation and tool wear data. The experimental results show that tool wear increases gradually during the cutting process while chip width and chip radian decrease. The model is developed by fitting the experimental data and applying formula transformations. Most of the tool wear values monitored via chip formation have errors of less than 10%; the smallest error is 0.2%. Overall, errors based on chip radian are smaller than those based on chip width. This is a new way to monitor and detect tool wear through cutting chip formation when milling difficult-to-machine materials.
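The monitoring idea reduces to fitting wear against a chip-formation measure and then inverting the fit. A minimal sketch with invented measurements (not the paper's data), using chip width only and a linear fit in place of the paper's model:

```python
import numpy as np

# Invented example data: as cutting proceeds, wear rises and chip width falls.
chip_width = np.array([1.00, 0.96, 0.92, 0.88, 0.84, 0.80])     # mm, decreasing
tool_wear  = np.array([0.050, 0.090, 0.125, 0.165, 0.200, 0.240])  # mm, rising

# Fit wear as a linear function of chip width (least squares).
a, b = np.polyfit(chip_width, tool_wear, 1)

def wear_from_chip(width):
    """Estimate tool wear indirectly from a measured chip width."""
    return a * width + b

# Relative monitoring error at each point, echoing the paper's <10% claim.
errors = np.abs(wear_from_chip(chip_width) - tool_wear) / tool_wear
```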

  12. Errors in patient specimen collection: application of statistical process control.

    PubMed

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable for monitoring the performance of sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion process as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
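The control-chart arithmetic behind such a spreadsheet is straightforward; the sketch below builds a p-chart (fraction nonconforming) from invented monthly counts of mislabeled samples:

```python
# Minimal p-chart sketch in the spirit of the SPC spreadsheet described;
# the counts and sample sizes below are invented for illustration.
mislabeled = [4, 6, 3, 5, 7, 4, 5, 6, 3, 18]   # errors found per month
samples    = [1000] * 10                        # samples checked per month

n = samples[0]
p_bar = sum(mislabeled) / sum(samples)          # overall error fraction
sigma = (p_bar * (1 - p_bar) / n) ** 0.5        # binomial standard deviation
ucl = p_bar + 3 * sigma                         # upper control limit
lcl = max(0.0, p_bar - 3 * sigma)               # lower control limit (>= 0)

# Flag months whose error fraction falls outside the control limits.
out_of_control = [i for i, x in enumerate(mislabeled)
                  if not (lcl <= x / n <= ucl)]
```

Points outside the limits (here, the last month) signal a process shift worth investigating rather than ordinary month-to-month variation.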

  13. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    PubMed

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed with a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and model quality was tested by R(2) and Q(2). The accuracy and diagnostic capability of the batch model were then validated on the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers particular insight into the effective constituents and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Multi-parameters monitoring during traditional Chinese medicine concentration process with near infrared spectroscopy and chemometrics

    NASA Astrophysics Data System (ADS)

    Liu, Ronghua; Sun, Qiaofeng; Hu, Tian; Li, Lian; Nie, Lei; Wang, Jiayue; Zhou, Wanhui; Zang, Hengchang

    2018-03-01

As a powerful process analytical technology (PAT) tool, near infrared (NIR) spectroscopy has been widely used in real-time monitoring. In this study, NIR spectroscopy was applied to monitor multiple parameters of the traditional Chinese medicine (TCM) Shenzhiling oral liquid during the concentration process to guarantee product quality. Five lab-scale batches were employed to construct quantitative models for five chemical ingredients and one physical property (sample density) during the concentration process. Paeoniflorin, albiflorin, liquiritin, and sample density were modeled by partial least squares regression (PLSR), while the glycyrrhizic acid and cinnamic acid contents were modeled by support vector machine regression (SVMR). Standard normal variate (SNV) and/or Savitzky-Golay (SG) smoothing with derivative methods were adopted for spectral pretreatment. Variable selection methods including correlation coefficient (CC), competitive adaptive reweighted sampling (CARS), and interval partial least squares regression (iPLS) were used to optimize the models. The results indicated that NIR spectroscopy is an effective tool for monitoring the concentration process of Shenzhiling oral liquid.
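Of the pretreatments listed, standard normal variate (SNV) is simple enough to show directly: each spectrum is centered and scaled by its own mean and standard deviation, which removes additive baseline offsets and multiplicative scatter effects. A minimal sketch on simulated spectra:

```python
import numpy as np

# SNV pretreatment: normalize each spectrum by its own mean and std.
def snv(spectra):
    """spectra: (n_samples, n_wavelengths) array; returns SNV-corrected copy."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

# Simulated spectra distorted by a baseline shift and a scatter scaling.
rng = np.random.default_rng(2)
raw = rng.standard_normal((5, 100))
distorted = 3.0 * raw + 7.0            # multiplicative scatter + additive offset
corrected = snv(distorted)
# SNV removes both distortions: corrected equals snv(raw).
```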

  15. Monitoring of antisolvent crystallization of sodium scutellarein by combined FBRM-PVM-NIR.

    PubMed

    Liu, Xuesong; Sun, Di; Wang, Feng; Wu, Yongjiang; Chen, Yong; Wang, Longhu

    2011-06-01

Antisolvent crystallization can be used as an alternative to cooling or evaporation for the separation and purification of solid products in the pharmaceutical industry. To improve process understanding of antisolvent crystallization, the use of in-line tools is vital. In this study, process analytical technology (PAT) tools including focused beam reflectance measurement (FBRM), particle video microscope (PVM), and near-infrared spectroscopy (NIRS) were utilized to monitor the antisolvent crystallization of sodium scutellarein. FBRM was used to monitor the chord count and chord length distribution of sodium scutellarein particles in the crystallizer, while PVM, as an in-line video camera, provided images of particle shape and dimensions. In addition, a quantitative PLS model was established by in-line NIRS to determine the concentration of sodium scutellarein in the solvent, and good calibration statistics were obtained (r(2) = 0.976) with a residual predictive deviation of 11.3. The discussion of the sensitivities, strengths, and weaknesses of these PAT tools may be helpful in selecting suitable PAT techniques. These in-line techniques eliminate the need for sample preparation and offer a time-saving approach to understanding and monitoring the antisolvent crystallization process. Copyright © 2011 Wiley-Liss, Inc.
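The residual predictive deviation (RPD) quoted for the NIR model is commonly defined as the standard deviation of the reference values divided by the prediction RMSE; higher values indicate a more reliable calibration. A small sketch with invented numbers (not the study's data):

```python
import numpy as np

# RPD = std of reference values / RMSE of prediction (a common definition).
def rpd(reference, predicted):
    reference = np.asarray(reference, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((reference - predicted) ** 2))
    return float(reference.std(ddof=1) / rmse)

# Invented example: predictions within about 1% of reference concentrations.
reference = np.array([10.0, 12.0, 14.0, 16.0, 18.0, 20.0])
predicted = reference + np.array([0.1, -0.1, 0.2, -0.2, 0.1, -0.1])
value = rpd(reference, predicted)   # higher RPD = more reliable model
```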

  16. A Web-Based Monitoring System for Multidisciplinary Design Projects

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Salas, Andrea O.; Weston, Robert P.

    1998-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes; and monitors the project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.

  17. The design of an intelligent human-computer interface for the test, control and monitor system

    NASA Technical Reports Server (NTRS)

    Shoaff, William D.

    1988-01-01

    The graphical intelligence and assistance capabilities of a human-computer interface for the Test, Control, and Monitor System at Kennedy Space Center are explored. The report focuses on how a particular commercial off-the-shelf graphical software package, Data Views, can be used to produce tools that build widgets such as menus, text panels, graphs, icons, windows, and ultimately complete interfaces for monitoring data from an application; controlling an application by providing input data to it; and testing an application by both monitoring and controlling it. A complete set of tools for building interfaces is described in a manual for the TCMS toolkit. Simple tools create primitive widgets such as lines, rectangles and text strings. Intermediate level tools create pictographs from primitive widgets, and connect processes to either text strings or pictographs. Other tools create input objects; Data Views supports output objects directly, thus output objects are not considered. Finally, a set of utilities for executing, monitoring use, editing, and displaying the content of interfaces is included in the toolkit.

  18. High throughput wafer defect monitor for integrated metrology applications in photolithography

    NASA Astrophysics Data System (ADS)

    Rao, Nagaraja; Kinney, Patrick; Gupta, Anand

    2008-03-01

The traditional approach to semiconductor wafer inspection is based on the use of stand-alone metrology tools, which, while highly sensitive, are large, expensive, and slow, requiring inspection to be performed off-line and on a lot-sampling basis. Due to the long cycle times and sparse sampling, the current wafer inspection approach is not suited to rapid detection of process excursions that affect yield. The semiconductor industry is gradually moving towards deploying integrated metrology tools for real-time "monitoring" of product wafers during the manufacturing process. Integrated metrology aims to provide end-users with rapid feedback on problems during the manufacturing process, with the benefit of increased yield and reduced rework and scrap. Monitoring 100% of the wafers being processed requires some trade-off in sensitivity compared to traditional stand-alone metrology tools, but not by much. This paper describes a compact, low-cost wafer defect monitor suitable for integrated metrology applications and capable of detecting submicron defects on semiconductor wafers at an inspection rate of about 10 seconds per wafer (or 360 wafers per hour). The wafer monitor uses a whole-wafer imaging approach to detect defects on both un-patterned and patterned wafers. Laboratory tests with a prototype system have demonstrated sensitivity down to 0.3 µm on un-patterned wafers and down to 1 µm on patterned wafers, at inspection rates of 10 seconds per wafer. An ideal application for this technology is preventing photolithography defects such as "hot spots" by implementing a wafer backside monitoring step prior to exposing wafers in the lithography step.

  19. The Neural Bases of Event Monitoring across Domains: a Simultaneous ERP-fMRI Study

    PubMed Central

    Tarantino, Vincenza; Mazzonetto, Ilaria; Formica, Silvia; Causin, Francesco; Vallesi, Antonino

    2017-01-01

The ability to check and evaluate the environment over time with the aim of detecting the occurrence of target stimuli is supported by sustained/tonic as well as transient/phasic control processes, which overall might be referred to as event monitoring. The neural underpinning of sustained attentional control processes involves a fronto-parietal network. However, it has not yet been well defined whether this cortical circuit acts irrespective of the specific material to be monitored and whether it mediates sustained as well as transient monitoring processes. In the current study, the functional activity of the brain during an event monitoring task was investigated and compared between two cognitive domains whose processing is mediated by differently lateralized areas. Namely, participants were asked to monitor sequences of either faces (supported by right-hemisphere regions) or tools (left-hemisphere). In order to disentangle sustained from transient components of monitoring, a simultaneous EEG-fMRI technique was adopted within a block design. When contrasting monitoring versus control blocks, the conventional fMRI analysis revealed the sustained involvement of bilateral fronto-parietal regions in both task domains. Event-related potentials (ERPs) showed a more positive amplitude over frontal sites in monitoring compared to control blocks, providing evidence of a transient monitoring component. The joint ERP-fMRI analysis showed that, in the case of face monitoring, this transient component relies on right-lateralized areas, including the inferior parietal lobule and the middle frontal gyrus. In the case of tools, no fronto-parietal areas correlated with the transient ERP activity, suggesting that in this domain phasic monitoring processes were masked by tonic ones.
Overall, the present findings highlight the role of bilateral fronto-parietal regions in sustained monitoring, independently of the specific task requirements, and suggest that right-lateralized areas subserve transient monitoring processes, at least in some task contexts. PMID:28785212

  20. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    NASA Astrophysics Data System (ADS)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-04-01

Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier to industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defect avoidance is fundamental. Because of this, there is a need to develop novel in situ monitoring tools able to keep the stability of the process under control on a layer-by-layer basis and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures, and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.
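The layer-wise statistical monitoring the review calls for can be sketched with a classic control chart. Below is a minimal, hypothetical example: an EWMA chart flags layers whose process signature (e.g. mean melt-pool intensity) drifts from an in-control baseline. The signature, baseline, and limits are invented for illustration, not taken from the reviewed systems.

```python
# EWMA control chart over a per-layer process signature (illustrative).
import math

def ewma_monitor(signatures, mu0, sigma0, lam=0.2, L=3.0):
    """Return indices of layers whose EWMA statistic leaves the limits."""
    z = mu0  # EWMA statistic starts at the in-control mean
    # Steady-state control-limit half-width for the EWMA statistic
    half_width = L * sigma0 * math.sqrt(lam / (2.0 - lam))
    flagged = []
    for i, x in enumerate(signatures):
        z = lam * x + (1.0 - lam) * z
        if abs(z - mu0) > half_width:
            flagged.append(i)
    return flagged

# Synthetic signature values: drift begins at layer 4
layers = [10.1, 9.9, 10.0, 10.2, 11.5, 12.0, 12.4]
print(ewma_monitor(layers, mu0=10.0, sigma0=0.3))  # [4, 5, 6]
```

In practice the baseline mean and standard deviation would be estimated from layers of known-good builds before the chart is applied online.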

  1. Using 3D Printing for Rapid Prototyping of Characterization Tools for Investigating Powder Blend Behavior.

    PubMed

    Hirschberg, Cosima; Boetker, Johan P; Rantanen, Jukka; Pein-Hackelbusch, Miriam

    2018-02-01

There is an increasing need to provide more detailed insight into the behavior of particulate systems. Current powder characterization tools were developed empirically, and in many cases modification of existing equipment is difficult. More flexible tools are needed to provide understanding of complex powder behavior, such as mixing processes and segregation phenomena. An approach based on fast prototyping of new powder handling geometries and interfacing solutions for process analytical tools is reported. This study utilized 3D printing for rapid prototyping of customized geometries; the overall goal was to assess the mixing process of powder blends at small scale with a combination of spectroscopic and mechanical monitoring. As part of the segregation evaluation studies, the flowability of three different paracetamol/filler blends at different ratios was investigated, inter alia to define the percolation thresholds. Blends with a paracetamol wt% above the percolation threshold were subsequently investigated with respect to their segregation behavior. Rapid prototyping using 3D printing allowed the design of two funnels with tailored flow behavior (funnel flow) of model formulations, which could be monitored with an in-line near-infrared (NIR) spectrometer. Calculating the root mean square (RMS) of the scores of the first two principal components of the NIR spectra visualized spectral variation as a function of process time. In the same setup, mechanical properties (basic flow energy) of the powder blend were monitored during blending. Rapid prototyping allowed fast modification of powder testing geometries and easy interfacing with process analytical tools, opening new possibilities for more detailed powder characterization.
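The monitoring metric described above can be sketched in a few lines of numpy: project each NIR spectrum onto the first two principal components and track the RMS of the scores over process time. The synthetic spectra below stand in for real in-line NIR measurements; this is an assumed reconstruction of the metric, not the authors' code.

```python
# RMS of the first-two-PC scores per spectrum (illustrative sketch).
import numpy as np

def rms_of_pc_scores(spectra, n_pc=2):
    """spectra: (n_times, n_wavelengths); returns one RMS value per time."""
    X = spectra - spectra.mean(axis=0)                # mean-center the spectra
    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # PCA via SVD
    scores = X @ Vt[:n_pc].T                          # (n_times, n_pc) scores
    return np.sqrt((scores ** 2).mean(axis=1))        # RMS over the PCs

rng = np.random.default_rng(0)
spectra = rng.normal(size=(20, 100))                  # 20 time points
print(rms_of_pc_scores(spectra).shape)                # (20,)
```

A rising RMS trend over process time would indicate growing spectral variation, i.e. possible segregation during funnel discharge.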

  2. Overview of PAT process analysers applicable in monitoring of film coating unit operations for manufacturing of solid oral dosage forms.

    PubMed

    Korasa, Klemen; Vrečer, Franc

    2018-01-01

Over the last two decades, regulatory agencies have demanded better understanding of pharmaceutical products and processes by implementing new technological approaches, such as process analytical technology (PAT). Process analysers present a key PAT tool, which enables effective process monitoring and thus improved process control of medicinal product manufacturing. Process analysers applicable in pharmaceutical coating unit operations are comprehensively described in the present article. The review is focused on monitoring of solid oral dosage forms during film coating in the two most commonly used coating systems, i.e. pan and fluid bed coaters. A brief theoretical background and a critical overview of process analysers used for real-time or near real-time (in-, on-, at-line) monitoring of critical quality attributes of film coated dosage forms are presented. Besides well recognized spectroscopic methods (NIR and Raman spectroscopy), other techniques which have made a significant breakthrough in recent years are discussed (terahertz pulsed imaging (TPI), chord length distribution (CLD) analysis, and image analysis). The last part of the review is dedicated to novel techniques with high potential to become valuable PAT tools in the future (optical coherence tomography (OCT), acoustic emission (AE), microwave resonance (MR), and laser induced breakdown spectroscopy (LIBS)). Copyright © 2017 Elsevier B.V. All rights reserved.

3. AE Monitoring of Diamond Turned Rapidly Solidified Aluminium 443

    NASA Astrophysics Data System (ADS)

    Onwuka, G.; Abou-El-Hossein, K.; Mkoko, Z.

    2017-05-01

The fast replacement of conventional aluminium with rapidly solidified aluminium alloys has become a noticeable trend in manufacturing industries involved in the production of optics and optical molding inserts. This is a result of the improved performance and durability of rapidly solidified aluminium alloys compared to conventional aluminium. The melt spinning process is vital for manufacturing rapidly solidified aluminium alloys like RSA 905, RSA 6061 and RSA 443, which are common in industry today. RSA 443 is a newly developed alloy with few research findings and huge research potential; there is no available literature focused on monitoring the machining of RSA 443 alloys. In this research, an acoustic emission sensing technique was applied to monitor the single point diamond turning of RSA 443 on an ultrahigh precision lathe. The machining process was carried out after careful selection of feed, speed and depth of cut. The monitoring process was achieved with a high-sampling-rate data acquisition system using different tools, while concurrent measurements of surface roughness and tool wear were initiated after covering a total feed distance of 13 km. An increasing trend of raw AE spikes and peak-to-peak signal was observed with an increase in surface roughness and tool wear values. Hence, acoustic emission sensing proves to be an effective monitoring method for the machining of RSA 443 alloy.
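The AE features referred to above reduce to simple window statistics. The sketch below shows the peak-to-peak value of a raw acoustic-emission window and a simple count of spikes above a threshold; the sample values and threshold are illustrative only, not data from the experiment.

```python
# Basic AE window features: peak-to-peak spread and spike count.

def peak_to_peak(window):
    """Spread between the largest and smallest sample in the window."""
    return max(window) - min(window)

def count_spikes(window, threshold):
    """Number of samples whose magnitude exceeds the threshold."""
    return sum(1 for v in window if abs(v) > threshold)

window = [0.02, -0.01, 0.45, -0.38, 0.05, 0.60, -0.02]  # raw AE samples
print(peak_to_peak(window))       # ~0.98
print(count_spikes(window, 0.3))  # 3
```

In a monitoring loop these features would be computed per acquisition window and trended against tool-wear measurements, as the abstract describes.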

  4. Diagnostic tool for structural health monitoring: effect of material nonlinearity and vibro-impact process

    NASA Astrophysics Data System (ADS)

    Hiwarkar, V. R.; Babitsky, V. I.; Silberschmidt, V. V.

    2013-07-01

Numerous techniques are available for monitoring structural health. Most of these techniques are expensive and time-consuming. In this paper, vibration-based techniques are explored together with their use as diagnostic tools for structural health monitoring. Finite-element simulations are used to study the effect of material nonlinearity on the dynamics of a cracked bar. Additionally, several experiments are performed to study the effect of the crack's vibro-impact behavior on its dynamics. It was observed in the experiments that crack-tip plasticity and the vibro-impact behavior linked to interaction of the crack faces change the natural frequency of the cracked bar and lead to the generation of higher harmonics; this can be used as a diagnostic tool for structural health monitoring.

  5. Using the Leitz LMS 2000 for monitoring and improvement of an e-beam

    NASA Astrophysics Data System (ADS)

    Blaesing-Bangert, Carola; Roeth, Klaus-Dieter; Ogawa, Yoichi

    1994-11-01

Kaizen, continuous improvement, is a philosophy practiced in Japan which is also becoming more and more important in Western companies. To implement this philosophy in the semiconductor industry, a high performance metrology tool is essential for determining the status of production quality periodically. An important prerequisite for statistical process control is the high stability of the metrology tool over several months or years; the tool-induced shift should be as small as possible. The pattern placement metrology tool Leitz LMS 2000 has been used in a major European mask house for several years now to qualify masks within the tightest specifications and to monitor the MEBES III and its cassettes. The mask shop's internal specification for the long-term repeatability of the pattern placement metrology tool is 19 nm, instead of the 42 nm specified by the supplier of the tool. The process capability of the LMS 2000 over 18 months is represented by an average cpk value of 2.8 for orthogonality, 5.2 for x-scaling, and 3.0 for y-scaling. The process capability of the MEBES III and its cassettes was improved in the past years; for instance, 100% of the masks produced with a process tolerance of +/- 200 nm are now within this limit.
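The cpk index quoted above has a standard definition, sketched below. The measurement mean and spread in the example are invented for illustration; the paper's figures (2.8, 5.2, 3.0) come from actual LMS 2000 data.

```python
# Process-capability index cpk from mean, spread, and spec limits.

def cpk(mean, std, lsl, usl):
    """cpk = min(USL - mean, mean - LSL) / (3 * std)."""
    return min(usl - mean, mean - lsl) / (3.0 * std)

# Example: placement errors against a +/- 200 nm process tolerance
print(cpk(mean=5.0, std=12.0, lsl=-200.0, usl=200.0))  # ~5.42
```

A cpk well above 1.33 (the common capability benchmark) indicates the tool's variation is small relative to the tolerance band, which is why values of 2.8 to 5.2 imply a highly stable metrology tool.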

  6. Monitoring machining conditions by infrared images

    NASA Astrophysics Data System (ADS)

    Borelli, Joao E.; Gonzaga Trabasso, Luis; Gonzaga, Adilson; Coelho, Reginaldo T.

    2001-03-01

During the machining process, knowledge of the temperature is the most important factor in tool analysis. It allows control of the main factors that influence tool use, lifetime and waste. The temperature in the contact area between the workpiece and the tool results from material removal in the cutting operation, and it is difficult to obtain because the tool and the workpiece are in motion. One way to measure the temperature in this situation is to detect the infrared radiation. This work presents a new methodology for diagnosis and monitoring of machining processes with the use of infrared images. The infrared image provides a map in gray tones of the elements in the process: tool, workpiece and chips. Each gray tone in the image corresponds to a certain temperature for each of those materials, and the relationship between gray tones and temperature is obtained by prior calibration of the infrared camera. The system developed in this work uses an infrared camera, a frame grabber board and software composed of three modules. The first module performs image acquisition and processing. The second module extracts image features and assembles the feature vector. Finally, the third module uses fuzzy logic to evaluate the feature vector and outputs a diagnosis of the tool state.
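The gray-tone-to-temperature mapping described above can be sketched as a calibration table with linear interpolation between calibration points. The calibration pairs below are invented for illustration; a real system would obtain them from the camera calibration for each material's emissivity.

```python
# Gray level -> temperature via a calibrated lookup with interpolation.

def make_temperature_lookup(calibration):
    """calibration: sorted list of (gray_level, temperature_C) pairs."""
    def lookup(gray):
        if gray <= calibration[0][0]:
            return calibration[0][1]
        if gray >= calibration[-1][0]:
            return calibration[-1][1]
        for (g0, t0), (g1, t1) in zip(calibration, calibration[1:]):
            if g0 <= gray <= g1:  # interpolate within this segment
                return t0 + (t1 - t0) * (gray - g0) / (g1 - g0)
    return lookup

lookup = make_temperature_lookup([(0, 20.0), (128, 400.0), (255, 800.0)])
print(lookup(64))  # 210.0, halfway between 20 and 400
```

The per-pixel temperatures produced this way would feed the feature-extraction module, whose feature vector the fuzzy-logic module then classifies.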

  7. Potentials for the use of tool-integrated in-line data acquisition systems in press shops

    NASA Astrophysics Data System (ADS)

    Maier, S.; Schmerbeck, T.; Liebig, A.; Kautz, T.; Volk, W.

    2017-09-01

Robust in-line data acquisition systems are required for the realization of process monitoring and control systems in press shops. A promising approach is the integration of sensors in the following press tools, where they can be easily integrated and maintained; this also achieves the necessary robustness for the rough press environment. Such concepts have already been investigated for the measurement of geometrical accuracy as well as for the material flow of inner part areas. They enable monitoring of the quality of each produced part. An important success factor is practical approaches to the use of this new process information in press shops. This work presents various applications of these measuring concepts, based on real car body components of the BMW Group. For example, the procedure of retroactive error analysis is explained for a side frame. It also shows how this data acquisition can be used for the optimization of drawing tools in tool shops. With the skid line, there is a continuous value that can be monitored from planning to serial production.

  8. Monitoring social indicators in the Bear Trap Canyon Wilderness 1988-1998

    Treesearch

    Joe L. Ashor

    2000-01-01

    Since its inception as a wilderness planning and management tool almost 15 years ago, the Limits of Acceptable Change (LAC) process has stressed the importance of monitoring. Monitoring social conditions is critical to ensure that quality visitor experiences are maintained. Ten years of data collected from monitoring three river corridor-related social indicators in...

  9. New Tools and Methods for Assessing Risk-Management Strategies

    DTIC Science & Technology

    2004-03-01

The study applied Expected Value and Multi-Attribute Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed researchers to monitor the process subjects used to arrive at a decision; the results revealed distinct risk-management strategies. Subject terms: risk management, acquisition process, expected value theory, multi-attribute utility theory.

  10. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets are assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (VisIt, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user model in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components.
PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical geophysical ingestion and processing, and for co-analysis and visualization of the raw and processed data with other data of interest (e.g. soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  11. Process analytical technology (PAT) in insect and mammalian cell culture processes: dielectric spectroscopy and focused beam reflectance measurement (FBRM).

    PubMed

    Druzinec, Damir; Weiss, Katja; Elseberg, Christiane; Salzig, Denise; Kraume, Matthias; Pörtner, Ralf; Czermak, Peter

    2014-01-01

Modern bioprocesses demand a careful definition of the critical process parameters (CPPs) already during the early stages of process development in order to ensure high-quality products and satisfactory yields. In this context, online monitoring tools can be applied to recognize unfavorable changes of CPPs during the production processes and to allow for early interventions in order to prevent losses of production batches due to quality issues. Process analytical technologies such as dielectric spectroscopy or focused beam reflectance measurement (FBRM) are possible online monitoring tools which can be applied to monitor cell growth as well as morphological changes. Since dielectric spectroscopy only captures cells with intact cell membranes, even information about dead cells with ruptured or leaking cell membranes can be derived. The following chapter describes the application of dielectric spectroscopy to various virus-infected and non-infected cell lines, covering adherent as well as suspension cultures in common stirred tank reactors. The adherent mammalian cell lines Vero (African green monkey kidney cells) and hMSC-TERT (telomerase-immortalized human mesenchymal stem cells) are thereby cultured on microcarriers, which provide the required growth surface and allow the cultivation of these cells even in dynamic culture systems. In turn, the insect-derived cell lines S2 and Sf21 are used as examples for cells typically cultured in suspension. Moreover, the FBRM technology as a further monitoring tool for cell culture applications has been included in this chapter using the example of Drosophila S2 insect cells.
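The reason dielectric spectroscopy sees only membrane-intact cells can be sketched with the classic Schwan relation: the permittivity increment of a suspension scales with the viable biovolume fraction, because only cells with intact membranes polarize. The membrane capacitance, cell radius and permittivity increment below are textbook-order values, not instrument readings from the chapter, so treat this as an assumed illustration of the principle.

```python
# Viable biovolume fraction from a relative permittivity increment
# (Schwan relation, illustrative values only).

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def viable_biovolume_fraction(delta_eps_rel, cell_radius_m, c_m=9e-3):
    """Invert delta_eps_rel = 9 * r * P * C_m / (4 * eps0) for P.

    delta_eps_rel: relative permittivity increment (dimensionless)
    c_m: membrane capacitance per area (F/m^2); ~9 mF/m^2 is typical
    """
    return 4.0 * delta_eps_rel * EPS0 / (9.0 * cell_radius_m * c_m)

# A permittivity increment of 1000 for 5-um-radius cells
print(viable_biovolume_fraction(1000.0, 5e-6))  # ~0.087 (8.7% biovolume)
```

Dead cells with ruptured membranes contribute no membrane polarization, so they drop out of the measured increment, which is the basis for the viability information mentioned in the abstract.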

  12. Designing for knowledge: bridging socio-hydrological monitoring and beyond

    NASA Astrophysics Data System (ADS)

    Mao, F.; Clark, J.; Buytaert, W.; Ochoa-Tocachi, B. F.; Hannah, D. M.

    2016-12-01

Many methods and applications have been developed to research socio-hydrological systems, such as participatory monitoring, environmental big data processing and sensor network data transmission. However, these data-centred activities are insufficient to guarantee successful knowledge co-generation, decision making or governance. This research suggests a shift of attention in designing socio-hydrological monitoring tools, from designing for data to designing for knowledge (DfK). Compared to the former strategy, DfK has at least three features. (1) Why monitor? DfK demands that the data produced by a newly introduced monitoring application have the potential to generate socio-hydrological knowledge that supports decision making or management. This means that when designing a monitoring tool, we should answer not only how to collect data, but also questions such as how to best use the collected data in the form of knowledge. (2) What is the role of monitoring? DfK admits that the socio-hydrological data and knowledge generated by monitoring are just one of many kinds of input to decision making and management. This means that the importance of monitoring and scientific evidence should not be overestimated, and knowledge co-generation and synthesis should be considered in advance in the monitoring design process. (3) Who participates? DfK implies a wider engagement of stakeholders, not restricted to volunteers as data collectors and providers and to scientist and researcher communities as the main data users. It requires a broader consideration of users, including not only data collectors, processors and interpreters, but also local and indigenous knowledge providers, and decision makers who use the knowledge and data. In summary, this research proposes a knowledge-centred strategy for designing participatory socio-hydrological monitoring tools, in order to make monitoring more useful and effective.

  13. Preliminary Development of Real Time Usage-Phase Monitoring System for CNC Machine Tools with a Case Study on CNC Machine VMC 250

    NASA Astrophysics Data System (ADS)

    Budi Harja, Herman; Prakosa, Tri; Raharno, Sri; Yuwana Martawirya, Yatna; Nurhadi, Indra; Setyo Nogroho, Alamsyah

    2018-03-01

The production characteristics of the job-shop industry, in which products have wide variety but small volumes, mean that every machine tool is shared to conduct production processes under dynamic load. This dynamic operating condition directly affects the reliability of machine tool components. Hence, the maintenance schedule for every component should be calculated based on the actual usage of machine tool components. This paper describes a study on the development of a monitoring system for obtaining information about each CNC machine tool component's usage in real time, approached by component grouping based on operation phase. A special device has been developed for monitoring machine tool component usage by utilizing usage-phase activity data taken from certain electronic components within the CNC machine: the adaptor, servo driver and spindle driver, as well as some additional components such as a microcontroller and relays. The obtained data are utilized for detecting machine utilization phases such as the power-on state, machine-ready state or spindle-running state. Experimental results have shown that the developed CNC machine tool monitoring system is capable of obtaining phase information of machine tool usage as well as its duration, and displays the information in the user interface application.
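The usage-phase detection described above can be sketched as a simple mapping from tapped boolean signals to a phase label, with durations summed from periodic samples. The signal names and classification rules below are assumptions for illustration, not the authors' actual wiring.

```python
# Map sampled machine signals to usage phases and accumulate durations.
from collections import Counter

def classify_phase(power_on, servo_ready, spindle_running):
    """Hypothetical phase rules from adaptor/servo/spindle signals."""
    if not power_on:
        return "off"
    if spindle_running:
        return "spindle running"
    if servo_ready:
        return "machine ready"
    return "power on"

# Relay states sampled once per second: (power, servo, spindle)
samples = [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 1, 1), (1, 1, 0)]
durations = Counter(classify_phase(*s) for s in samples)
print(durations["spindle running"])  # 2 (seconds)
```

Per-phase durations accumulated this way give each component's actual usage time, which the paper proposes as the basis for maintenance scheduling.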

  14. Adapting HIV patient and program monitoring tools for chronic non-communicable diseases in Ethiopia.

    PubMed

    Letebo, Mekitew; Shiferaw, Fassil

    2016-06-02

Chronic non-communicable diseases (NCDs) have become a huge public health concern in developing countries. Many resource-poor countries facing this growing epidemic, however, lack systems for an organized and comprehensive response to NCDs. Lack of national NCD policy, strategies, treatment guidelines, and surveillance and monitoring systems is a feature of health systems in many developing countries. Successfully responding to the problem requires a number of actions by the countries, including developing context-appropriate chronic care models and programs and standardization of patient and program monitoring tools. In this cross-sectional qualitative study we assessed existing monitoring and evaluation (M&E) tools used for NCD services in Ethiopia. Since the HIV care and treatment program is the only large-scale chronic care program in the country, we explored the M&E tools being used in the program and analyzed how these tools might be adapted to support NCD services in the country. Document review and in-depth interviews were the main data collection methods used. The interviews were held with health workers and staff involved in data management, purposively selected from four health facilities with high HIV and NCD patient loads. Thematic analysis was employed to make sense of the data. Our findings indicate an apparent lack of information systems for NCD services, including the absence of standardized patient and program monitoring tools to support the services. We identified several HIV care and treatment patient and program monitoring tools currently being used to facilitate the intake process, enrolment, follow up, cohort monitoring, appointment keeping, analysis and reporting. An analysis of how each tool being used for HIV patient and program monitoring can be adapted for supporting NCD services is presented.
Given the similarity between HIV care and treatment and NCD services and the huge investment already made to implement standardized tools for HIV care and treatment program, adaptation and use of HIV patient and program monitoring tools for NCD services can improve NCD response in Ethiopia through structuring services, standardizing patient care and treatment, supporting evidence-based planning and providing information on effectiveness of interventions.

  15. Acoustic emission detection of macro-cracks on engraving tool steel inserts during the injection molding cycle using PZT sensors.

    PubMed

    Svečko, Rajko; Kusić, Dragan; Kek, Tomaž; Sarjaš, Andrej; Hančič, Aleš; Grum, Janez

    2013-05-14

    This paper presents an improved monitoring system for the failure detection of engraving tool steel inserts during the injection molding cycle. This system uses acoustic emission PZT sensors mounted through acoustic waveguides on the engraving insert. We were thus able to clearly distinguish the defect through measured AE signals. Two engraving tool steel inserts were tested during the production of standard test specimens, each under the same processing conditions. By closely comparing the captured AE signals on both engraving inserts during the filling and packing stages, we were able to detect the presence of macro-cracks on one engraving insert. Gabor wavelet analysis was used for closer examination of the captured AE signals' peak amplitudes during the filling and packing stages. The obtained results revealed that such a system could be used successfully as an improved tool for monitoring the integrity of an injection molding process.

  16. Acoustic Emission Detection of Macro-Cracks on Engraving Tool Steel Inserts during the Injection Molding Cycle Using PZT Sensors

    PubMed Central

    Svečko, Rajko; Kusić, Dragan; Kek, Tomaž; Sarjaš, Andrej; Hančič, Aleš; Grum, Janez

    2013-01-01

    This paper presents an improved monitoring system for the failure detection of engraving tool steel inserts during the injection molding cycle. This system uses acoustic emission PZT sensors mounted through acoustic waveguides on the engraving insert. We were thus able to clearly distinguish the defect through measured AE signals. Two engraving tool steel inserts were tested during the production of standard test specimens, each under the same processing conditions. By closely comparing the captured AE signals on both engraving inserts during the filling and packing stages, we were able to detect the presence of macro-cracks on one engraving insert. Gabor wavelet analysis was used for closer examination of the captured AE signals' peak amplitudes during the filling and packing stages. The obtained results revealed that such a system could be used successfully as an improved tool for monitoring the integrity of an injection molding process. PMID:23673677

  17. Overview of 'Omics Technologies for Military Occupational Health Surveillance and Medicine.

    PubMed

    Bradburne, Christopher; Graham, David; Kingston, H M; Brenner, Ruth; Pamuku, Matt; Carruth, Lucy

    2015-10-01

    Systems biology ('omics) technologies are emerging as tools for the comprehensive analysis and monitoring of human health. In order for these tools to be used in military medicine, clinical sampling and biobanking will need to be optimized to be compatible with downstream processing and analysis for each class of molecule measured. This article provides an overview of 'omics technologies, including instrumentation, tools, and methods, and their potential application for warfighter exposure monitoring. We discuss the current state and the potential utility of personalized data from a variety of 'omics sources including genomics, epigenomics, transcriptomics, metabolomics, proteomics, lipidomics, and efforts to combine their use. Issues in the "sample-to-answer" workflow, including collection and biobanking are discussed, as well as national efforts for standardization and clinical interpretation. Establishment of these emerging capabilities, along with accurate xenobiotic monitoring, for the Department of Defense could provide new and effective tools for environmental health monitoring at all duty stations, including deployed locations. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  18. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    PubMed

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-05-01

This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (without contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state-of-the-art monitoring techniques. The whole freeze drying process, including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying, was considered. We found that direct measurement of the transferred heat enables more insight into the thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying in the way that monitoring the residual moisture content can. In conclusion, heat flux measurement is a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
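The heat-flux relation behind this approach can be sketched directly: a vial heat transfer coefficient Kv links the measured heat flux to the shelf-to-product temperature difference, so once Kv is known the product temperature can be inferred non-invasively. The numbers below are illustrative, not data from the study.

```python
# Vial heat transfer coefficient and inferred product temperature.

def vial_heat_transfer_coefficient(q, t_shelf, t_product):
    """Kv = q / (T_shelf - T_product); q in W/m^2, Kv in W/(m^2 K)."""
    return q / (t_shelf - t_product)

def product_temperature(q, t_shelf, kv):
    """Invert the relation: T_product = T_shelf - q / Kv."""
    return t_shelf - q / kv

# Calibrate Kv once from a run with a thermocouple in the vial...
kv = vial_heat_transfer_coefficient(q=150.0, t_shelf=-10.0, t_product=-25.0)
print(kv)                                     # 10.0 W/(m^2 K)
# ...then infer product temperature from heat flux alone.
print(product_temperature(120.0, -10.0, kv))  # -22.0 degC
```

This inversion only holds while sublimation dominates the heat balance, which is consistent with the study's finding that the method tracks primary but not secondary drying.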

  19. A New Tool for Industry

    NASA Technical Reports Server (NTRS)

    1981-01-01

The ultrasonic P2L2 bolt monitor is a new industrial tool, developed at Langley Research Center, which is lightweight, portable, extremely accurate because it is not subject to friction error, and cost-competitive with the least expensive of other types of accurate strain monitors. P2L2 is an acronym for Pulse Phase Locked Loop. The ultrasound system, which measures the stress that occurs when a bolt becomes elongated in the process of tightening, transmits sound waves to the bolt being fastened and receives a return signal indicating changes in bolt stress. Results are translated into a digital reading of the actual stress on the bolt. The device monitors the bolt tensioning process on mine roof bolts, providing increased safety within the mine. It also has utility in industrial applications.

  20. Detection of Collapse and Crystallization of Saccharide, Protein, and Mannitol Formulations by Optical Fibers in Lyophilization

    PubMed Central

    Horn, Jacqueline; Friess, Wolfgang

    2018-01-01

The collapse temperature (Tc) and the glass transition temperature of freeze-concentrated solutions (Tg') as well as the crystallization behavior of excipients are important physicochemical characteristics which guide cycle development in freeze-drying. The most frequently used methods to determine these values are differential scanning calorimetry (DSC) and freeze-drying microscopy (FDM). The objective of this study was to evaluate the optical fiber system (OFS) unit as an alternative tool for the analysis of Tc, Tg' and crystallization events. The OFS unit was also tested as a potential online monitoring tool during freeze-drying. Freeze/thawing and freeze-drying experiments of sucrose, trehalose, stachyose, mannitol, and highly concentrated IgG1 and lysozyme solutions were carried out and monitored by the OFS. Comparative analyses were performed by DSC and FDM. OFS and FDM results correlated well. The crystallization behavior of mannitol could be monitored by the OFS during freeze/thawing, as it can be by DSC. Online monitoring of freeze-drying runs detected collapse of amorphous saccharide matrices. The OFS unit enabled the analysis of both Tc and crystallization processes, which is usually carried out by FDM and DSC; the OFS can hence be used as a novel measuring device. Additionally, detection of these events during lyophilization facilitates online monitoring. Thus the OFS is a new and beneficial tool for the development and monitoring of freeze-drying processes. PMID:29435445

  1. Detection of Collapse and Crystallization of Saccharide, Protein and Mannitol Formulations by Optical Fibers in Lyophilization

    NASA Astrophysics Data System (ADS)

    Horn, Jacqueline; Friess, Wolfgang

    2018-01-01

    The collapse temperature (Tc) and the glass transition temperature of freeze-concentrated solutions (Tg’) as well as the crystallization behavior of excipients are important physicochemical characteristics which guide cycle development in freeze-drying. The most frequently used methods to determine these values are differential scanning calorimetry (DSC) and freeze-drying microscopy (FDM). The objective of this study was to evaluate the optical fiber system (OFS) unit as an alternative tool for the analysis of Tc, Tg’ and crystallization events. The OFS unit was also tested as a potential online monitoring tool during freeze-drying. Freeze/thawing and freeze-drying experiments of sucrose, trehalose, stachyose, mannitol and highly concentrated IgG1 and lysozyme solutions were carried out and monitored by the OFS. Comparative analyses were performed by DSC and FDM. OFS and FDM results correlated well. The crystallization behavior of mannitol could be monitored by the OFS during freeze/thawing, as it can be by DSC. Online monitoring of freeze-drying runs detected collapse of amorphous saccharide matrices. The OFS unit enabled the analysis of both Tc and crystallization processes, which is usually carried out by FDM and DSC. The OFS can hence be used as a novel measuring device. Additionally, detection of these events during lyophilization facilitates online monitoring. Thus, the OFS is a beneficial new tool for the development and monitoring of freeze-drying processes.

  2. MURMoT. Design and Application of Microbial Uranium Reduction Monitoring Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loeffler, Frank E.

    2014-12-31

    Uranium (U) contamination in the subsurface is a major remediation challenge at many DOE sites. Traditional site remedies present enormous costs to DOE; hence, enhanced bioremediation technologies (i.e., biostimulation and bioaugmentation) combined with monitoring efforts are being considered as cost-effective corrective actions to address subsurface contamination. This research effort improved understanding of the microbial U reduction process and developed new tools for monitoring microbial activities. Application of these tools will promote science-based site management decisions that achieve contaminant detoxification, plume control, and long-term stewardship in the most efficient manner. The overarching hypothesis was that the design, validation and application of a suite of new molecular and biogeochemical tools advance process understanding, and improve environmental monitoring regimes to assess and predict in situ U immobilization. Accomplishments: This project (i) advanced nucleic acid-based approaches to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-detoxifying bacteria; (ii) developed proteomics workflows for detection of metal reduction biomarker proteins in laboratory cultures and contaminated site groundwater; (iii) developed and demonstrated the utility of U isotopic fractionation using high precision mass spectrometry to quantify U(VI) reduction for a range of reduction mechanisms and environmental conditions; and (iv) validated the new tools using field samples from U-contaminated IFRC sites, and demonstrated their prognostic and diagnostic capabilities in guiding decision making for environmental remediation and long-term site stewardship.

  3. The Health Impact Assessment (HIA) Resource and Tool Compilation

    EPA Pesticide Factsheets

    The compilation includes tools and resources related to the HIA process and can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation.

  4. ECOSYSTEM RESEARCH: THE WESTERN PILOT

    EPA Science Inventory

    The products of this research include tools, monitoring data and assessments. Tools include biological indicators and a process for setting expectation or reference conditions against which to evaluate the indicators. It will also include a prioritized set of indicators on anthr...

  5. MODSNOW-Tool: an operational tool for daily snow cover monitoring using MODIS data

    NASA Astrophysics Data System (ADS)

    Gafurov, Abror; Lüdtke, Stefan; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Schöne, Tilo; Schmidt, Sebastian; Kalashnikova, Olga; Merz, Bruno

    2017-04-01

    Spatially distributed snow cover information in mountain areas is extremely important for water storage estimations, seasonal water availability forecasting, or the assessment of snow-related hazards (e.g. enhanced snow-melt following intensive rains, or avalanche events). Moreover, spatially distributed snow cover information can be used to calibrate and/or validate hydrological models. We present the MODSNOW-Tool, an operational monitoring tool that offers a user-friendly application for catchment-based operational snow cover monitoring. The application automatically downloads and processes freely available daily Moderate Resolution Imaging Spectroradiometer (MODIS) snow cover data. The MODSNOW-Tool uses a step-wise approach for cloud removal and delivers cloud-free snow cover maps for the selected river basins, including basin-specific snow cover extent statistics. The accuracy of cloud-eliminated MODSNOW snow cover maps was validated for 84 almost cloud-free days in the Karadarya river basin in Central Asia, and an average accuracy of 94 % was achieved. The MODSNOW-Tool can be used in operational and non-operational mode. In the operational mode, the tool is set up as a scheduled task on a local computer, allowing automatic execution without user interaction, and delivers snow cover maps on a daily basis. In the non-operational mode, the tool can be used to process historical time series of snow cover maps. The MODSNOW-Tool is currently implemented and in use at the national hydrometeorological services of four Central Asian states (Kazakhstan, Kyrgyzstan, Uzbekistan and Turkmenistan), where it supports seasonal water availability forecasts.
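    One step of a temporal cloud-removal scheme of the step-wise kind described above can be sketched as follows (this is not the actual MODSNOW code): a pixel flagged cloudy today is back-filled from the most recent cloud-free observation.

```python
# Hedged sketch of temporal cloud filling for daily snow maps. Class codes
# and data layout are invented for illustration.

CLOUD, SNOW, LAND = -1, 1, 0  # illustrative class codes

def temporal_fill(maps):
    """maps: list of daily snow maps (lists of pixel codes), oldest first.
    Returns the latest map with cloudy pixels back-filled in time."""
    latest = list(maps[-1])
    for i, value in enumerate(latest):
        if value == CLOUD:
            # walk backwards through previous days for a clear view
            for day in reversed(maps[:-1]):
                if day[i] != CLOUD:
                    latest[i] = day[i]
                    break
    return latest

filled = temporal_fill([[SNOW, LAND, SNOW],
                        [CLOUD, SNOW, CLOUD],
                        [CLOUD, CLOUD, LAND]])
# pixel 0 is cloudy today and yesterday, so it takes SNOW from two days ago
```

    A full scheme would add spatial neighbors and elevation-based rules before falling back on older observations.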

  6. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, that creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single-time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad-pixel removal, multiple-satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
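    One common temporal-processing step for suppressing cloud-induced drops in vegetation-index series is maximum-value compositing, sketched below; the actual TSPT filters are more sophisticated, and the values are invented.

```python
# Minimal sketch of maximum-value compositing (MVC) for an NDVI time series.
# Clouds depress NDVI, so the per-window maximum is the least-contaminated view.

def max_value_composite(ndvi_series, window):
    """Collapse a daily NDVI series into per-window maxima."""
    return [max(ndvi_series[i:i + window])
            for i in range(0, len(ndvi_series), window)]

daily = [0.61, 0.12, 0.58, 0.65, 0.09, 0.63]   # 0.12 and 0.09 are cloudy days
composited = max_value_composite(daily, 3)      # -> [0.61, 0.65]
```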

  7. Monitoring real-time navigation processes using the automated reasoning tool (ART)

    NASA Technical Reports Server (NTRS)

    Maletz, M. C.; Culbert, C. J.

    1985-01-01

    An expert system is described for monitoring and controlling navigation processes in real-time. The ART-based system features data-driven computation, accommodation of synchronous and asynchronous data, temporal modeling for individual time intervals and chains of time intervals, and hypothetical reasoning capabilities that consider alternative interpretations of the state of navigation processes. The concept is illustrated in terms of the NAVEX system for monitoring and controlling the high speed ground navigation console for Mission Control at Johnson Space Center. The reasoning processes are outlined, including techniques used to consider alternative data interpretations. Installation of the system has permitted using a single operator, instead of three, to monitor the ascent and entry phases of a Shuttle mission.

  8. An integrated process analytical technology (PAT) approach to monitoring the effect of supercooling on lyophilization product and process parameters of model monoclonal antibody formulations.

    PubMed

    Awotwe Otoo, David; Agarabi, Cyrus; Khan, Mansoor A

    2014-07-01

    The aim of the present study was to apply an integrated process analytical technology (PAT) approach to control and monitor the effect of the degree of supercooling on critical process and product parameters of a lyophilization cycle. Two concentrations of a mAb formulation were used as models for lyophilization. ControLyo™ technology was applied to control the onset of ice nucleation, whereas tunable diode laser absorption spectroscopy (TDLAS) was utilized as a noninvasive tool for the inline monitoring of the water vapor concentration and vapor flow velocity in the spool during primary drying. The instantaneous measurements were then used to determine the effect of the degree of supercooling on critical process and product parameters. Controlled nucleation resulted in uniform nucleation at lower degrees of supercooling for both formulations, higher sublimation rates, lower mass transfer resistance, lower product temperatures at the sublimation interface, and shorter primary drying times compared with the conventional shelf-ramped freezing. Controlled nucleation also resulted in lyophilized cakes with more elegant and porous structure with no visible collapse or shrinkage, lower specific surface area, and shorter reconstitution times compared with the uncontrolled nucleation. Uncontrolled nucleation however resulted in lyophilized cakes with relatively lower residual moisture contents compared with controlled nucleation. TDLAS proved to be an efficient tool to determine the endpoint of primary drying. There was good agreement between data obtained from TDLAS-based measurements and SMART™ technology. ControLyo™ technology and TDLAS showed great potential as PAT tools to achieve enhanced process monitoring and control during lyophilization cycles. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
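    The conversion from TDLAS line-of-sight data to a sublimation rate can be sketched as mass flow = vapor concentration × flow velocity × spool cross-sectional area; the numbers below are illustrative, not values from the study.

```python
# Hedged sketch of a TDLAS-style sublimation-rate estimate. All parameter
# values are invented for illustration.

import math

def sublimation_rate_g_per_h(conc_g_m3, velocity_m_s, duct_diameter_m):
    """Mass flow through the spool: concentration x velocity x area, in g/h."""
    area_m2 = math.pi * (duct_diameter_m / 2) ** 2
    return conc_g_m3 * velocity_m_s * area_m2 * 3600.0  # g/s -> g/h

rate = sublimation_rate_g_per_h(conc_g_m3=0.05, velocity_m_s=20.0,
                                duct_diameter_m=0.15)
```

    Integrating this rate over time gives the total water removed, which is how a TDLAS-based endpoint for primary drying can be inferred.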

  9. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was equally sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.

  10. Ground-penetrating radar: A tool for monitoring bridge scour

    USGS Publications Warehouse

    Anderson, N.L.; Ismael, A.M.; Thitimakorn, T.

    2007-01-01

    Ground-penetrating radar (GPR) data were acquired across shallow streams and/or drainage ditches at 10 bridge sites in Missouri by maneuvering the antennae across the surface of the water and riverbank from the bridge deck, manually or by boat. The acquired two-dimensional and three-dimensional data sets accurately image the channel bottom, demonstrating that the GPR tool can be used to estimate and/or monitor water depths in shallow fluvial environments. The study results demonstrate that the GPR tool is a safe and effective tool for measuring and/or monitoring scour in proximity to bridges. The technique can be used to safely monitor scour at assigned time intervals during peak flood stages, thereby enabling owners to take preventative action prior to potential failure. The GPR tool can also be used to investigate depositional and erosional patterns over time, thereby elucidating these processes on a local scale. In certain instances, in-filled scour features can also be imaged and mapped. This information may be critically important to those engaged in bridge design. GPR has advantages over other tools commonly employed for monitoring bridge scour (reflection seismic profiling, echo sounding, and electrical conductivity probing). The tool does not need to be coupled to the water, can be moved rapidly across (or above) the surface of a stream, and provides an accurate depth-structure model of the channel bottom and subchannel bottom sediments. The GPR profiles can be extended across emerged sand bars or onto the shore.
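    The depth estimate underlying GPR channel-bottom imaging follows from textbook wave propagation: velocity in the medium is c/√εr, and depth is velocity × two-way travel time / 2. The sketch below uses the textbook relative permittivity of fresh water, not calibration values from the study.

```python
# Simple sketch of GPR reflector depth from two-way travel time.

C = 0.2998  # speed of light in vacuum, m/ns

def reflector_depth_m(two_way_time_ns, rel_permittivity=81.0):
    """81 is the approximate relative permittivity of fresh water."""
    velocity = C / rel_permittivity ** 0.5   # radar velocity in the medium, m/ns
    return velocity * two_way_time_ns / 2.0

depth = reflector_depth_m(120.0)   # a 120 ns echo corresponds to ~2 m of water
```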

  11. In-situ Frequency Dependent Dielectric Sensing of Cure

    NASA Technical Reports Server (NTRS)

    Kranbuehl, David E.

    1996-01-01

    With the expanding use of polymeric materials as composite matrices, adhesives, coatings and films, the need to develop low-cost, automated fabrication processes that produce consistently high-quality parts is critical. Essential to the development of reliable, automated, intelligent processing is the ability to continuously monitor the changing state of the polymeric resin in-situ in the fabrication tool. This final report discusses work on developing dielectric sensing to monitor polymeric material cure, which provides a fundamental understanding of the underlying science for the use of frequency-dependent dielectric sensors to monitor the cure process.
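    A standard quantity extracted in frequency-dependent dielectric cure monitoring is the ionic conductivity, σ = ε₀ ε″ ω, whose decline tracks rising viscosity as the resin cures; the sketch below uses illustrative values and is not the author's instrumentation code.

```python
# Hedged sketch: ionic conductivity from a measured dielectric loss factor.
# Loss-factor values below are invented for illustration.

import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def ionic_conductivity(loss_factor, frequency_hz):
    """sigma = eps0 * eps'' * omega, in S/m."""
    return EPS0 * loss_factor * 2 * math.pi * frequency_hz

sigma_early = ionic_conductivity(loss_factor=50.0, frequency_hz=1_000.0)
sigma_cured = ionic_conductivity(loss_factor=0.5, frequency_hz=1_000.0)
# conductivity drops as the resin gels and ion mobility falls
```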

  12. Tool Condition Monitoring in Micro-End Milling using wavelets

    NASA Astrophysics Data System (ADS)

    Dubey, N. K.; Roushan, A.; Rao, U. S.; Sandeep, K.; Patra, K.

    2018-04-01

    In this work, a Tool Condition Monitoring (TCM) strategy is developed for micro-end milling of titanium alloy and mild steel work-pieces. Full-immersion slot milling experiments are conducted using a solid tungsten carbide end mill for more than 1900 s to accumulate a reasonable amount of tool wear. During the micro-end milling process, cutting force and vibration signals are acquired using a Kistler piezo-electric 3-component force dynamometer (9256C2) and an accelerometer (NI cDAQ-9188), respectively. The force components and the vibration signals are processed using the Discrete Wavelet Transformation (DWT) in both the time and frequency windows. A 5-level wavelet packet decomposition using the Db-8 wavelet is carried out and the detail coefficients D1 to D5 for each signal are obtained. The results of the wavelet transformation are correlated with tool wear. The vibration signals are de-noised for the higher-frequency components (D1) and the force signals for the lower-frequency components (D5). An increasing Mean Absolute Deviation (MAD) of the detail coefficients across successive channels indicated tool wear. The tool wear predictions are confirmed against the actual wear observed in SEM images of the worn tool.
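    As a toy illustration of the wear indicator above, the sketch below computes a one-level discrete wavelet transform (a Haar wavelet for brevity, rather than the 5-level Db-8 packet decomposition the authors used) and the mean absolute deviation (MAD) of the detail coefficients; all signal values are made up.

```python
# Hedged sketch of a wavelet-based wear indicator using a Haar DWT.

def haar_dwt(signal):
    """One-level Haar DWT: returns (approximation, detail) coefficient lists."""
    s = 2 ** -0.5
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) * s for a, b in pairs]
    detail = [(a - b) * s for a, b in pairs]
    return approx, detail

def mad(values):
    """Mean absolute deviation about the mean."""
    m = sum(values) / len(values)
    return sum(abs(v - m) for v in values) / len(values)

# A fresh tool (small, uneven high-frequency content) versus a heavily worn
# one (the same pattern amplified): the detail-coefficient MAD grows with wear.
fresh = [0.0, 0.1, 0.0, 0.2, 0.0, 0.1, 0.0, 0.2]
worn = [10 * x for x in fresh]
_, d_fresh = haar_dwt(fresh)
_, d_worn = haar_dwt(worn)
```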

  13. Toward a More Flexible Web-Based Framework for Multidisciplinary Design

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Salas, A. O.

    1999-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary design, is defined as a hardware-software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, monitoring, controlling, and displaying the design process. The objective of this research is to explore how Web technology can improve these areas of weakness and lead toward a more flexible framework. This article describes a Web-based system that optimizes and controls the execution sequence of design processes in addition to monitoring the project status and displaying the design results.

  14. AgriSense-STARS: Advancing Methods of Agricultural Monitoring for Food Security in Smallholder Regions - the Case for Tanzania

    NASA Astrophysics Data System (ADS)

    Dempewolf, J.; Becker-Reshef, I.; Nakalembe, C. L.; Tumbo, S.; Maurice, S.; Mbilinyi, B.; Ntikha, O.; Hansen, M.; Justice, C. J.; Adusei, B.; Kongo, V.

    2015-12-01

    In-season monitoring of crop conditions provides critical information for agricultural policy and decision making and, most importantly, for food security planning and management. Nationwide agricultural monitoring in countries dominated by smallholder farming systems generally relies on extensive networks of field data collectors. In Tanzania, extension agents make up this network and report on conditions across the country, approaching a "near-census". Data are collected on paper, which is resource- and time-intensive as well as prone to errors. Data quality is ambiguous, and there is a general lack of clear and functional feedback loops between farmers, extension agents, analysts and decision makers. Moreover, the data are not spatially explicit, limiting their usefulness for analysis and the quality of policy outcomes. Despite significant advances in remote sensing and information communication technologies (ICT) for monitoring agriculture, the full potential of these new tools has yet to be realized in Tanzania. Their use is constrained by the lack of resources, skills and infrastructure to access and process these data. The use of ICT for data collection, processing and analysis is equally limited. The AgriSense-STARS project is developing and testing a system for national-scale in-season monitoring of smallholder agriculture using a combination of three main tools: 1) GLAM-East Africa, an automated MODIS satellite image processing system; 2) field data collection using GeoODK and unmanned aerial vehicles (UAVs); and 3) the Tanzania Crop Monitor, a collaborative online portal for data management and reporting. These tools are developed and applied in Tanzania through the National Food Security Division of the Ministry of Agriculture, Food Security and Cooperatives (MAFC) within a statistically representative sampling framework (area frame) that ensures data quality, representativeness and resource efficiency.

  15. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    PubMed

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A

    2015-12-01

    Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature, held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability, and stability, while retaining active pharmaceutical ingredient bioactivity. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amounts of the initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
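    The batch-monitoring idea can be caricatured in a strongly simplified, pure-Python sketch: from normal-operating-condition (NOC) batches, build a mean trajectory with ±3σ control limits per time point and flag excursions. The study projects NIR spectra with PCA first; that step is omitted here, and all numbers are invented.

```python
# Hedged sketch of per-time-point control limits for batch process monitoring.

def control_limits(noc_batches):
    """noc_batches: list of equal-length trajectories of a process statistic.
    Returns (lower, upper) 3-sigma limits for each time point."""
    n = len(noc_batches)
    limits = []
    for t in range(len(noc_batches[0])):
        vals = [b[t] for b in noc_batches]
        mean = sum(vals) / n
        sd = (sum((v - mean) ** 2 for v in vals) / (n - 1)) ** 0.5
        limits.append((mean - 3 * sd, mean + 3 * sd))
    return limits

def out_of_control(batch, limits):
    """Indices of time points where the batch leaves the control envelope."""
    return [t for t, (lo, hi) in enumerate(limits)
            if not lo <= batch[t] <= hi]

noc = [[1.0, 2.0, 3.1], [1.1, 2.1, 3.0], [0.9, 1.9, 2.9]]
faulty = [1.0, 2.0, 5.0]           # deviates at the last time point
alarms = out_of_control(faulty, control_limits(noc))   # -> [2]
```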

  16. Diamond tool wear detection method using cutting force and its power spectrum analysis in ultra-precision fly cutting

    NASA Astrophysics Data System (ADS)

    Zhang, G. Q.; To, S.

    2014-08-01

    Cutting force and its power spectrum analysis is considered an effective method for monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little comparable research exists for ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics at different tool wear stages. Results reveal that the cutting force increases as tool wear progresses. The cutting force signals at different tool wear stages were analyzed using power spectrum analysis. The analysis indicates that a characteristic frequency does exist in the power spectrum of the cutting force, whose power spectral density increases with increasing tool wear; this characteristic frequency could be adopted to monitor diamond tool wear in ultra-precision fly cutting.
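    The power-spectrum step can be illustrated with a plain DFT of a synthetic force record and a search for the dominant non-DC bin; a real implementation would use an FFT, and the signal below is invented.

```python
# Hedged sketch: power spectrum of a cutting-force record via a plain O(n^2)
# DFT (dependency-free; use an FFT in practice), and the dominant frequency bin.

import cmath
import math

def power_spectrum(samples):
    """Power in bins 0..n/2 of a real-valued record."""
    n = len(samples)
    ps = []
    for k in range(n // 2 + 1):
        coeff = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(samples))
        ps.append(abs(coeff) ** 2 / n)
    return ps

def dominant_bin(samples):
    """Frequency bin with the largest power, excluding the DC bin."""
    ps = power_spectrum(samples)
    return max(range(1, len(ps)), key=ps.__getitem__)

# Synthetic "cutting force": a mean load plus a ripple at 8 cycles per record
sig = [5.0 + 0.3 * math.sin(2 * math.pi * 8 * i / 64) for i in range(64)]
peak = dominant_bin(sig)   # the ripple appears as the dominant bin (8)
```

    Tracking the power in such a characteristic bin over successive records is the essence of the monitoring idea described in the abstract.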

  17. Decision support tool for diagnosing the source of variation

    NASA Astrophysics Data System (ADS)

    Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin

    2017-08-01

    Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOV. However, proper interpretation of CCPs and their associated SOV requires a highly skilled industrial practitioner, and a lack of process engineering knowledge can lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is embedded in a CCP recognition scheme. The design methodology involves analysis of the relationships between geometrical features, manufacturing processes and CCPs. The DST contains information about CCPs and their possible root-cause errors, together with descriptions of SOV phenomena such as process deterioration due to tool bluntness, tool offset, loading error, and changes in material hardness. The DST will be useful to industrial practitioners for effective troubleshooting.
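    The lookup idea behind such a DST can be caricatured in a few lines; the pattern names and candidate causes below merely paraphrase the abstract's examples, and a real tool would encode far more diagnostic knowledge.

```python
# Toy illustration of a DST-style mapping from a recognized control-chart
# pattern to candidate sources of variation. Rule contents are illustrative.

DIAGNOSIS_RULES = {
    "uptrend":      ["tool bluntness (gradual wear)",
                     "material hardness drifting up"],
    "sudden shift": ["tool offset", "loading error"],
    "cyclic":       ["fixture vibration", "periodic loading error"],
}

def diagnose(pattern):
    """Return candidate root causes for a recognized pattern."""
    return DIAGNOSIS_RULES.get(pattern, ["unknown pattern: consult engineer"])

causes = diagnose("uptrend")   # wear-related candidates
```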

  18. Risk based monitoring (RBM) tools for clinical trials: A systematic review.

    PubMed

    Hurley, Caroline; Shiely, Frances; Power, Jessica; Clarke, Mike; Eustace, Joseph A; Flanagan, Evelyn; Kearney, Patricia M

    2016-11-01

    In November 2016, the Integrated Addendum to ICH-GCP E6 (R2) will advise trial sponsors to develop a risk-based approach to clinical trial monitoring. This new process is commonly known as risk based monitoring (RBM). To date, a variety of tools have been developed to guide RBM. However, a gold standard approach does not exist. This review aims to identify and examine RBM tools. Review of published and grey literature using a detailed search strategy and cross-checking of reference lists. This review included academic and commercial instruments that met the Organisation for Economic Co-operation and Development (OECD) classification of RBM tools. Ninety-one potential RBM tools were identified and 24 were eligible for inclusion. These tools were published between 2000 and 2015. Eight tools were paper based or electronic questionnaires and 16 operated as Software as a Service (SaaS). Risk associated with the investigational medicinal product (IMP), phase of the clinical trial and study population were examined by all tools, and suitable mitigation guidance through on-site and centralised monitoring was provided. RBM tools for clinical trials are relatively new, their features and use vary widely, and they continue to evolve. This makes it difficult to identify the "best" RBM technique or tool. For example, equivalence testing is required to determine if RBM strategies directed by paper based and SaaS based RBM tools are comparable. Such research could be embedded within multi-centre clinical trials and conducted as a SWAT (Study within a Trial). Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Adaptive Management and Monitoring as Fundamental Tools to Effective Salt Marsh Restoration

    EPA Science Inventory

    Adaptive management as applied to ecological restoration is a systematic decision-making process in which the results of restoration activities are repeatedly monitored and evaluated to provide guidance that can be used in determining any necessary future restoration actions. In...

  20. In-line monitoring of a pharmaceutical blending process using FT-Raman spectroscopy.

    PubMed

    Vergote, G J; De Beer, T R M; Vervaet, C; Remon, J P; Baeyens, W R G; Diericx, N; Verpoort, F

    2004-03-01

    FT-Raman spectroscopy (in combination with a fibre optic probe) was evaluated as an in-line tool to monitor a blending process of diltiazem hydrochloride pellets and paraffinic wax beads. The mean square of differences (MSD) between two consecutive spectra was used to identify the time required to obtain a homogeneous mixture. A traditional end-sampling thief probe was used to collect samples, followed by HPLC analysis to verify the Raman data. Large variations were seen in the FT-Raman spectra logged during the initial minutes of the blending process using a binary mixture (ratio: 50/50, w/w) of diltiazem pellets and paraffinic wax beads (particle size: 800-1200 µm). The MSD profiles showed that a homogeneous mixture was obtained after about 15 min of blending. HPLC analysis confirmed these observations. The Raman data showed that the mixing kinetics depended on the particle size of the material and on the mixing speed. The results of this study proved that FT-Raman spectroscopy can be successfully implemented as an in-line monitoring tool for blending processes.
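    The MSD criterion lends itself to a compact sketch; the spectra and threshold below are invented, and a real implementation would operate on full Raman spectra.

```python
# Hedged sketch of the MSD blend-homogeneity criterion: the mean square of
# differences between consecutive spectra settles toward a noise floor once
# the blend is homogeneous. Threshold and data are illustrative.

def msd(spectrum_a, spectrum_b):
    """Mean square of differences between two spectra of equal length."""
    n = len(spectrum_a)
    return sum((a - b) ** 2 for a, b in zip(spectrum_a, spectrum_b)) / n

def blend_endpoint(spectra, threshold=1e-4):
    """Index of the first spectrum whose MSD to its predecessor falls below
    threshold, or None if the blend never stabilizes."""
    for i in range(1, len(spectra)):
        if msd(spectra[i - 1], spectra[i]) < threshold:
            return i
    return None

runs = [[0.2, 0.9], [0.5, 0.6], [0.52, 0.58], [0.521, 0.579]]
endpoint = blend_endpoint(runs)   # -> 3
```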

  1. Shortcomings of low-cost imaging systems for viewing computed radiographs.

    PubMed

    Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N

    2000-01-01

    To assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (a PC with a common display card and color monitor), and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing significantly improved the perception of low-contrast details irrespective of the monitor used. No significant difference in perception was observed between the monochrome and color monitors. Review at the radiological workstation was superior to review on the PC with image processing. The lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects perception of low-contrast details in computed radiographs; in this situation, spatial frequency and contrast processing are highly recommended. No significant quality gain was observed for the high-end monochrome monitor compared with the color display; however, the color monitor was more strongly affected by high ambient illumination.

  2. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates.

    PubMed

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-02-23

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with possible changes of the mechanical properties of the joints. In particular, two process parameters were considered for the tests: the welding tool rotation speed and the welding tool traverse speed. The quality of the joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of the joints and the process parameters, also proving the capability of infrared thermography for on-line monitoring of joint quality.

  3. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates

    PubMed Central

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-01-01

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with possible changes of the mechanical properties of the joints. In particular, two process parameters were considered for the tests: the welding tool rotation speed and the welding tool traverse speed. The quality of the joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of the joints and the process parameters, also proving the capability of infrared thermography for on-line monitoring of joint quality. PMID:28773246

  4. CÆLIS: software for assimilation, management and processing data of an atmospheric measurement network

    NASA Astrophysics Data System (ADS)

    Fuertes, David; Toledano, Carlos; González, Ramiro; Berjón, Alberto; Torres, Benjamín; Cachorro, Victoria E.; de Frutos, Ángel M.

    2018-02-01

    Given the importance of atmospheric aerosol, the number of instruments and measurement networks focusing on its characterization is growing. Many challenges derive from the standardization of protocols, the monitoring of instrument status to evaluate network data quality, and the manipulation and distribution of large volumes of data (raw and processed). CÆLIS is a software system which aims at simplifying the management of a network, providing tools for monitoring the instruments, processing the data in real time and offering the scientific community a new tool to work with the data. Since 2008 CÆLIS has been successfully applied to the photometer calibration facility managed by the University of Valladolid, Spain, in the framework of the Aerosol Robotic Network (AERONET). Thanks to the use of advanced tools, this facility has been able to analyze a growing number of stations and data in real time, which greatly benefits network management and data quality control. The present work describes the system architecture of CÆLIS and gives some examples of applications and data processing.

  5. Performance Monitoring of Chilled-Water Distribution Systems Using HVAC-Cx

    PubMed Central

    Ferretti, Natascha Milesi; Galler, Michael A.; Bushby, Steven T.

    2017-01-01

    In this research we develop, test, and demonstrate the newest extension of the software HVAC-Cx (NIST and CSTB 2014), an automated commissioning tool for detecting common mechanical faults and control errors in chilled-water distribution systems (loops). The commissioning process can improve occupant comfort, ensure the persistence of correct system operation, and reduce energy consumption. Automated tools support the process by decreasing the time and the skill level required to carry out necessary quality assurance measures, and as a result they enable more thorough testing of building heating, ventilating, and air-conditioning (HVAC) systems. This paper describes the algorithm, developed by the National Institute of Standards and Technology (NIST), to analyze chilled-water loops and presents the results of a passive monitoring investigation using field data obtained from BACnet® (ASHRAE 2016) controllers, together with field validation of the findings. The tool was successful in detecting faults in system operation in its first field implementation, supporting the investigation phase through performance monitoring. Its findings led to a full energy retrocommissioning of the field site. PMID:29167584

  6. Performance Monitoring of Chilled-Water Distribution Systems Using HVAC-Cx.

    PubMed

    Ferretti, Natascha Milesi; Galler, Michael A; Bushby, Steven T

    2017-01-01

    In this research we develop, test, and demonstrate the newest extension of the software HVAC-Cx (NIST and CSTB 2014), an automated commissioning tool for detecting common mechanical faults and control errors in chilled-water distribution systems (loops). The commissioning process can improve occupant comfort, ensure the persistence of correct system operation, and reduce energy consumption. Automated tools support the process by decreasing the time and the skill level required to carry out necessary quality assurance measures, and as a result they enable more thorough testing of building heating, ventilating, and air-conditioning (HVAC) systems. This paper describes the algorithm, developed by the National Institute of Standards and Technology (NIST), to analyze chilled-water loops and presents the results of a passive monitoring investigation using field data obtained from BACnet® (ASHRAE 2016) controllers, together with field validation of the findings. The tool was successful in detecting faults in system operation in its first field implementation, supporting the investigation phase through performance monitoring. Its findings led to a full energy retrocommissioning of the field site.
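    Rule-based fault detection of the kind HVAC-Cx automates can be illustrated with a toy rule (this is not HVAC-Cx's actual rule set; the field names, thresholds, and sample values are hypothetical): flag a sample when the coil valve is commanded nearly closed yet the loop still shows a large temperature rise, suggesting valve leakage.

```python
# Hypothetical BACnet trend-log samples: valve command (%) and loop
# supply/return water temperatures (deg C).
samples = [
    {"valve_pct": 2.0, "t_return_c": 13.4, "t_supply_c": 7.0},
    {"valve_pct": 80.0, "t_return_c": 12.8, "t_supply_c": 6.9},
    {"valve_pct": 1.0, "t_return_c": 7.2, "t_supply_c": 7.0},
]

def leaking_valve(s, closed_pct=5.0, dt_limit_c=2.0):
    """Nearly closed valve plus a large loop delta-T => possible leakage fault."""
    return s["valve_pct"] < closed_pct and (s["t_return_c"] - s["t_supply_c"]) > dt_limit_c

faults = [i for i, s in enumerate(samples) if leaking_valve(s)]
print("suspect samples:", faults)
```

    Passive monitoring then amounts to evaluating such rules continuously over trend-log data rather than during dedicated functional tests.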

  7. Adverse Effects of Nonsystemic Steroids (Inhaled, Intranasal, and Cutaneous): a Review of the Literature and Suggested Monitoring Tool.

    PubMed

    Gupta, Ratika; Fonacier, Luz S

    2016-06-01

    Inhaled, intranasal, and cutaneous steroids are prescribed by physicians for a plethora of disease processes including asthma and rhinitis. While the high efficacy of this class of medication is well known, the wide range of adverse effects, both local and systemic, is not well elucidated. It is imperative to monitor the total steroid burden in its varied forms, as well as to track possible side effects that may be caused by a high cumulative dose of steroids. This review article highlights the adverse effects of different steroid modalities and suggests a monitoring tool for determining total steroid burden and side effects.

  8. Characterization of photosynthetically active duckweed (Wolffia australiana) in vitro culture by Respiration Activity Monitoring System (RAMOS).

    PubMed

    Rechmann, Henrik; Friedrich, Andrea; Forouzan, Dara; Barth, Stefan; Schnabl, Heide; Biselli, Manfred; Boehm, Robert

    2007-06-01

    The feasibility of oxygen transfer rate (OTR) measurement to non-destructively monitor plant propagation and vitality of a photosynthetically active plant in vitro culture of duckweed (Wolffia australiana, Lemnaceae) was tested using the Respiration Activity Monitoring System (RAMOS). As a result, OTR proved to be a sensitive indicator of plant vitality. The culture characterization under day/night light conditions, however, revealed a complex interaction between oxygen production and consumption, rendering OTR measurement an unsuitable tool to track plant propagation. However, RAMOS was found to be a useful tool in preliminary studies for process development of photosynthetically active plant in vitro cultures.
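    The OTR quantity at the heart of the abstract can be sketched from first principles: under an ideal-gas assumption, a linear headspace oxygen partial-pressure drop during a closed measuring phase converts to moles of O2 taken up per liquid volume per time. This is an idealized illustration, not the RAMOS implementation; all volumes and the pressure slope are hypothetical.

```python
# Ideal-gas estimate of oxygen transfer rate from headspace pressure decline.
R = 8.314          # gas constant, J/(mol K)
T = 298.15         # temperature, K
V_gas = 0.010e-3   # headspace volume, m^3 (10 mL, hypothetical)
V_liq = 0.010e-3   # liquid volume, m^3 (10 mL, hypothetical)

dp_dt = -50.0 / 3600.0   # hypothetical O2 partial-pressure slope: -50 Pa over 1 h

# Moles of O2 consumed per m^3 of liquid per second (positive = uptake),
# from n = pV/(RT) differentiated in time.
otr = -dp_dt * V_gas / (R * T * V_liq)
otr_mmol_per_L_h = otr * 3600.0   # 1 mol/m^3 = 1 mmol/L
print(f"OTR ~ {otr_mmol_per_L_h:.3f} mmol/(L*h)")
```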

  9. Infrared thermography for condition monitoring - A review

    NASA Astrophysics Data System (ADS)

    Bagavathiappan, S.; Lahiri, B. B.; Saravanan, T.; Philip, John; Jayakumar, T.

    2013-09-01

    Temperature is one of the most common indicators of the structural health of equipment and components. Faulty machinery, corroded electrical connections, damaged material components, etc., can cause abnormal temperature distributions. By now, infrared thermography (IRT) has become a mature and widely accepted condition monitoring tool, in which the temperature is measured in real time in a non-contact manner. IRT enables early detection of equipment flaws and faulty industrial processes under operating conditions, thereby reducing system downtime, catastrophic breakdowns and maintenance costs. The last three decades have witnessed a steady growth in the use of IRT as a condition monitoring technique for civil structures, electrical installations, machinery and equipment, material deformation under various loading conditions, corrosion damage and welding processes. IRT has also found application in the nuclear, aerospace, food, paper, wood and plastic industries. With the advent of newer generations of infrared cameras, IRT is becoming a more accurate, reliable and cost-effective technique. This review focuses on the advances of IRT as a non-contact and non-invasive condition monitoring tool for machinery, equipment and processes. Various condition monitoring applications are discussed in detail, along with some basics of IRT, experimental procedures and data analysis techniques. Sufficient background information is also provided for beginners and non-experts for easy understanding of the subject.

  10. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining

    PubMed Central

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-01-01

    In manufacturing processes, the cutting force acquired by multi-component sensing systems applied to cutting tools is gradually becoming the most significant monitoring indicator. These signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems capable of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing. PMID:27854322

  11. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining.

    PubMed

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-11-16

    In manufacturing processes, the cutting force acquired by multi-component sensing systems applied to cutting tools is gradually becoming the most significant monitoring indicator. These signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems capable of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  12. Tools to manage the enterprise-wide picture archiving and communications system environment.

    PubMed

    Lannum, L M; Gumpf, S; Piraino, D

    2001-06-01

    The presentation will focus on the implementation and utilization of a central picture archiving and communications system (PACS) network-monitoring tool that allows for enterprise-wide operations management and support of the image distribution network. The MagicWatch (Siemens, Iselin, NJ) PACS/radiology information system (RIS) monitoring station from Siemens has allowed our organization to create a service support structure that has given us proactive control of our environment and has allowed us to meet the service level performance expectations of the users. The Radiology Help Desk has used the MagicWatch PACS monitoring station as an applications support tool that has allowed the group to monitor network activity and individual systems performance at each node. Fast and timely recognition of the effects of single events within the PACS/RIS environment has allowed the group to proactively recognize possible performance issues and resolve problems. The PACS/operations group performs network management control, image storage management, and software distribution management from a single, central point in the enterprise. The MagicWatch station allows for the complete automation of software distribution, installation, and configuration process across all the nodes in the system. The tool has allowed for the standardization of the workstations and provides a central configuration control for the establishment and maintenance of the system standards. This report will describe the PACS management and operation prior to the implementation of the MagicWatch PACS monitoring station and will highlight the operational benefits of a centralized network and system-monitoring tool.

  13. 4D-SFM Photogrammetry for Monitoring Sediment Dynamics in a Debris-Flow Catchment: Software Testing and Results Comparison

    NASA Astrophysics Data System (ADS)

    Cucchiaro, S.; Maset, E.; Fusiello, A.; Cazorzi, F.

    2018-05-01

    In recent years, the combination of Structure-from-Motion (SfM) algorithms and UAV-based aerial images has revolutionised 3D topographic surveys for natural environment monitoring, offering low-cost, fast and high-quality data acquisition and processing. Continuous monitoring of morphological changes through multi-temporal (4D) SfM surveys allows, e.g., analysis of torrent dynamics even in complex topographic environments like debris-flow catchments, provided that appropriate tools and procedures are employed in the data processing steps. In this work we test two different software packages (3DF Zephyr Aerial and Agisoft Photoscan) on a dataset composed of both UAV and terrestrial images acquired on a debris-flow reach (Moscardo torrent, North-eastern Italian Alps). Unlike other papers in the literature, we evaluate the results not only on the raw point clouds generated by the Structure-from-Motion and Multi-View Stereo algorithms, but also on the Digital Terrain Models (DTMs) created after post-processing. Outcomes show differences between the DTMs that can be considered irrelevant for the geomorphological phenomena under analysis. This study confirms that SfM photogrammetry can be a valuable tool for monitoring sediment dynamics, but accurate point cloud post-processing is required to reliably localize geomorphological changes.
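    Comparing multi-temporal DTMs for sediment dynamics is commonly done as a DEM of Difference (DoD) with an uncertainty threshold, which is the kind of change analysis the abstract refers to. A minimal sketch on tiny synthetic grids (the cell size, elevations, and level of detection are hypothetical, not the study's values):

```python
import numpy as np

cell = 0.1                       # cell size, m (hypothetical)
dtm_before = np.zeros((4, 4))    # co-registered pre-event DTM
dtm_after = dtm_before.copy()
dtm_after[:2, :2] += 0.30        # synthetic deposition patch, m
dtm_after[2:, 2:] -= 0.20        # synthetic erosion patch, m

dod = dtm_after - dtm_before
lod = 0.05                       # level of detection (survey uncertainty), m

# Zero out changes below the detection limit, then integrate volumes.
significant = np.where(np.abs(dod) > lod, dod, 0.0)
deposition = significant[significant > 0].sum() * cell**2   # m^3
erosion = -significant[significant < 0].sum() * cell**2     # m^3
print(deposition, erosion)
```

    Thresholding by a level of detection is what makes the localized changes "reliable" in the sense used by the study: differences smaller than the combined survey uncertainty are not counted as geomorphic change.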

  14. Force Sensor Based Tool Condition Monitoring Using a Heterogeneous Ensemble Learning Model

    PubMed Central

    Wang, Guofeng; Yang, Yinwei; Li, Zhimeng

    2014-01-01

    Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, a homogeneous ensemble learning model and a majority voting strategy are also adopted for comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability. PMID:25405514

  15. Force sensor based tool condition monitoring using a heterogeneous ensemble learning model.

    PubMed

    Wang, Guofeng; Yang, Yinwei; Li, Zhimeng

    2014-11-14

    Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, a homogeneous ensemble learning model and a majority voting strategy are also adopted for comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
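    The stacking strategy described above (heterogeneous base classifiers whose outputs feed a meta-learner) can be sketched with scikit-learn. This is an illustration, not the paper's exact model: the HMM base learner is omitted (no HMM classifier exists in scikit-learn), k-NN stands in as a second heterogeneous learner, and synthetic features stand in for the harmonic force features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for force-signal features labeled by tool wear state.
X, y = make_classification(n_samples=300, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)

# Heterogeneous base classifiers; a logistic-regression meta-learner maps
# their cross-validated outputs to the wear-state label.
stack = StackingClassifier(
    estimators=[("svm", SVC(kernel="rbf", probability=True)),
                ("knn", KNeighborsClassifier())],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5)
stack.fit(X, y)
acc = stack.score(X, y)
print(f"training accuracy: {acc:.2f}")
```

    The key design point, as in the paper, is that the meta-learner sees the base classifiers' out-of-fold predictions (the `cv=5` argument), which keeps the stacked model from simply memorizing base-classifier overfit.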

  16. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    PubMed

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, that spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes. Copyright © 2014 Elsevier Ltd. All rights reserved.
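    The "decreasing monotonic trend" between the FBRM removal metric and CST is naturally quantified with a rank correlation. A minimal sketch with hypothetical screening data (not the paper's twenty-four-polymer dataset):

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical polymer screening results: percent removal of <50 micron
# particle counts (FBRM) vs. capillary suction time (CST, seconds).
removal_pct = np.array([35.0, 48.0, 55.0, 62.0, 70.0, 78.0, 85.0, 91.0])
cst_seconds = np.array([180.0, 150.0, 140.0, 120.0, 95.0, 70.0, 55.0, 40.0])

# Spearman's rho tests for a monotonic (not necessarily linear) relation.
rho, p = spearmanr(removal_pct, cst_seconds)
print(f"Spearman rho = {rho:.2f} (p = {p:.4f})")
```

    A strongly negative rho is what supports replacing the off-line CST test with in-line FBRM monitoring: the polymers that remove the most fine particles dewater fastest.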

  17. Environmental and Landscape Remote Sensing Using Free and Open Source Image Processing Tools

    EPA Science Inventory

    As global climate change and human activities impact the environment, there is a growing need for scientific tools to monitor and measure environmental conditions that support human and ecological health. Remotely sensed imagery from satellite and airborne platforms provides a g...

  18. [Methodology for construction of a panel of indicators for monitoring and evaluation of unified health system (SUS) management].

    PubMed

    Tamaki, Edson Mamoru; Tanaka, Oswaldo Yoshimi; Felisberto, Eronildo; Alves, Cinthia Kalyne de Almeida; Drumond Junior, Marcos; Bezerra, Luciana Caroline de Albuquerque; Calvo, Maria Cristina Marino; Miranda, Alcides Silva de

    2012-04-01

    This study sought to develop a methodology for the construction of a Panel for the Monitoring and Evaluation of Management of the Unified Health System (SUS). The participative process used, together with the systematization conducted, made it possible to identify an effective strategy for building management tools in partnership with researchers, academic institutions and managers of the SUS. The final systematization of the Panel selected indicators for the management of the SUS in terms of Demand, Inputs, Processes, Outputs and Outcomes, in order to provide a simple, versatile and useful tool for evaluation at any level of management and for more transparent and easier communication with all stakeholders in decision-making. Taking the management of the SUS as the scope of these processes and practices, in all normative aspects, enabled dialog between systemic theories and those which consider the centrality of the social actor in the decision-making process.

  19. The monitoring of transient regimes on machine tools based on speed, acceleration and active electric power absorbed by motors

    NASA Astrophysics Data System (ADS)

    Horodinca, M.

    2016-08-01

    This paper proposes some new results related to computer-aided monitoring of transient regimes on machine tools, based on the evolution of the active electrical power absorbed by the electric motor driving the main kinematic chain and the evolution of the rotational speed and acceleration of the main shaft. The active power is calculated in numerical form from the evolution of the instantaneous voltage and current delivered by the electrical power system to the electric motor. The rotational speed and acceleration of the main shaft are calculated from the signal delivered by a sensor. Three real-time analog signals are acquired with a very simple computer-assisted setup which contains a voltage transformer, a current transformer, an AC generator as rotational speed sensor, a data acquisition system and a personal computer. The data processing and analysis were done using Matlab software. Several different transient regimes were investigated, and several important conclusions related to the advantages of this monitoring technique were formulated. Many other features of the experimental setup are also available: supervising the mechanical loading of machine tools during cutting processes, or diagnosing machine tool condition by analyzing the active electrical power signal in the frequency domain.
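    Active power computed numerically from sampled voltage and current, as described above, is just the mean of the instantaneous product p(t) = v(t)·i(t) over whole mains cycles. A minimal sketch with hypothetical amplitudes and phase lag (the abstract's setup uses Matlab; NumPy is used here for illustration):

```python
import numpy as np

fs = 10_000                       # sampling rate, Hz (hypothetical)
t = np.arange(0, 0.2, 1 / fs)     # 0.2 s window = 10 whole cycles at 50 Hz
v = 325.0 * np.sin(2 * np.pi * 50 * t)              # ~230 V RMS
i = 14.1 * np.sin(2 * np.pi * 50 * t - np.pi / 6)   # ~10 A RMS, 30 deg lag

# Active power = time average of instantaneous power over whole cycles.
p_active = np.mean(v * i)         # W
print(f"active power ~ {p_active:.0f} W")
```

    Averaging over an integer number of cycles is what makes the estimate exact for steady sinusoids; during transients the same mean taken over a sliding window tracks the motor's load evolution.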

  20. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools into pharmaceutical and biopharmaceutical environments, thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  1. GROUND WATER ISSUE - CALCULATION AND USE OF FIRST-ORDER RATE CONSTANTS FOR MONITORED NATURAL ATTENUATION STUDIES

    EPA Science Inventory

    This issue paper explains when and how to apply first-order attenuation rate constant calculations in monitored natural attenuation (MNA) studies. First-order attenuation rate constant calculations can be an important tool for evaluating natural attenuation processes at ground-wa...
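    The first-order rate constant calculation the issue paper discusses follows from C(t) = C0·exp(-k·t): regressing ln(C) on time gives -k as the slope. A minimal sketch with hypothetical monitoring-well concentrations (not data from the paper):

```python
import numpy as np

# Hypothetical concentration time-history at a monitoring well.
t_years = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
conc = np.array([100.0, 74.0, 55.0, 41.0, 30.0])   # e.g. ug/L

# First-order fit: ln(C) = ln(C0) - k*t, so k is minus the regression slope.
slope, intercept = np.polyfit(t_years, np.log(conc), 1)
k = -slope                       # attenuation rate constant, 1/yr
half_life = np.log(2) / k        # years
print(f"k ~ {k:.2f} 1/yr, half-life ~ {half_life:.1f} yr")
```

    In MNA practice the interpretation of such a constant depends on whether it describes concentration decline at a point or mass loss along a flow path, which is exactly the distinction the issue paper addresses.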

  2. Quantitative monitoring of an activated sludge reactor using on-line UV-visible and near-infrared spectroscopy.

    PubMed

    Sarraguça, Mafalda C; Paulo, Ana; Alves, Madalena M; Dias, Ana M A; Lopes, João A; Ferreira, Eugénio C

    2009-10-01

    The performance of an activated sludge reactor can be significantly enhanced through the use of continuous and real-time process-state monitoring, which avoids the need to sample for off-line analysis and to use chemicals. Despite the complexity associated with wastewater treatment systems, spectroscopic methods coupled with chemometric tools have been shown to be powerful tools for bioprocess monitoring and control. Once implemented and optimized, these methods are fast, nondestructive, user-friendly, and most importantly, they can be implemented in situ, permitting rapid inference of the process state at any moment. In this work, UV-visible and NIR spectroscopy were used to monitor an activated sludge reactor using in situ immersion probes connected to the respective analyzers by optical fibers. During the monitoring period, disturbances to the biological system were induced to test the ability of each spectroscopic method to detect the changes in the system. Calibration models based on partial least squares (PLS) regression were developed for three key process parameters, namely chemical oxygen demand (COD), nitrate concentration (N-NO3(-)), and total suspended solids (TSS). For NIR, the best results were achieved for TSS, with a relative error of 14.1% and a correlation coefficient of 0.91. The UV-visible technique gave similar results for the three parameters: an error of approximately 25% and correlation coefficients of approximately 0.82 for COD and TSS and 0.87 for N-NO3(-). The results obtained demonstrate that both techniques are suitable for consideration as alternative methods for monitoring and controlling wastewater treatment processes, presenting clear advantages when compared with the reference methods for wastewater treatment process qualification.

  3. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972; RASOR: www.rasor-project.eu, grant number: 606888)

  4. Getting a handle on virtual tools: An examination of the neuronal activity associated with virtual tool use.

    PubMed

    Rallis, Austin; Fercho, Kelene A; Bosch, Taylor J; Baugh, Lee A

    2018-01-31

    Tool use is associated with three visual streams - the dorso-dorsal, ventro-dorsal, and ventral visual streams. These streams are involved in processing online motor planning, action semantics, and tool semantics, respectively. Little is known about the way in which the brain represents virtual tools. To directly assess this question, a virtual tool paradigm was created that provided the ability to manipulate tool components in isolation of one another. During functional magnetic resonance imaging (fMRI), adult participants performed a series of virtual tool manipulation tasks in which vision and the movement kinematics of the tool were manipulated. Reaction time and hand movement direction were monitored while the tasks were performed. Functional imaging revealed that activity within all three visual streams was present, in a pattern similar to what would be expected with physical tool use. However, a previously unreported network of right-hemisphere activity was found, including the right inferior parietal lobule, the middle and superior temporal gyri, and the supramarginal gyrus - regions well known to be associated with tool processing within the left hemisphere. These results provide evidence that both virtual and physical tools are processed within the same brain regions, though virtual tools recruit bilateral tool processing regions to a greater extent than physical tools. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Monitoring biological diversity: strategies, tools, limitations, and challenges

    USGS Publications Warehouse

    Beever, E.A.

    2006-01-01

    Monitoring is an assessment of the spatial and temporal variability in one or more ecosystem properties, and is an essential component of adaptive management. Monitoring can help determine whether mandated environmental standards are being met and can provide an early-warning system of ecological change. Development of a strategy for monitoring biological diversity will likely be most successful when based upon clearly articulated goals and objectives and may be enhanced by including several key steps in the process. Ideally, monitoring of biological diversity will measure not only composition, but also structure and function at the spatial and temporal scales of interest. Although biodiversity monitoring has several key limitations as well as numerous theoretical and practical challenges, many tools and strategies are available to address or overcome such challenges; I summarize several of these. Due to the diversity of spatio-temporal scales and comprehensiveness encompassed by existing definitions of biological diversity, an effective monitoring design will reflect the desired sampling domain of interest and its key stressors, available funding, legal requirements, and organizational goals.

  6. Using stamping punch force variation for the identification of changes in lubrication and wear mechanism

    NASA Astrophysics Data System (ADS)

    Voss, B. M.; Pereira, M. P.; Rolfe, B. F.; Doolan, M. C.

    2017-09-01

    The growth in use of Advanced High Strength Steels in the automotive industry for light-weighting and safety has increased the rates of tool wear in sheet metal stamping. This is an issue that adds significant costs to production in terms of manual inspection and part refinishing. To reduce these costs, a tool condition monitoring system is required, and a firm understanding of process signal variation must form the foundation of any such monitoring system. Punch force is a stamping process signal that is widely collected by industrial presses and has been linked closely to part quality and tool condition, making it an ideal candidate as a tool condition monitoring signal. In this preliminary investigation, the variation of punch force due to different lubrication conditions and progressive wear is examined. Linking specific punch force signature changes to developing lubrication and wear events is valuable for die wear and stamping condition monitoring. A series of semi-industrial channel forming trials were conducted under different lubrication regimes and progressive die wear. Punch force signatures were captured for each part and Principal Component Analysis (PCA) was applied to determine the key Principal Components of the signature data sets. These Principal Components were linked to the evolution of friction conditions over the course of the stroke for the different lubrication regimes and to the mechanism of galling wear. As a result, variation in punch force signatures was correlated to the wear mechanism currently dominant on the formed part, either abrasion or adhesion, and to changes in the lubrication mechanism. The outcomes of this study provide important insights into punch force signature variation that will provide a foundation for future work on the development of die wear and lubrication monitoring systems for sheet metal stamping.
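    PCA on force signatures, as used above, treats each part's force-vs-stroke curve as one row of a matrix and extracts the directions of part-to-part variation. A minimal sketch on synthetic signatures (the wear-like amplitude drift and noise level are invented, not the trial data):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
stroke = np.linspace(0, 1, 100)
base = 50.0 * np.sin(np.pi * stroke)          # nominal force signature (kN, say)

# 40 consecutive parts: a slowly drifting amplitude mimicking a wear-like
# trend, plus measurement noise.
amplitudes = 1.0 + 0.005 * np.arange(40)
signatures = np.outer(amplitudes, base) + rng.normal(scale=0.2, size=(40, 100))

# PCA: each Principal Component is a stroke-shaped mode of variation; the
# scores track how strongly each part expresses that mode.
pca = PCA(n_components=3)
scores = pca.fit_transform(signatures)
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
```

    A dominant first component whose score trends monotonically over part number is the kind of signal that, in the study, gets linked to progressive wear; lubrication changes would instead show up as distinct component patterns.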

  7. In-line monitoring of pellet coating thickness growth by means of visual imaging.

    PubMed

    Oman Kadunc, Nika; Sibanc, Rok; Dreu, Rok; Likar, Boštjan; Tomaževič, Dejan

    2014-08-15

Coating thickness is the most important attribute of coated pharmaceutical pellets, as it directly affects the release profile and stability of the drug. Quality control of the coating process of pharmaceutical pellets is thus of utmost importance for assuring the desired end-product characteristics. A visual imaging technique is presented and examined as a process analytical technology (PAT) tool for noninvasive, continuous, in-line, real-time monitoring of the coating thickness of pharmaceutical pellets during the coating process. Images of pellets were acquired during the coating process through an observation window of a Wurster coating apparatus. Image analysis methods were developed for fast and accurate determination of the pellets' coating thickness during the coating process. The accuracy of the coating thickness growth results obtained in real time was evaluated through comparison with an off-line reference method, and good agreement was found. Information about inter-pellet coating uniformity was gained from further statistical analysis of the measured pellet size distributions. Accuracy and performance analysis of the proposed method showed that visual imaging is feasible as a PAT tool for in-line, real-time monitoring of the coating process of pharmaceutical pellets. Copyright © 2014 Elsevier B.V. All rights reserved.
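A minimal sketch of the thickness estimate such an imaging method can produce, assuming spherical pellets and a known uncoated core diameter distribution (all names and numbers below are hypothetical, not the paper's algorithm):

```python
import statistics

def coating_thickness_um(core_diams_um, coated_diams_um):
    # The coating layer grows on both sides of the pellet, so the mean
    # thickness is half the mean diameter increase.
    d_core = statistics.mean(core_diams_um)
    d_coated = statistics.mean(coated_diams_um)
    return (d_coated - d_core) / 2.0

def inter_pellet_uniformity(coated_diams_um):
    # Coefficient of variation as a simple inter-pellet uniformity metric.
    return statistics.stdev(coated_diams_um) / statistics.mean(coated_diams_um)

core = [700, 710, 690, 705, 695]     # µm, hypothetical image measurements
coated = [760, 772, 748, 766, 754]   # µm, hypothetical image measurements
print(coating_thickness_um(core, coated))  # mean layer thickness in µm
```

In an in-line setting, the coated diameter distribution would be re-estimated continuously from new images, so thickness growth can be tracked over process time.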

  8. Effects of Heterogeneities, Sampling Frequencies, Tools and Methods on Uncertainties in Subsurface Contaminant Concentration Measurements

    NASA Astrophysics Data System (ADS)

    Ezzedine, S. M.; McNab, W. W.

    2007-12-01

Long-term monitoring (LTM) is particularly important for contaminants which are mitigated by the natural processes of dilution, dispersion, and degradation. At many sites, LTM can require decades of expensive sampling at tens or even hundreds of existing monitoring wells, resulting in hundreds of thousands, or millions, of dollars per year for sampling and data management. Therefore, contaminant sampling tools, methods, and frequencies are chosen to minimize waste and data management costs while ensuring a reliable and informative time-history of contaminant measurements for regulatory compliance. The interplay between cause (i.e., subsurface heterogeneities, sampling techniques, measurement frequencies) and effect (unreliable data and measurement gaps) has been overlooked in many field applications, which can lead to inconsistencies in the time-histories of contaminant samples. In this study we address the relationship between cause and effect for different hydrogeological sampling settings: porous and fractured media. A numerical model has been developed using AMR-FEM to solve the physicochemical processes that take place in the aquifer and the monitoring well. In the latter, the flow is governed by the Navier-Stokes equations, while in the former the flow is governed by the diffusivity equation; both are fully coupled to mimic stressed conditions and to assess the effect of dynamic sampling tools on the formation surrounding the monitoring well. First, different sampling tools (i.e., Easy Pump, Snapper Grab Sampler) were simulated in a monitoring well screened in different homogeneous layered aquifers to assess their effect on the sampling measurements.
Secondly, to make the computer runs more CPU-efficient, the flow in the monitoring well was replaced by its counterpart flow in a porous medium with infinite permeability, and the new model was used to simulate the effect of heterogeneities, sampling depth, sampling tool, and sampling frequency on the uncertainties in the concentration measurements. Finally, the models and results were abstracted using a simple mixed-tank approach to further simplify the models and make them more accessible to field hydrogeologists. During the abstraction process a novel method was developed for mapping streamlines in the fractures, as well as within the monitoring well, to illustrate mixing and mixing zones. Applications will be demonstrated for sampling in both porous and fractured media. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
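The mixed-tank abstraction can be pictured as a continuously stirred tank, dC/dt = (Q/V)(C_in - C). The toy integration below uses illustrative parameters and a simple Euler step, not the study's model:

```python
def mixed_tank(c0, c_in, q, v, dt, steps):
    # Euler integration of dC/dt = (Q/V) * (C_in - C):
    # well-bore concentration relaxes toward the inflowing formation water.
    c = c0
    for _ in range(steps):
        c += dt * (q / v) * (c_in - c)
    return c

# Well water starts clean; formation water carries 10 mg/L of contaminant.
# q, v, dt in consistent (hypothetical) units; 50 time units simulated.
c_end = mixed_tank(c0=0.0, c_in=10.0, q=1.0, v=5.0, dt=0.01, steps=5000)
print(round(c_end, 3))  # approaches c_in = 10 mg/L
```

The time constant V/Q controls how quickly the sampled concentration tracks the formation, which is the essence of the sampling-frequency question above.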

  9. New tools using the hardware performance monitor to help users tune programs on the Cray X-MP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.; Rudsinski, L.; Doak, J.

    1991-09-25

The performance of a Cray system is highly dependent on the tuning techniques used by individuals on their codes. Many of our users were not taking advantage of the tuning tools that allow them to monitor their own programs by using the Hardware Performance Monitor (HPM). We therefore modified UNICOS to collect HPM data for all processes and to report Mflop ratings based on users, programs, and time used. Our tuning efforts are now being focused on the users and programs that have the best potential for performance improvements. These modifications and some of the more striking performance improvements are described.

  10. Enhancing wind erosion monitoring and assessment for U.S. rangelands

    USGS Publications Warehouse

    Webb, Nicholas P.; Van Zee, Justin W.; Karl, Jason W.; Herrick, Jeffrey E.; Courtright, Ericha M.; Billings, Benjamin J.; Boyd, Robert C.; Chappell, Adrian; Duniway, Michael C.; Derner, Justin D.; Hand, Jenny L.; Kachergis, Emily; McCord, Sarah E.; Newingham, Beth A.; Pierson, Frederick B.; Steiner, Jean L.; Tatarko, John; Tedela, Negussie H.; Toledo, David; Van Pelt, R. Scott

    2017-01-01

On the Ground: Wind erosion is a major resource concern for rangeland managers because it can impact soil health, ecosystem structure and function, hydrologic processes, agricultural production, and air quality. Despite its significance, little is known about which landscapes are eroding, by how much, and when. The National Wind Erosion Research Network was established in 2014 to develop tools for monitoring and assessing wind erosion and dust emissions across the United States. The Network, currently consisting of 13 sites, creates opportunities to enhance existing rangeland soil, vegetation, and air quality monitoring programs. Decision-support tools developed by the Network will improve the prediction and management of wind erosion across rangeland ecosystems.

  11. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    NASA Astrophysics Data System (ADS)

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data was collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.

  12. Development and implementation of an independence rating scale and evaluation process for nursing orientation of new graduates.

    PubMed

    Durkin, Gregory J

    2010-01-01

A wide variety of evaluation formats are available for new graduate nurses, but most of them are single-point evaluation tools that do not provide a clear picture of progress for the orientee or educator. This article describes the development of a Web-based evaluation tool that combines learning taxonomies with the Synergy model into a rating scale based on independent performance. The evaluation tool and process provide open 24/7 access to evaluation documentation for members of the orientation team, demystifying the process and clarifying expectations. The implementation of the tool has proven transformative in perceptions of evaluation and performance expectations of new graduates. This tool has been successful at monitoring progress, tailoring education, and opening dialogue about performance for over 125 new graduate nurses since inception.

  13. In-line verification of linewidth uniformity for 0.18 and below: design rule reticles

    NASA Astrophysics Data System (ADS)

    Tan, TaiSheng; Kuo, Shen C.; Wu, Clare; Falah, Reuven; Hemar, Shirley; Sade, Amikam; Gottlib, Gidon

    2000-07-01

Mask-making process development and control are addressed using a reticle inspection tool equipped with a new application called LBM (Linewidth Bias Monitoring). To use LBM for mask-making process control, procedures and corresponding test plates were developed such that routine monitoring of the manufacturing process discloses both process variation and machine variation. At the same time, systematic variations are studied and either corrected or taken into consideration to allow successful production line work. In this paper the contribution of LBM to mask quality monitoring is studied with respect to dense layers, e.g. DRAM. Another aspect of this application, the detection of very small CD non-uniformity areas, is also discussed.

  14. A novel toolbox for E. coli lysis monitoring.

    PubMed

    Rajamanickam, Vignesh; Wurm, David; Slouka, Christoph; Herwig, Christoph; Spadiut, Oliver

    2017-01-01

    The bacterium Escherichia coli is a well-studied recombinant host organism with a plethora of applications in biotechnology. Highly valuable biopharmaceuticals, such as antibody fragments and growth factors, are currently being produced in E. coli. However, the high metabolic burden during recombinant protein production can lead to cell death, consequent lysis, and undesired product loss. Thus, fast and precise analyzers to monitor E. coli bioprocesses and to retrieve key process information, such as the optimal time point of harvest, are needed. However, such reliable monitoring tools are still scarce to date. In this study, we cultivated an E. coli strain producing a recombinant single-chain antibody fragment in the cytoplasm. In bioreactor cultivations, we purposely triggered cell lysis by pH ramps. We developed a novel toolbox using UV chromatograms as fingerprints and chemometric techniques to monitor these lysis events and used flow cytometry (FCM) as reference method to quantify viability offline. Summarizing, we were able to show that a novel toolbox comprising HPLC chromatogram fingerprinting and data science tools allowed the identification of E. coli lysis in a fast and reliable manner. We are convinced that this toolbox will not only facilitate E. coli bioprocess monitoring but will also allow enhanced process control in the future.
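One way to picture the chromatogram-fingerprint idea is a similarity score between a measured UV fingerprint and an intact-culture reference; this is an illustrative stand-in, not the authors' chemometric pipeline, and all data below are invented:

```python
import numpy as np

def cosine_similarity(a, b):
    # Shape similarity between two chromatogram fingerprints, independent
    # of overall signal intensity.
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical binned UV absorbance fingerprints
reference = [0.1, 0.8, 0.2, 0.05, 0.1]   # intact cells
lysed     = [0.4, 0.3, 0.6, 0.5, 0.4]    # extra host-protein peaks

print(cosine_similarity(reference, reference))  # identical fingerprints
print(round(cosine_similarity(reference, lysed), 2))
```

A falling similarity score against the intact-culture reference would flag a lysis event, which could then be confirmed against an offline viability method such as FCM.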

  15. Developing the Tools for Geologic Repository Monitoring - Andra's Monitoring R and D Program - 12045

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buschaert, S.; Lesoille, S.; Bertrand, J.

    2012-07-01

The French Safety Guide recommends that Andra develop a monitoring program to be implemented during repository construction and conducted until (and possibly after) closure, in order to confirm expected behavior and enhance knowledge of relevant processes. To achieve this, Andra has developed an overall monitoring strategy and identified specific technical objectives to inform disposal process management on evolutions relevant to both the long term safety and reversible, pre-closure management of the repository. Andra has launched an ambitious R and D program to ensure that reliable, durable, metrologically qualified and tested monitoring systems will be available at the time of repository construction in order to respond to monitoring objectives. After four years of a specific R and D program, first observations are described and recommendations are proposed. The results derived from 4 years of Andra's R and D program allow three main observations to be shared. First, while other industries also invest in monitoring equipment, their obvious emphasis will always be on their specific requirements and needs, thus often only providing a partial match with repository requirements. Examples can be found for all available sensors, which are generally not resistant to radiation. Second, the very close scrutiny anticipated for the geologic disposal process is likely to place an unprecedented emphasis on the quality of monitoring results. It therefore seems important to emphasize specific developments with an aim at providing metrologically qualified systems. Third, adapting existing technology to specific repository needs, and providing adequate proof of their worth, is a lengthy process.
In conclusion, it therefore seems prudent to plan ahead and to invest wisely in the adequate development of those monitoring tools that will likely be needed in the repository to respond to the implementers' and regulators' requirements, including those agreed and developed to respond to potential stakeholder expectations. (authors)

  16. The manipulator tool state classification based on inertia forces analysis

    NASA Astrophysics Data System (ADS)

    Gierlak, Piotr

    2018-07-01

In this article, we discuss the detection of damage to the cutting tool used in robotised light mechanical processing. Continuous monitoring of the state of the tool mounted in the tool holder of the robot is required in order to save time. The tool is a brush with ceramic fibres used for surface grinding. A typical example of damage to the brush is the breaking of fibres, resulting in a tool imbalance and vibrations at high rotational speed, e.g. during grinding. This also limits the operating surface of the tool and decreases the efficiency of processing. While an imbalanced tool is spinning, inertial (fictitious) forces occur that carry information about the balance of the tool. These forces can be measured using a force sensor located in the end-effector of the robot, allowing the damage to the brush to be assessed in an automated way, without operator involvement.
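The detection principle can be sketched as follows: an imbalance produces a periodic inertial force at the spindle rotation frequency, so its spectral amplitude can be compared against a balanced-tool baseline. All parameters and signals below are assumptions for illustration, not the article's data:

```python
import numpy as np

def imbalance_amplitude(force, fs, rot_hz):
    # Single-sided FFT amplitude of the force signal at the rotation frequency.
    f = np.fft.rfftfreq(len(force), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(force)) * 2.0 / len(force)
    return spectrum[np.argmin(np.abs(f - rot_hz))]

fs, rot_hz = 2000.0, 50.0                 # sample rate and rotation speed (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
balanced = 0.05 * rng.standard_normal(len(t))             # sensor noise only
damaged = balanced + 1.5 * np.sin(2 * np.pi * rot_hz * t) # imbalance force

print(imbalance_amplitude(damaged, fs, rot_hz) >
      10 * imbalance_amplitude(balanced, fs, rot_hz))
```

A threshold on this amplitude, calibrated on a known-good brush, would then classify the tool state automatically.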

  17. Supervised Classification Processes for the Characterization of Heritage Elements, Case Study: Cuenca-Ecuador

    NASA Astrophysics Data System (ADS)

    Briones, J. C.; Heras, V.; Abril, C.; Sinchi, E.

    2017-08-01

The proper control of built heritage entails many challenges related to the complexity of heritage elements and the extent of the area to be managed, for which the available resources must be used efficiently. In this scenario, the preventive conservation approach, based on the principle that prevention is better than cure, emerges as a strategy to avoid the progressive and imminent loss of monuments and heritage sites. Regular monitoring is a key tool for the timely identification of changes in heritage assets. This research demonstrates that a supervised learning model (Support Vector Machines - SVM) is an ideal tool to support the monitoring process by detecting visible elements in aerial images such as roof structures, vegetation, and pavements. The linear, Gaussian, and polynomial kernel functions were tested; the linear function provided better results than the other functions. It is important to mention that, due to the high level of segmentation generated by the classification procedure, it was necessary to apply a generalization step through a morphological opening operation, which reduced the over-classification of the monitored elements.
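The generalization step can be illustrated with a toy morphological opening (erosion followed by dilation) on a binary class mask; this pure-NumPy sketch is not the paper's implementation:

```python
import numpy as np

# Opening with a 3x3 square structuring element removes isolated
# misclassified pixels while preserving larger classified regions.
def erode(img):
    p = np.pad(img, 1, constant_values=0)
    out = np.ones_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy:1 + dy + img.shape[0], 1 + dx:1 + dx + img.shape[1]]
    return out

def dilate(img):
    p = np.pad(img, 1, constant_values=0)
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:1 + dy + img.shape[0], 1 + dx:1 + dx + img.shape[1]]
    return out

def opening(img):
    return dilate(erode(img))

mask = np.zeros((10, 10), dtype=int)
mask[2:7, 2:7] = 1        # a genuine "roof" region from the classifier
mask[0, 9] = 1            # a lone misclassified pixel
cleaned = opening(mask)
print(cleaned[0, 9], cleaned[4, 4])  # noise pixel removed, region kept
```

In practice a library routine (e.g. a standard image-processing toolbox) would be used, but the effect on over-segmented classification maps is the same.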

  18. Advanced Signal Processing for High Temperatures Health Monitoring of Condensed Water Height in Steam Pipes

    NASA Technical Reports Server (NTRS)

    Lih, Shyh-Shiuh; Bar-Cohen, Yoseph; Lee, Hyeong Jae; Takano, Nobuyuki; Bao, Xiaoqi

    2013-01-01

An advanced signal processing methodology is being developed to monitor the height of condensed water through the wall of a steel pipe operating at temperatures as high as 250°. Using existing techniques, a previous study indicated that when the water height is low or there is disturbance in the environment, the predicted water height may not be accurate. In recent years, autocorrelation and envelope techniques have been demonstrated to be very useful signal processing tools for practical applications. In this paper, various signal processing techniques, including autocorrelation, the Hilbert transform, and the Shannon energy envelope method, were studied and implemented to determine the water height in the steam pipe. The results show that the developed method provides good capability for monitoring the height under regular conditions. For shallow-water or no-water conditions, an alternative solution is suggested: a hybrid method based on the Hilbert transform (HT) with a high-pass filter and an optimized windowing technique. Further development of the reported methods would provide a powerful tool for identifying disturbances of the water height inside the pipe.
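A minimal sketch of the Shannon energy envelope applied to a synthetic pulse echo follows; the signal parameters are assumptions for illustration, not the paper's measurement setup:

```python
import numpy as np

def shannon_energy_envelope(x, win=32):
    # Shannon energy -x^2*log(x^2) emphasizes mid-amplitude oscillations,
    # which helps localize echo arrivals; a moving average smooths it.
    xn = x / (np.max(np.abs(x)) + 1e-12)
    e = -xn ** 2 * np.log(xn ** 2 + 1e-12)
    kernel = np.ones(win) / win
    return np.convolve(e, kernel, mode='same')

fs = 1e6                                   # 1 MHz sampling, assumed
t = np.arange(0, 2e-3, 1 / fs)
echo_delay = 8e-4                          # round-trip time, assumed
burst = np.sin(2 * np.pi * 2e5 * t) * np.exp(-((t - echo_delay) / 5e-5) ** 2)
env = shannon_energy_envelope(burst)
print(abs(t[np.argmax(env)] - echo_delay) < 1e-4)  # envelope peaks near the echo
```

The detected echo delay would then map to a water height through the known sound speed in the medium (roughly height = speed x delay / 2 for a pulse-echo geometry).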

  19. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    NASA Astrophysics Data System (ADS)

    Varela Rodriguez, F.

    2011-12-01

The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors in, and troubleshoot such a large system. Although monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, only recently has the software package been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and has already proven to be very effective in optimizing the running systems and detecting misbehaving processes or nodes.
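The supervision logic can be illustrated generically; this is not the CERN tool's code, and the WMI query layer is deliberately omitted. The sketch flags a process when several consecutive CPU samples stay above a threshold:

```python
def find_misbehaving(samples, threshold=90.0, strikes=3):
    # samples: process name -> list of CPU-usage readings (%), oldest first.
    # A process is flagged only if its last `strikes` readings all exceed
    # the threshold, so brief spikes do not raise alarms.
    flagged = []
    for name, readings in samples.items():
        if len(readings) >= strikes and all(r > threshold for r in readings[-strikes:]):
            flagged.append(name)
    return flagged

# Hypothetical polling history for two processes on one Windows node
history = {
    "pvss00ctrl": [95.0, 97.2, 99.1],   # stuck at high CPU
    "winlogon":   [1.0, 2.5, 1.2],
}
print(find_misbehaving(history))
```

In the real tool, the readings would come from WMI performance counters and the flags would be forwarded to the SCADA layer as alarms.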

  20. In-situ measurement of processing properties during fabrication in a production tool

    NASA Technical Reports Server (NTRS)

    Kranbuehl, D. E.; Haverty, P.; Hoff, M.; Loos, A. C.

    1988-01-01

Progress is reported on the use of frequency-dependent electromagnetic measurements (FDEMs) as a single, convenient technique for continuous in situ monitoring of polyester cure during fabrication in a laboratory and manufacturing environment. Preliminary FDEM sensor and modeling work using the Loos-Springer model to develop an intelligent closed-loop, sensor-controlled cure process is described. FDEM using impedance bridges in the Hz-to-MHz region is found to be ideal for automatically monitoring polyester processing properties continuously throughout the cure cycle.

  1. Geophysical methods for monitoring soil stabilization processes

    NASA Astrophysics Data System (ADS)

    Saneiyan, Sina; Ntarlagiannis, Dimitrios; Werkema, D. Dale; Ustra, Andréa

    2018-01-01

Soil stabilization involves methods used to turn unconsolidated and unstable soil into a stiffer, consolidated medium that could support engineered structures, alter permeability, change subsurface flow, or immobilize contamination through mineral precipitation. Among the variety of available methods, carbonate precipitation is a very promising one, especially when it is induced through common soil-borne microbes (MICP - microbially induced carbonate precipitation). Such microbially mediated precipitation has the added benefit of not harming the environment, whereas other methods can be environmentally detrimental. Carbonate precipitation, typically in the form of calcite, is a naturally occurring process that can be manipulated to deliver the expected soil strengthening results or permeability changes. This study investigates the ability of spectral induced polarization (SIP) and shear-wave velocity measurements to monitor calcite-driven soil strengthening processes. The results support the use of these geophysical methods as soil strengthening characterization and long-term monitoring tools, which is a requirement for viable soil stabilization projects. Both tested methods are sensitive to calcite precipitation, with SIP offering additional information related to the long-term stability of the precipitated carbonate. Carbonate precipitation was confirmed with direct methods, such as direct sampling and scanning electron microscopy (SEM). This study advances our understanding of soil strengthening processes and permeability alterations, and is a crucial step toward the use of geophysical methods as monitoring tools in microbially induced soil alterations through carbonate precipitation.

  2. New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools

    NASA Astrophysics Data System (ADS)

    Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo

    1999-09-01

As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in scope and do not provide a complete picture of ultimate tool performance. Presently, the BMC traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); however, the dynamic performance cannot be fully evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: the Lucas Labs Vacuum Diagnostic System, a Residual Gas Analyzer, and the ENI Voltage/Impedance Probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility. It has also resulted in reduced cycle time for new equipment introduction.

  3. Methodologies and Tools for Tuning Parallel Programs: 80% Art, 20% Science, and 10% Luck

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Bailey, David (Technical Monitor)

    1996-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessors. However, without effective means to monitor (and analyze) program execution, tuning the performance of parallel programs becomes exponentially difficult as program complexity and machine size increase. In the past few years, the ubiquitous introduction of performance tuning tools from various supercomputer vendors (Intel's ParAide, TMC's PRISM, CRI's Apprentice, and Convex's CXtrace) seems to indicate the maturity of performance instrumentation/monitor/tuning technologies and vendors'/customers' recognition of their importance. However, a few important questions remain: What kind of performance bottlenecks can these tools detect (or correct)? How time consuming is the performance tuning process? What are some important technical issues that remain to be tackled in this area? This workshop reviews the fundamental concepts involved in analyzing and improving the performance of parallel and heterogeneous message-passing programs. Several alternative strategies will be contrasted, and for each we will describe how currently available tuning tools (e.g. AIMS, ParAide, PRISM, Apprentice, CXtrace, ATExpert, Pablo, IPS-2) can be used to facilitate the process. We will characterize the effectiveness of the tools and methodologies based on actual user experiences at NASA Ames Research Center. Finally, we will discuss their limitations and outline recent approaches taken by vendors and the research community to address them.

  4. Process and control systems for composites manufacturing

    NASA Technical Reports Server (NTRS)

    Tsiang, T. H.; Wanamaker, John L.

    1992-01-01

A precise control of composite material processing would not only improve part quality, but it would also directly reduce the overall manufacturing cost. The development and incorporation of sensors will help to generate real-time information for material processing relationships and equipment characteristics. In the present work, thermocouple, pressure transducer, and dielectrometer technologies were investigated. The monitoring sensors were integrated with the computerized control system in three non-autoclave fabrication techniques: hot-press, self-contained tool (self-heating and pressurizing), and pressure vessel. The sensors were implemented in the parts and tools.

  5. Monitoring ibuprofen-nicotinamide cocrystal formation during solvent free continuous cocrystallization (SFCC) using near infrared spectroscopy as a PAT tool.

    PubMed

    Kelly, A L; Gough, T; Dhumal, R S; Halsey, S A; Paradkar, A

    2012-04-15

The purpose of this work was to explore NIR spectroscopy as a PAT tool to monitor the formation of ibuprofen and nicotinamide cocrystals during extrusion-based solvent-free continuous cocrystallization (SFCC). Drug and co-former were gravimetrically fed into a heated co-rotating twin screw extruder to form cocrystals. Real-time process monitoring was performed using a high-temperature NIR probe in the extruder die to assess cocrystal content, and the results were subsequently compared to off-line powder X-ray diffraction (PXRD) measurements. The effect of processing variables, such as temperature and mixing intensity, on the extent of cocrystal formation was investigated. NIR spectroscopy was sensitive to cocrystal formation, with the appearance of new peaks and peak shifts, particularly in the 4800-5200 cm⁻¹ wavenumber region. PXRD confirmed increased conversion of the mixture into cocrystal with increasing barrel temperature and screw mixing intensity. A decrease in screw rotation speed also improved cocrystal yield, due to the material experiencing longer residence times within the process. A partial least squares analysis in this region of the NIR spectrum correlated well with the PXRD data, providing the best fit with cocrystal conversion when a limited range of process conditions was considered, for example a single set temperature. The study suggests that NIR spectroscopy could be used to monitor cocrystal purity on an industrial scale using this continuous, solvent-free process. Copyright © 2011 Elsevier B.V. All rights reserved.
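The calibration idea can be illustrated with a simple least-squares stand-in for the paper's partial least squares model; the band areas and conversion values below are hypothetical:

```python
import numpy as np

# Hypothetical calibration: relate the integrated NIR signal in the
# 4800-5200 cm^-1 region to PXRD-measured cocrystal conversion (%).
nir_band_area = np.array([0.12, 0.35, 0.58, 0.80, 1.01])   # arbitrary units
pxrd_conversion = np.array([10.0, 32.0, 55.0, 78.0, 99.0]) # % by PXRD

slope, intercept = np.polyfit(nir_band_area, pxrd_conversion, 1)
predicted = slope * nir_band_area + intercept
ss_res = np.sum((pxrd_conversion - predicted) ** 2)
ss_tot = np.sum((pxrd_conversion - pxrd_conversion.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))  # near 1 for a well-correlated calibration
```

A true PLS model would regress on the full spectral region rather than a single integrated band, which is what lets it cope with overlapping peaks and baseline effects.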

  6. Particle monitoring and control in vacuum processing equipment

    NASA Astrophysics Data System (ADS)

    Borden, Peter G., Dr.; Gregg, John

    1989-10-01

Particle contamination during vacuum processes has emerged as the largest single source of yield loss in VLSI manufacturing. While a number of tools have been available to help understand the sources and nature of this contamination, only recently has it been possible to monitor free particle levels within vacuum equipment in real time. As a result, a better picture is available of how particle contamination can affect a variety of processes. This paper reviews some of the work that has been done to monitor particles in vacuum loadlocks and in processes such as etching, sputtering, and ion implantation. The aim has been to make free particles in vacuum equipment a measurable process parameter. Achieving this allows particles to be controlled using statistical process control. It will be shown that free particle levels in load locks correlate to wafer surface counts, device yield, and process conditions, but that these levels are considerably higher during production than when dummy wafers are run to qualify a system. It will also be shown how real-time free particle monitoring can be used to monitor and control cleaning cycles, how major episodic events can be detected, and how data can be gathered in a format suitable for statistical process control.
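The statistical process control step can be sketched as a c-chart on particle counts; the baseline data below are hypothetical:

```python
import statistics

# c-chart for count data: upper control limit UCL = c_bar + 3*sqrt(c_bar),
# using baseline free-particle counts from a load-lock monitor.
baseline = [4, 6, 5, 3, 7, 5, 4, 6]          # counts per pump-down cycle
c_bar = statistics.mean(baseline)
ucl = c_bar + 3 * c_bar ** 0.5

def out_of_control(count):
    # Flag a cycle whose particle count exceeds the control limit,
    # e.g. a major episodic event or an overdue cleaning cycle.
    return count > ucl

print(out_of_control(18), out_of_control(6))
```

A count above the UCL would trigger investigation (or a cleaning cycle) before wafers are committed to the chamber.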

  7. Emergency preparedness: community-based short-term eruption forecasting at Campi Flegrei

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Marzocchi, Warner; Civetta, Lucia; Del Pezzo, Edoardo; Papale, Paolo

    2010-05-01

A key element in emergency preparedness is to define, in advance, tools to assist decision makers and emergency management groups during crises. Such tools must be prepared ahead of time, accounting for all of the expertise and scientific knowledge accumulated over time. During a pre-eruptive phase, the key to sound short-term eruption forecasting is the analysis of the monitoring signals. This involves the capability (i) to recognize anomalous signals and to relate single or combined anomalies to physical processes, assigning them probability values, and (ii) to quickly provide an answer to the observed phenomena even when unexpected. Here we present a more than four-year-long process devoted to defining the pre-eruptive Event Tree (ET) for Campi Flegrei. A community of about 40 experts in volcanology and volcano monitoring, participating in two Italian projects on Campi Flegrei funded by the Italian Civil Protection, was constituted and trained during periodic meetings on the statistical methods and the BET_EF model (Marzocchi et al., 2008) that forms the statistical package for ET definition. Model calibration was carried out through public elicitation sessions, preceded and followed by dedicated meetings and web forum discussions on the monitoring parameters, their accuracy and relevance, and their potential meanings. The calibrated ET allows anomalies in the monitored parameters to be recognized and interpreted, assigning probability values to each set of data. This process de-personalizes the difficult task of interpreting multi-parametric data sets during ongoing emergencies, and provides a view of the observed variations that accounts for the averaged, weighted opinion of the scientific community.
An additional positive outcome of the described ET calibration process is that it provides a picture of the expert community's degree of confidence in the capability of the many different monitored quantities to reveal significant variations in the state of the volcano. This picture is particularly useful because it can guide future improvements to the monitoring network, as well as research investments aimed at substantially improving the capability to forecast short-term volcanic hazard.

  8. Towards an Ontology-Based Approach to Support Monitoring the Data of the International Monitoring System (IMS)

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Ali

    2010-05-01

    The heterogeneity of the distributed processing systems, monitored data and resources is an obvious challenge in monitoring the data of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Processing engineers, analysts, operators and other interested parties seek intelligent tools and software that hide the underlying complexity of the systems, allowing them to manage the operation and monitoring of the systems at a higher level, focusing on what the expected behavior and results should be rather than on how to achieve them. There is also a need to share a common understanding of the structure of organizational information, data, and products among staff, software agents, and policy-making organs. Additionally, introducing a new monitored object or system should be feasible without complicating the overall system. An ontology-based approach is presented in this paper to support the monitoring of real-time data processing and the supervision of the various system resources, focusing on integrating and sharing the same knowledge and status information of the system among different environments. The results of a prototype framework are presented and analyzed.

  9. Review of functional near-infrared spectroscopy in neurorehabilitation

    PubMed Central

    Mihara, Masahito; Miyai, Ichiro

    2016-01-01

    Abstract. We provide a brief overview of the research and clinical applications of near-infrared spectroscopy (NIRS) in the neurorehabilitation field. NIRS has several potential advantages and shortcomings as a neuroimaging tool and is suitable for research applications in the rehabilitation field. As one of the main applications of NIRS, we discuss its application as a monitoring tool, including investigating the neural mechanism of functional recovery after brain damage and investigating the neural mechanisms for controlling bipedal locomotion and postural balance in humans. In addition to being a monitoring tool, advances in signal processing techniques allow us to use NIRS as a therapeutic tool in this field. With a brief summary of recent studies investigating the clinical application of NIRS using motor imagery tasks, we discuss the possible clinical usage of NIRS in brain–computer interfaces and neurofeedback. PMID:27429995

  10. Monitoring and Reporting Tools of the International Data Centre and International Monitoring System

    NASA Astrophysics Data System (ADS)

    Lastowka, L.; Anichenko, A.; Galindo, M.; Villagran Herrera, M.; Mori, S.; Malakhova, M.; Daly, T.; Otsuka, R.; Stangel, H.

    2007-05-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), which prohibits all nuclear explosions, was opened for signature in 1996. Since then, the Preparatory Commission for the CTBT Organization has been working towards the establishment of a global verification regime to monitor compliance with the ban on nuclear testing. The International Monitoring System (IMS) comprises facilities for seismic, hydroacoustic, infrasound and radionuclide monitoring, and the means of communication. This system is supported by the International Data Centre (IDC), which provides objective products and services necessary for effective global monitoring. Upon completion of the IMS, 321 stations will be contributing to both near real-time and reviewed data products. Currently there are 194 facilities in IDC operations. This number is expected to increase by about 40% over the next few years, necessitating methods and tools to effectively handle the expansion. The requirements of high data availability as well as operational transparency are fundamental principles of IMS network operations; therefore, a suite of tools for monitoring and reporting has been developed. These include applications for monitoring Global Communication Infrastructure (GCI) links, detecting outages in continuous and segmented data, monitoring the status of data processing and forwarding to member states, and for systematic electronic communication and problem ticketing. The operation of the IMS network requires the help of local specialists whose cooperation is in some cases ensured by contracts or other agreements. The PTS (Provisional Technical Secretariat) strives to make the monitoring of the IMS as standardized and efficient as possible, and has therefore created the Operations Centre, in which the use of most of the tools is centralized. Recently the tasks of operations across all technologies, including the GCI, have been centralized within a single section of the organization.
To harmonize operations, an ongoing State of Health monitoring project will provide an integrated view of network, station and GCI performance, together with system metrics. Comprehensive procedures will be developed for the use of this tool. However, as the IMS network expands, easier access to more information will create additional challenges, mainly in the human resources needed to analyze and manage these metrics.

  11. A Novel Approach to Monitoring the Curing of Epoxy in Closed Tools by Use of Ultrasonic Spectroscopy

    PubMed Central

    2017-01-01

    The increasing use of composite materials has led to a greater demand for efficient curing cycles to reduce costs and speed up production cycles in manufacturing. One method to achieve this goal is in-line cure monitoring to determine the exact curing time. This article proposes a novel method through which to monitor the curing process inside closed tools by employing ultrasonic spectroscopy. A simple experiment is used to demonstrate the change in the ultrasonic spectrum during the cure cycle of an epoxy. The results clearly reveal a direct correlation between the amplitude and state of cure. The glass transition point is indicated by a global minimum of the reflected amplitude. PMID:29301222

  12. Next Generation Parallelization Systems for Processing and Control of PDS Image Node Assets

    NASA Astrophysics Data System (ADS)

    Verma, R.

    2017-06-01

    We present next-generation parallelization tools to help Planetary Data System (PDS) Imaging Node (IMG) better monitor, process, and control changes to nearly 650 million file assets and over a dozen machines on which they are referenced or stored.

  13. Monitoring tools of COMPASS experiment at CERN

    NASA Astrophysics Data System (ADS)

    Bodlak, M.; Frolov, V.; Huber, S.; Jary, V.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Tomsa, J.; Virius, M.

    2015-12-01

    This paper briefly introduces the data acquisition system of the COMPASS experiment and focuses mainly on the part responsible for monitoring the nodes of the whole newly developed data acquisition system. COMPASS is a fixed-target high-energy physics experiment located at the SPS of the CERN laboratory in Geneva, Switzerland. The hardware of the data acquisition system has been upgraded to use FPGA cards that are responsible for data multiplexing and event building. The software counterpart of the system includes several processes deployed in a heterogeneous network environment. Two processes, namely Message Logger and Message Browser, take care of monitoring. These tools handle messages generated by nodes in the system: while Message Logger collects and saves messages to the database, Message Browser serves as a graphical interface over the database containing these messages. For better performance, certain database optimizations have been applied. Lastly, results of performance tests are presented.

  14. Monitoring Interfacial Lipid Oxidation in Oil-in-Water Emulsions Using Spatially Resolved Optical Techniques.

    PubMed

    Banerjee, Chiranjib; Westberg, Michael; Breitenbach, Thomas; Bregnhøj, Mikkel; Ogilby, Peter R

    2017-06-06

    The oxidation of lipids is an important phenomenon with ramifications for disciplines that range from food science to cell biology. The development and characterization of tools and techniques to monitor lipid oxidation are thus relevant. Of particular significance in this regard are tools that facilitate the study of oxidations at interfaces in heterogeneous samples (e.g., oil-in-water emulsions, cell membranes). In this article, we establish a proof-of-principle for methods to initiate and then monitor such oxidations with high spatial resolution. The experiments were performed using oil-in-water emulsions of polyunsaturated fatty acids (PUFAs) prepared from cod liver oil. We produced singlet oxygen at a point near the oil-water interface of a given PUFA droplet in a spatially localized two-photon photosensitized process. We then followed the oxidation reactions initiated by this process with the fluorescence-based imaging technique of structured illumination microscopy (SIM). We conclude that the approach reported herein has attributes well-suited to the study of lipid oxidation in heterogeneous samples.

  15. A novel methodology for in-process monitoring of flow forming

    NASA Astrophysics Data System (ADS)

    Appleby, Andrew; Conway, Alastair; Ion, William

    2017-10-01

    Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.

  16. Monitoring ash (Fraxinus spp.) decline and emerald ash borer (Agrilus planipennis) symptoms in infested areas

    Treesearch

    Kathleen S. Knight; Britton P. Flash; Rachel H. Kappler; Joel A. Throckmorton; Bernadette Grafton; Charles E. Flower

    2014-01-01

    Emerald ash borer (A. planipennis) (EAB) has had a devastating effect on ash (Fraxinus) species since its introduction to North America and has resulted in altered ecological processes across the area of infestation. Monitoring is an important tool for understanding and managing the impact of this threat, and the use of common...

  17. Intelligence Community Forum

    DTIC Science & Technology

    2008-11-05

    Description Operationally Feasible? EEG ms ms cm Measures electrical activity in the brain. Practical tool for applications - real time monitoring or...Cognitive Systems Device Development & Processing Methods Brain activity can be monitored in real-time in operational environments with EEG Brain...biological and cognitive findings about the user to customize the learning environment Neurofeedback • Present the user with real-time feedback

  18. Toward a Scalable Visualization System for Network Traffic Monitoring

    NASA Astrophysics Data System (ADS)

    Malécot, Erwan Le; Kohara, Masayoshi; Hori, Yoshiaki; Sakurai, Kouichi

    With the multiplication of attacks against computer networks, system administrators are required to carefully monitor the traffic exchanged by the networks they manage. However, that monitoring task is increasingly laborious because of the growing amount of data to analyze, and the trend will intensify as the number of devices connected to computer networks explodes and the available network bandwidth rises globally. System administrators therefore rely heavily on automated tools to assist them and simplify the analysis of the data. Yet these tools provide limited support and, most of the time, require highly skilled operators. Recently, some research teams have started to study the application of visualization techniques to the analysis of network traffic data. We believe that this original approach can also allow system administrators to deal with the large amount of data they have to process. In this paper, we introduce a tool for network traffic monitoring using visualization techniques that we developed in order to assist the system administrators of our corporate network. We explain how we designed the tool and some of the choices we made regarding the visualization techniques to use. The resulting tool proposes two linked representations of the network traffic and activity, one in 2D and the other in 3D. As 2D and 3D visualization techniques have different strengths, we combined them in our tool to take advantage of their complementarity. We finally tested our tool in order to evaluate the accuracy of our approach.

  19. The development of daily monitoring tool in a service part manufacturing company

    NASA Astrophysics Data System (ADS)

    Marpaung, Seamus Tadeo; Rosyidi, Cucuk Nur

    2018-02-01

    Production lead time is one of the key measures to assess whether a production system is running well or not. A short lead time leads to higher customer satisfaction and is solid proof that a system is well organized. To shorten the production lead time, good production planning and control are required. There are many obstacles which can occur at any time, for instance shortages of material or workers, or poor production scheduling. The Service Parts Planning Department works with many parties from the beginning of service parts production until delivery to the customer. This research was conducted to find an appropriate production monitoring tool for the Service Parts Planning Department: a control method that makes problems surface so that they can be resolved quickly and the production process can run normally. The tool development started with a field study to map the production flow from start to finish, followed by a literature review, interviews with the employees who would later use the production control tool, and the creation of a daily control that went through several modifications until it finally met the needs of the department. The production monitoring tool developed in this research can be used to monitor overall order status and production lead time, and also serves as a record and report for presentation.

  20. Understanding the behavior of Giardia and Cryptosporidium in an urban watershed: Explanation and application of techniques to collect and evaluate monitoring data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crockett, C.S.; Haas, C.N.

    1996-11-01

    Due to currently proposed regulations requiring monitoring for protozoans and demonstration of adequate protozoan removal depending on the source water concentrations detected, many utilities are considering or are engaged in protozoan monitoring activities within their watershed so that proper watershed management and treatment modifications can reduce their impact on drinking water safety and quality. However, due to the difficulties associated with the current analytical methods and sample collection, many sampling efforts collect data that cannot be interpreted, or lack the tools to interpret the information obtained. Therefore, it is necessary to determine how to develop an effective sampling program tailored to a utility's specific needs to provide interpretable data, and to develop tools for evaluating such data. The following case study describes the process in which a utility learned how to collect and interpret monitoring data for their specific needs, and provides concepts and tools which other utilities can use to aid in their own macro- and micro-watershed management efforts.

  1. Operations management tools to be applied for textile

    NASA Astrophysics Data System (ADS)

    Maralcan, A.; Ilhan, I.

    2017-10-01

    In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost and utilization are illustrated first. The effect of the bottleneck on business results is especially emphasized. In the next section, tools for productivity measurement, the KPI (Key Performance Indicators) Tree, OEE (Overall Equipment Effectiveness) and takt time, are introduced and exemplified. A KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to the customer demand. The KPI tree is studied through the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, quality tools, six sigma, control charts and jidoka, are introduced. Six sigma is a tool to measure process capability and thereby the probability of a defect. The control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop and alert) is about alerting people that there is a problem in the process.
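The OEE and takt time concepts described above reduce to simple formulas: OEE multiplies availability, performance and quality, while takt time divides the available production time by the customer demand. A minimal sketch, with hypothetical shop-floor numbers rather than figures from the paper:

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness as a product of three ratios."""
    return availability * performance * quality

def takt_time(available_time_min, customer_demand_units):
    """Pace of production (minutes per unit) required to meet demand."""
    return available_time_min / customer_demand_units

# A stenter frame running 90% of planned time, at 95% of ideal speed,
# with 98% first-quality output:
machine_oee = oee(0.90, 0.95, 0.98)   # roughly 0.84, i.e. 16% hidden capacity

# One shift with 420 productive minutes against a demand of 840 pieces:
takt = takt_time(420, 840)            # 0.5 min per piece
```

The gap between the computed OEE and 1.0 is precisely the "potential extra capacity" the abstract refers to.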

  2. Monitoring of Surface Roughness in Aluminium Turning Process

    NASA Astrophysics Data System (ADS)

    Chaijareenont, Atitaya; Tangjitsitcharoen, Somkiat

    2018-01-01

    The turning process is one of the most essential machining processes, and surface roughness is a key indicator of workpiece quality. There are many factors which affect the surface roughness. Hence, the objective of this research is to monitor the relation between the surface roughness and the cutting forces in an aluminium turning process over a wide range of cutting conditions. A coated carbide tool and an aluminium alloy (Al 6063) are used for the experiment. The investigated cutting parameters are the cutting speed, the feed rate, the tool nose radius and the depth of cut, and their effects on the surface roughness are analyzed. In this research, a dynamometer installed in the turret of the CNC turning machine generates a signal while turning. The relation between the dynamic cutting forces and the surface roughness profile is examined by applying the Fast Fourier Transform (FFT). The experimentally obtained results showed that the cutting force depends on the cutting conditions. The surface roughness can be improved by increasing the cutting speed and the tool nose radius, in contrast to the feed rate and the depth of cut. The relation between the cutting parameters and the surface roughness can be explained by the in-process cutting forces. It is understood that the in-process cutting forces are able to predict the surface roughness in further research.
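The frequency-domain step of such an analysis can be sketched as follows. This is not the authors' code: a plain DFT stands in for the FFT for brevity, and the sampling rate, synthetic force signal and function names are illustrative assumptions:

```python
import cmath
import math

def dft(signal):
    """Discrete Fourier transform of a real-valued sampled signal."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def dominant_frequency(signal, sample_rate_hz):
    """Frequency (Hz) of the strongest non-DC spectral component."""
    spectrum = dft(signal)
    half = len(signal) // 2  # ignore the mirrored half of the spectrum
    k = max(range(1, half), key=lambda i: abs(spectrum[i]))  # skip the DC term
    return k * sample_rate_hz / len(signal)

# Synthetic cutting-force signal: a 50 Hz feed-mark component sampled at 1 kHz.
fs = 1000
signal = [math.sin(2 * math.pi * 50 * t / fs) for t in range(200)]
peak_hz = dominant_frequency(signal, fs)  # -> 50.0
```

Matching a spectral peak of the force signal to the spatial period of the roughness profile is the kind of correspondence the abstract's FFT analysis exploits.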

  3. Towards a geophysical decision-support system for monitoring and managing unstable slopes

    NASA Astrophysics Data System (ADS)

    Chambers, J. E.; Meldrum, P.; Wilkinson, P. B.; Uhlemann, S.; Swift, R. T.; Inauen, C.; Gunn, D.; Kuras, O.; Whiteley, J.; Kendall, J. M.

    2017-12-01

    Conventional approaches for condition monitoring, such as walk-over surveys, remote sensing or intrusive sampling, are often inadequate for predicting instabilities in natural and engineered slopes. Surface observations cannot detect the subsurface precursors to failure events; instead they can only identify failure once it has begun. On the other hand, intrusive investigations using boreholes only sample a very small volume of ground, and hence small-scale deterioration processes in heterogeneous ground conditions can easily be missed. It is increasingly being recognised that geophysical techniques can complement conventional approaches by providing spatial subsurface information. Here we describe the development and testing of a new geophysical slope monitoring system. It is built around low-cost electrical resistivity tomography instrumentation, combined with integrated geotechnical logging capability, and coupled with data telemetry. An automated data processing and analysis workflow is being developed to streamline information delivery. The development of this approach has provided the basis of a decision-support tool for monitoring and managing unstable slopes. The hardware component of the system has been operational at a number of field sites associated with a range of natural and engineered slopes for up to two years. We report on the monitoring results from these sites, discuss the practicalities of installing and maintaining long-term geophysical monitoring infrastructure, and consider the requirements of a fully automated data processing and analysis workflow. We propose that the result of this development work is a practical decision-support tool that can provide near-real-time information relating to the internal condition of problematic slopes.

  4. Continuous processing and the applications of online tools in pharmaceutical product manufacture: developments and examples.

    PubMed

    Ooi, Shing Ming; Sarkar, Srimanta; van Varenbergh, Griet; Schoeters, Kris; Heng, Paul Wan Sia

    2013-04-01

    Continuous processing and production in pharmaceutical manufacturing has received increased attention in recent years, mainly due to the industry's pressing need for more efficient, cost-effective processes and production, as well as regulatory facilitation. To achieve optimum product quality, the traditional trial-and-error method for the optimization of different process and formulation parameters is expensive and time consuming. Real-time evaluation and control of product quality using an online process analyzer in continuous processing can provide high-quality production with very high throughput at low unit cost. This review focuses on continuous processing and the application of different real-time monitoring tools used in the pharmaceutical industry for continuous processing from powder to tablets.

  5. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make the process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  6. Instrumentation, performance visualization, and debugging tools for multiprocessors

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.

    1991-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging and tuning parallel programs become intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.

  7. Adaptive awareness for personal and small group decision making.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perano, Kenneth J.; Tucker, Steve; Pancerella, Carmen M.

    2003-12-01

    Many situations call for the use of sensors monitoring physiological and environmental data. In order to use the large amounts of sensor data to affect decision making, we are coupling heterogeneous sensors with small, light-weight processors, other powerful computers, wireless communications, and embedded intelligent software. The result is an adaptive awareness and warning tool, which provides both situation awareness and personal awareness to individuals and teams. Central to this tool is a sensor-independent architecture, which combines both software agents and a reusable core software framework that manages the available hardware resources and provides services to the agents. Agents can recognize cues from the data, warn humans about situations, and act as decision-making aids. Within the agents, self-organizing maps (SOMs) are used to process physiological data in order to provide personal awareness. We have employed a novel clustering algorithm to train the SOM to discern individual body states and activities. This awareness tool has broad applicability to emergency teams, military squads, military medics, individual exercise and fitness monitoring, health monitoring for sick and elderly persons, and environmental monitoring in public places. This report discusses our hardware decisions, software framework, and a pilot awareness tool, which has been developed at Sandia National Laboratories.
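A self-organizing map that assigns physiological samples to prototype "body states" can be sketched minimally as follows. This is a generic winner-take-all SOM, not Sandia's novel clustering variant, and the (heart rate, skin temperature) readings are hypothetical:

```python
def train_som(samples, epochs=50, lr=0.3):
    """Train two prototype vectors, initialised from the first and last sample."""
    weights = [samples[0][:], samples[-1][:]]
    for _ in range(epochs):
        for x in samples:
            # best-matching unit: the prototype closest in squared distance
            bmu = min(range(len(weights)),
                      key=lambda u: sum((w - v) ** 2 for w, v in zip(weights[u], x)))
            # pull the winning prototype toward the sample
            weights[bmu] = [w + lr * (v - w) for w, v in zip(weights[bmu], x)]
    return weights

def classify(weights, x):
    """Map a new sample to the index of its nearest prototype ('body state')."""
    return min(range(len(weights)),
               key=lambda u: sum((w - v) ** 2 for w, v in zip(weights[u], x)))

# Hypothetical (heart_rate, skin_temp) readings for two activity states.
rest = [[62.0, 33.1], [60.0, 33.0], [64.0, 33.3]]
exertion = [[130.0, 35.0], [138.0, 35.4], [125.0, 34.8]]
weights = train_som(rest + exertion)
same_state = classify(weights, [61.0, 33.0]) == classify(weights, [63.0, 33.2])
```

After training, nearby samples fall on the same prototype, which is the property an agent would use to discern a wearer's current body state.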

  8. PAT-tools for process control in pharmaceutical film coating applications.

    PubMed

    Knop, Klaus; Kleinebudde, Peter

    2013-12-05

    Recent developments of analytical techniques to monitor the coating process of pharmaceutical solid dosage forms such as pellets and tablets are described. The progress from off- or at-line measurements to on- or in-line applications is shown for the spectroscopic methods near-infrared (NIR) and Raman spectroscopy, as well as for terahertz pulsed imaging (TPI) and image analysis. The common goal of all these methods is to control, or at least to monitor, the coating process and/or to estimate the coating end point through timely measurements. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Accuracy Evaluation of a CE-Marked Glucometer System for Self-Monitoring of Blood Glucose With Three Reagent Lots Following ISO 15197:2013.

    PubMed

    Hehmke, Bernd; Berg, Sabine; Salzsieder, Eckhard

    2017-05-01

    Continuous standardized verification of the accuracy of blood glucose meter systems for self-monitoring after their introduction into the market is a clinically important tool to assure reliable performance of subsequently released lots of strips. Moreover, such published verification studies permit comparison of different blood glucose monitoring systems and, thus, are increasingly involved in the process of evidence-based purchase decision making.

  10. Monitoring Quality Across Home Visiting Models: A Field Test of Michigan's Home Visiting Quality Assurance System.

    PubMed

    Heany, Julia; Torres, Jennifer; Zagar, Cynthia; Kostelec, Tiffany

    2018-06-05

    Introduction In order to achieve the positive outcomes with parents and children demonstrated by many home visiting models, home visiting services must be well implemented. The Michigan Home Visiting Initiative developed a tool and procedure for monitoring implementation quality across models referred to as Michigan's Home Visiting Quality Assurance System (MHVQAS). This study field tested the MHVQAS. This article focuses on one of the study's evaluation questions: Can the MHVQAS be applied across models? Methods Eight local implementing agencies (LIAs) from four home visiting models (Healthy Families America, Early Head Start-Home Based, Parents as Teachers, Maternal Infant Health Program) and five reviewers participated in the study by completing site visits, tracking their time and costs, and completing surveys about the process. LIAs also submitted their most recent review by their model developer. The researchers conducted participant observation of the review process. Results Ratings on the MHVQAS were not significantly different between models. There were some differences in interrater reliability and perceived reliability between models. There were no significant differences between models in perceived validity, satisfaction with the review process, or cost to participate. Observational data suggested that cross-model applicability could be improved by assisting sites in relating the requirements of the tool to the specifics of their model. Discussion The MHVQAS shows promise as a tool and process to monitor implementation quality of home visiting services across models. The results of the study will be used to make improvements before the MHVQAS is used in practice.

  11. Towards a generalized energy prediction model for machine tools

    PubMed Central

    Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H.; Dornfeld, David A.; Helu, Moneer; Rachuri, Sudarsan

    2017-01-01

    Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process. PMID:28652687
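The Gaussian Process regression step described above can be sketched in plain Python for a single process parameter. The RBF kernel, noise level, and the spindle-speed/energy numbers below are illustrative assumptions, not the authors' generalized multi-parameter model:

```python
import math

def rbf(a, b, length=1.0):
    """Squared-exponential (RBF) kernel for scalar inputs."""
    return math.exp(-(a - b) ** 2 / (2 * length ** 2))

def solve(mat, rhs):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(rhs)
    a = [row[:] + [r] for row, r in zip(mat, rhs)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (a[r][n] - sum(a[r][c] * x[c] for c in range(r + 1, n))) / a[r][r]
    return x

def gp_predict(xs, ys, x_new, noise=1e-6):
    """GP posterior mean at x_new: k_*^T (K + noise*I)^{-1} y."""
    k = [[rbf(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    alpha = solve(k, ys)
    return sum(rbf(x_new, xi) * a for xi, a in zip(xs, alpha))

# Hypothetical spindle speed (krpm) vs. energy consumed (kJ) for a few cuts:
xs = [1.0, 2.0, 3.0, 4.0]
ys = [5.0, 6.5, 7.2, 7.6]
estimate = gp_predict(xs, ys, 2.5)  # interpolates between the observations
```

The non-parametric nature of the GP is what lets the paper's model generalize across process parameters without a fixed physics-based functional form.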

  12. Towards a generalized energy prediction model for machine tools.

    PubMed

    Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H; Dornfeld, David A; Helu, Moneer; Rachuri, Sudarsan

    2017-04-01

    Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process.
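    Both records above center on Gaussian Process (GP) Regression with uncertainty intervals. As a rough illustration only (this is not the authors' implementation, and the data below are synthetic stand-ins for real machining measurements), a minimal GP regressor with an RBF kernel fits in a few lines of NumPy:

```python
import numpy as np

def rbf_kernel(x1, x2, length=1.0, var=1.0):
    """Squared-exponential covariance between two 1-D input vectors."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length**2)

def gp_predict(x_train, y_train, x_test, noise=1e-4):
    """Posterior mean and standard deviation at the test inputs."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std

# Hypothetical example: energy (kJ) vs. a normalized process parameter
feed = np.array([0.0, 1.0, 2.0, 3.0])
energy = np.array([5.0, 6.2, 6.9, 7.1])
mean, std = gp_predict(feed, energy, np.array([1.5]))
```

    The predictive standard deviation grows away from the training points, which is what makes GP regression a natural fit for the uncertainty-interval assessment the abstract describes.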

  13. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

    The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial scale process, combined with multivariate data analysis (MVDA) of process and PAT data to increase the process knowledge; (2) execution of scaled-down designed experiments at a pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical to Quality Attributes (CQAs), within which product quality is ensured by design, and which after scale-up enables its use at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of this step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.
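    Step (1) pairs PAT measurements with multivariate data analysis (MVDA). The abstract does not name an algorithm, but principal component analysis of a batches-by-variables process data matrix is the usual MVDA starting point; a minimal NumPy sketch (all data synthetic, variable names hypothetical):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project a (batches x process variables) matrix onto its
    leading principal components via SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = (S ** 2) / np.sum(S ** 2)
    return scores, explained[:n_components]
```

    Batches that plot far from the cluster of normal batches in score space flag unusual process behaviour worth investigating.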

  14. Corrosion process monitoring by AFM higher harmonic imaging

    NASA Astrophysics Data System (ADS)

    Babicz, S.; Zieliński, A.; Smulko, J.; Darowicki, K.

    2017-11-01

    The atomic force microscope (AFM) was invented in 1986 as an alternative to the scanning tunnelling microscope, which cannot be used in studies of non-conductive materials. Today the AFM is a powerful, versatile and fundamental tool for visualizing and studying the morphology of material surfaces. Moreover, additional information for some materials can be recovered by analysing the AFM’s higher cantilever modes when the cantilever motion is inharmonic and generates frequency components above the excitation frequency, usually close to the resonance frequency of the lowest oscillation mode. This method has been applied and developed to monitor corrosion processes. Higher-harmonic imaging is especially helpful for sharpening boundaries between objects in heterogeneous samples, which can be used to identify variations in steel structures (e.g. corrosion products, steel heterogeneity). The corrosion products have different chemical structures because they are composed of chemicals other than the original metal base (mainly iron oxides). Thus, their physicochemical properties differ from those of the base metal. These structures have edges at which higher harmonics should be more intense because of stronger interaction between the tip and the specimen structure there. This means that the AFM’s higher-harmonic imaging is an excellent tool for monitoring surficial effects of the corrosion process.
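    The higher harmonics described here are frequency components at integer multiples of the excitation frequency, which can be picked out of the cantilever deflection signal with an FFT. A schematic sketch (synthetic signal, not AFM data):

```python
import numpy as np

def harmonic_amplitudes(signal, fs, f0, n_harmonics=3):
    """Amplitudes at f0, 2*f0, ... extracted from a real-valued signal
    sampled at fs Hz via the FFT magnitude spectrum."""
    spec = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    amps = []
    for k in range(1, n_harmonics + 1):
        idx = int(np.argmin(np.abs(freqs - k * f0)))
        amps.append(2.0 * spec[idx])  # one-sided spectrum -> factor 2
    return np.array(amps)

# Inharmonic motion: fundamental at 50 Hz plus a weak 2nd harmonic
t = np.arange(0, 1.0, 1.0 / 1000.0)
x = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 100 * t)
```

    A rising ratio of second-harmonic to fundamental amplitude over a scanned region would mark the sharpened boundaries the abstract describes.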

  15. Dynamic quantitative photothermal monitoring of cell death of individual human red blood cells upon glucose depletion

    NASA Astrophysics Data System (ADS)

    Vasudevan, Srivathsan; Chen, George Chung Kit; Andika, Marta; Agarwal, Shuchi; Chen, Peng; Olivo, Malini

    2010-09-01

    Red blood cells (RBCs) have been found to undergo "programmed cell death," or eryptosis, and understanding this process can provide more information about apoptosis of nucleated cells. Photothermal (PT) response, a label-free photothermal noninvasive technique, is proposed as a tool to monitor the cell death process of living human RBCs upon glucose depletion. Since the physiological status of the dying cells is highly sensitive to photothermal parameters (e.g., thermal diffusivity, absorption, etc.), we applied linear PT response to continuously monitor the death mechanism of RBC when depleted of glucose. The kinetics of the assay where the cell's PT response transforms from linear to nonlinear regime is reported. In addition, quantitative monitoring was performed by extracting the relevant photothermal parameters from the PT response. Twofold increases in thermal diffusivity and size reduction were found in the linear PT response during cell death. Our results reveal that photothermal parameters change earlier than phosphatidylserine externalization (used for fluorescent studies), allowing us to detect the initial stage of eryptosis in a quantitative manner. Hence, the proposed tool, in addition to detection of eryptosis earlier than fluorescence, could also reveal physiological status of the cells through quantitative photothermal parameter extraction.

  16. Surface Acoustic Wave (SAW) Resonators for Monitoring Conditioning Film Formation

    PubMed Central

    Hohmann, Siegfried; Kögel, Svea; Brunner, Yvonne; Schmieg, Barbara; Ewald, Christina; Kirschhöfer, Frank; Brenner-Weiß, Gerald; Länge, Kerstin

    2015-01-01

    We propose surface acoustic wave (SAW) resonators as a complementary tool for conditioning film monitoring. Conditioning films are formed by adsorption of inorganic and organic substances on a substrate the moment this substrate comes into contact with a liquid phase. In the case of implant insertion, for instance, initial protein adsorption is required to start wound healing, but it will also trigger immune reactions leading to inflammatory responses. Control of the initial protein adsorption would make it possible to promote the healing process and to suppress adverse immune reactions. Methods to investigate these adsorption processes are available, but it remains difficult to translate measurement results into actual protein binding events. Biosensor transducers allow user-friendly investigation of protein adsorption on different surfaces. The combination of several transduction principles leads to complementary results, allowing a more comprehensive characterization of the adsorbing layer. We introduce SAW resonators as a novel complementary tool for time-resolved conditioning film monitoring. SAW resonators were coated with polymers. The adsorption of the plasma proteins human serum albumin (HSA) and fibrinogen onto the polymer-coated surfaces was monitored. Frequency results were compared with quartz crystal microbalance (QCM) sensor measurements, which confirmed the suitability of the SAW resonators for this application. PMID:26007735

  17. Using Sap Flow Monitoring for Improved Process-based Ecohydrologic Understanding 2022

    USDA-ARS?s Scientific Manuscript database

    Sap flow measurements can be an important tool for unraveling the complex web of ecosystem fluxes, especially when it is combined with other measurements like eddy covariance, isotopes, remote sensing, etc. In this talk, we will demonstrate how sap flow measurements have improved our process-level u...

  18. Near-infrared spectroscopy monitoring and control of the fluidized bed granulation and coating processes-A review.

    PubMed

    Liu, Ronghua; Li, Lian; Yin, Wenping; Xu, Dongbo; Zang, Hengchang

    2017-09-15

    The fluidized bed granulation and pellet coating technologies are widely used in the pharmaceutical industry because particles made in a fluidized bed have good flowability and compressibility, and the coating thickness of pellets is homogeneous. With the popularization of process analytical technology (PAT), real-time analysis of critical quality attributes (CQAs) has been receiving more attention. Near-infrared (NIR) spectroscopy, as a PAT tool, can realize real-time monitoring and control during the granulating and coating processes, which can optimize the manufacturing processes. This article reviews the application of NIR spectroscopy in monitoring CQAs (moisture content, particle size and tablet/pellet thickness) during fluidized bed granulation and coating processes. Through this review, we would like to provide references for realizing automated control and intelligent production in fluidized bed granulation and pellet coating in the pharmaceutical industry. Copyright © 2017 Elsevier B.V. All rights reserved.
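    NIR calibrations for attributes like moisture content are typically built with partial least squares (PLS) regression. As a sketch only (a textbook NIPALS PLS1 implementation, with synthetic "spectra" rather than anything from the review):

```python
import numpy as np

def pls1(X, y, n_comp):
    """NIPALS PLS1: regress y (e.g. moisture content) on spectra X."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)           # weight vector
        t = Xc @ w                       # scores
        tt = t @ t
        p = Xc.T @ t / tt                # X loadings
        qk = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p)         # deflate X and y
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # regression vector
    b0 = y.mean() - X.mean(axis=0) @ B
    return B, b0
```

    Prediction on new spectra is then simply `X_new @ B + b0`, which is what makes the method fast enough for the real-time monitoring the review surveys.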

  19. AERO: A Decision Support Tool for Wind Erosion Assessment in Rangelands and Croplands

    NASA Astrophysics Data System (ADS)

    Galloza, M.; Webb, N.; Herrick, J.

    2015-12-01

    Wind erosion is a key driver of global land degradation, with on- and off-site impacts on agricultural production, air quality, ecosystem services and climate. Measuring rates of wind erosion and dust emission across land use and land cover types is important for quantifying the impacts and for identifying and testing practical management options. This process can be assisted by the application of predictive models, which can be a powerful tool for land management agencies. The Aeolian EROsion (AERO) model, a wind erosion and dust emission model interface, gives non-expert land managers access to a sophisticated wind erosion decision-support tool. AERO incorporates land surface processes and sediment transport equations from existing wind erosion models and was designed for application with available national long-term monitoring datasets (e.g. USDI BLM Assessment, Inventory and Monitoring, USDA NRCS Natural Resources Inventory) and monitoring protocols. Ongoing AERO model calibration and validation are supported by geographically diverse data on wind erosion rates and land surface conditions collected by the new National Wind Erosion Research Network. Here we present the new AERO interface, describe the parameterization of the underpinning wind erosion model, and provide a summary of model applications across agricultural lands and rangelands in the United States.

  20. ASI-Sistema Rischio Vulcanico SRV: a pilot project to develop EO data processing modules and products for volcanic activity monitoring based on Italian Civil Protection Department requirements and needs

    NASA Astrophysics Data System (ADS)

    Buongiorno, Maria Fabrizia; Musacchio, Massimo; Silvestri, Malvina; Spinetti, Claudia; Corradini, Stefano; Lombardo, Valerio; Merucci, Luca; Sansosti, Eugenio; Pugnagli, Sergio; Teggi, Sergio; Pace, Gaetano; Fermi, Marco; Zoffoli, Simona

    2007-10-01

    The Project called Sistema Rischio Vulcanico (SRV) is funded by the Italian Space Agency (ASI) in the frame of the National Space Plan 2003-2005 under the Earth Observations section for natural risks management. The SRV Project is coordinated by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), which is responsible at the national level for volcanic monitoring. The objective of the project is to develop a pre-operative system, based on the integration of EO data and ground measurements, to support the volcanic risk monitoring of the Italian Civil Protection Department, whose requirements and needs are well integrated in the GMES Emergency Core Services program. The project philosophy is to implement, by incremental versions, specific modules which allow processing, storing and visualizing, through Web GIS tools, EO-derived parameters for three activity phases: 1) knowledge and prevention; 2) crisis; 3) post crisis. In order to combine the EO data and the ground network measurements effectively, the system will implement a multi-parametric analysis tool, a unique tool to analyze a large data set contemporaneously in "near real time". The SRV project will test its operational capabilities on three Italian volcanoes: Etna, Vesuvio and Campi Flegrei.

  1. The microbiome as engineering tool: Manufacturing and trading between microorganisms.

    PubMed

    De Vrieze, Jo; Christiaens, Marlies E R; Verstraete, Willy

    2017-10-25

    The integration of microbial technologies within the framework of the water-energy nexus has been taking place for over a century, but the mixed microbial communities involved are considered hard-to-deal-with 'black boxes'. Process steering is mainly based on avoiding process failure by monitoring conventional parameters, e.g., pH and temperature, which often leads to operation far below the intrinsic potential. Mixed microbial communities do not reflect a randomised individual mix, but an interacting microbiological entity. Advanced monitoring to obtain effective engineering of the microbiome is achievable, and even crucial to obtain the desired performance and products. This can be achieved via a top-down or bottom-up approach. The top-down strategy is reflected in the microbial resource management concept, which considers the microbial community as a well-structured network. This network can be monitored by means of molecular techniques that will allow the development of accurate and quick decision tools. In contrast, the bottom-up approach makes use of synthetic cultures that can be composed starting from defined axenic cultures, based on the requirements of the process under consideration. The success of both approaches depends on real-time monitoring and control. Of particular importance is the necessity to identify and characterise the key players in the process. These key players relate not only to the establishment of functional conversions, but also to the interaction between partner bacteria. This emphasises the importance of molecular (screening) techniques to obtain structural and functional insights, minimise energy input, and maximise product output by means of integrated microbiome processes. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Evaluation of risk and benefit in thermal effusivity sensor for monitoring lubrication process in pharmaceutical product manufacturing.

    PubMed

    Uchiyama, Jumpei; Kato, Yoshiteru; Uemoto, Yoshifumi

    2014-08-01

    In the process design of tablet manufacturing, understanding and control of the lubrication process is important from various viewpoints. A detailed analysis of thermal effusivity data in the lubrication process was conducted in this study. In addition, we evaluated the risks and benefits of the lubrication process through a detailed investigation. It was found that monitoring of thermal effusivity mainly detected the physical change of bulk density, which was changed by dispersal of the lubricant and coating of the powder particles by the lubricant. Because monitoring thermal effusivity was essentially monitoring bulk density, thermal effusivity could have a high correlation with tablet hardness. Moreover, as the thermal effusivity sensor could detect not only the change of the conventional bulk density but also the fractional change of thermal conductivity and thermal capacity, the two-phase progress of the lubrication process could be revealed. However, the contributions of density, thermal conductivity and heat capacity to thermal effusivity each carry the risk of fluctuating with formulation. After carefully considering which change factors are at risk of varying with formulation, the thermal effusivity sensor can be a useful monitoring tool for process analytical technology, estimating tablet hardness and investigating the detailed mechanism of the lubrication process.
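    For reference, thermal effusivity combines exactly the three properties named above: e = sqrt(k·ρ·c_p), with thermal conductivity k (W/m·K), density ρ (kg/m³) and specific heat capacity c_p (J/kg·K). A one-line helper, sanity-checked against the textbook value for water (an arbitrary reference material, not one from the study):

```python
import math

def thermal_effusivity(k, rho, cp):
    """e = sqrt(k * rho * cp), in W*s^0.5 / (m^2*K)."""
    return math.sqrt(k * rho * cp)

# Water at room temperature: k ~ 0.6, rho ~ 1000, cp ~ 4180
e_water = thermal_effusivity(0.6, 1000.0, 4180.0)  # ~1584
```

    The square-root coupling is why a change in any one of the three properties (e.g. bulk density during lubricant dispersal) shows up only fractionally in the measured effusivity.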

  3. Development of the major trauma case review tool.

    PubMed

    Curtis, Kate; Mitchell, Rebecca; McCarthy, Amy; Wilson, Kellie; Van, Connie; Kennedy, Belinda; Tall, Gary; Holland, Andrew; Foster, Kim; Dickinson, Stuart; Stelfox, Henry T

    2017-02-28

    As many as half of all patients with major traumatic injuries do not receive the recommended care, with variance in preventable mortality reported across the globe. This variance highlights the need for a comprehensive process for monitoring and reviewing patient care, central to which is a consistent peer-review process that includes trauma system safety and human factors. There is no published, evidence-informed standardised tool that considers these factors for use in adult or paediatric trauma case peer-review. The aim of this research was to develop and validate a trauma case review tool to facilitate clinical review of paediatric trauma patient care in extracting information to facilitate monitoring, inform change and enable loop closure. Development of the trauma case review tool was multi-faceted, beginning with a review of the trauma audit tool literature. Data were extracted from the literature to inform iterative tool development using a consensus approach. Inter-rater agreement was assessed for both the pilot and finalised versions of the tool. The final trauma case review tool contained ten sections, including patient factors (such as pre-existing conditions), presenting problem, a timeline of events, factors contributing to the care delivery problem (including equipment, work environment, staff action, organizational factors), positive aspects of care and the outcome of panel discussion. After refinement, the inter-rater reliability of the human factors and outcome components of the tool improved with an average 86% agreement between raters. This research developed an evidence-informed tool for use in paediatric trauma case review that considers both system safety and human factors to facilitate clinical review of trauma patient care. This tool can be used to identify opportunities for improvement in trauma care and guide quality assurance activities. Validation is required in the adult population.
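    The 86% figure quoted is a simple percent-agreement statistic between raters; for illustration (the ratings below are invented), it can be computed as:

```python
def percent_agreement(rater_a, rater_b):
    """Share of items on which two raters gave the same rating."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# e.g. two reviewers scoring four case-review items
agreement = percent_agreement(["yes", "no", "yes", "yes"],
                              ["yes", "no", "no", "yes"])  # 75.0
```

    Note that percent agreement does not correct for chance; chance-corrected statistics such as Cohen's kappa are often reported alongside it in reliability studies.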

  4. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process well set up. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportions. APC systems (Advanced Process Control) are being developed in the waferfab to automatically adjust and tune wafer processing, based on extensive process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to put the spatial variability under control in real time with our SPC system (Statistical Process Control). This paper will outline the architecture of an integrated process control system for shape monitoring in 3D, implemented in a waferfab.
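    At its core, the SPC layer of such a system reduces to control limits around an in-control baseline: flag a monitored parameter when it leaves mean ± 3σ. A minimal Shewhart-style sketch (the measurement values are hypothetical):

```python
import numpy as np

def control_limits(baseline):
    """Lower/upper 3-sigma Shewhart limits from in-control data."""
    mu = float(np.mean(baseline))
    sigma = float(np.std(baseline, ddof=1))
    return mu - 3 * sigma, mu + 3 * sigma

def out_of_control(values, lcl, ucl):
    """Indices of measurements violating the control limits."""
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Baseline critical-dimension measurements (nm) from an in-control tool
lcl, ucl = control_limits([10.0, 10.1, 9.9, 10.0, 10.2, 9.8])
alarms = out_of_control([10.0, 12.0, 9.9], lcl, ucl)  # -> [1]
```

    A real-time implementation applies the same test per wafer zone, which is how spatial (intra-wafer) variability is brought under statistical control.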

  5. Real-time imaging as an emerging process analytical technology tool for monitoring of fluid bed coating process.

    PubMed

    Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S

    2018-07-01

    A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time onscreen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ can measure particle size increases with an accuracy of ±1 μm on particles in the size range of 50-3000 μm. Eyecon™ captured data every 2 s during the entire process. The moving average of the D90 particle size values recorded by Eyecon™ was calculated every 30 min to derive the radial coating thickness of the coated particles. After completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and ±1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-up of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can be correlated with a 10 ± 2 μm increase in the coating thickness of pellets, indicating the potential applicability of real-time imaging as an endpoint determination tool for the fluid bed coating process.
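    The coating-thickness arithmetic in the abstract is worth making explicit: coating grows on all sides of a particle, so the radial thickness is half the increase in (smoothed) D90 diameter. A sketch with made-up numbers, not the batch data above:

```python
import numpy as np

def moving_average(x, window):
    """Simple moving average used to smooth noisy D90 readings."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

def radial_coating_thickness(d90_start, d90_end):
    """Half the diameter increase = coating thickness per side."""
    return (d90_end - d90_start) / 2.0

# A D90 increase from 500.0 to 522.6 um implies ~11.3 um of coating
thickness = radial_coating_thickness(500.0, 522.6)
```

    Smoothing before differencing matters because single-frame D90 readings fluctuate by more than the ±1 μm instrument accuracy.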

  6. Famine Early Warning Systems Network (FEWS NET) Agro-climatology Analysis Tools and Knowledge Base Products for Food Security Applications

    NASA Astrophysics Data System (ADS)

    Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.

    2017-12-01

    The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps to developing agro-climatology assumptions; including: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpretation of forecast information, and 4) incorporation of monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data, based on remote sensing and field information, can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps, sea-surface temperature data and knowledge of the global climate system inform the climate modes, and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments. 
We have also developed a knowledge base for over 80 countries that provide rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficient of variation, and number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
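    Two of the knowledge-base products mentioned, percent of average and the count of years below 70% of the average, are simple anomaly statistics. A sketch (the rainfall totals are invented, not FEWS NET data):

```python
import numpy as np

def percent_of_average(current, history):
    """Current season's total as a percentage of the historical mean."""
    return 100.0 * current / float(np.mean(history))

def years_below_threshold(history, frac=0.7):
    """Number of historical years below frac * long-term average."""
    h = np.asarray(history, dtype=float)
    return int(np.sum(h < frac * h.mean()))

# e.g. seasonal rainfall totals (mm) over four years
history = [100.0, 100.0, 60.0, 100.0]
pct = percent_of_average(45.0, history)      # 50.0
bad_years = years_below_threshold(history)   # 1
```

    In practice the same computations run per pixel on gridded rainfall or NDVI time series, which is what yields the country-level summary maps described above.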

  7. Additional self-monitoring tools in the dietary modification component of The Women's Health Initiative.

    PubMed

    Mossavar-Rahmani, Yasmin; Henry, Holly; Rodabough, Rebecca; Bragg, Charlotte; Brewer, Amy; Freed, Trish; Kinzel, Laura; Pedersen, Margaret; Soule, C Oehme; Vosburg, Shirley

    2004-01-01

    Self-monitoring promotes behavior changes by promoting awareness of eating habits and creates self-efficacy. It is an important component of the Women's Health Initiative dietary intervention. During the first year of intervention, 74% of the total sample of 19,542 dietary intervention participants self-monitored. As the study progressed the self-monitoring rate declined to 59% by spring 2000. Participants were challenged by inability to accurately estimate fat content of restaurant foods and the inconvenience of carrying bulky self-monitoring tools. In 1996, a Self-Monitoring Working Group was organized to develop additional self-monitoring options that were responsive to participant needs. This article describes the original and additional self-monitoring tools and trends in tool use over time. Original tools were the Food Diary and Fat Scan. Additional tools include the Keeping Track of Goals, Quick Scan, Picture Tracker, and Eating Pattern Changes instruments. The additional tools were used by the majority of participants (5,353 of 10,260 or 52% of participants who were self-monitoring) by spring 2000. Developing self-monitoring tools that are responsive to participant needs increases the likelihood that self-monitoring can enhance dietary reporting adherence, especially in long-term clinical trials.

  8. Automated terrestrial laser scanning with near-real-time change detection - monitoring of the Séchilienne landslide

    NASA Astrophysics Data System (ADS)

    Kromer, Ryan A.; Abellán, Antonio; Hutchinson, D. Jean; Lato, Matt; Chanut, Marie-Aurelie; Dubois, Laurent; Jaboyedoff, Michel

    2017-05-01

    We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap of high-temporal-resolution TLS monitoring studies of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool in monitoring landslide and rockfall processes despite missing points due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.
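    Change detection between successive scans, at its simplest, compares each point of a new cloud with its nearest neighbour in the reference cloud and flags distances above a threshold. A brute-force sketch (production pipelines use spatial indexing and more robust M3C2-style averaging; the points below are invented):

```python
import numpy as np

def nn_distances(reference, compare):
    """Distance from each point in `compare` (Nx3) to its nearest
    neighbour in `reference` (Mx3), brute force."""
    d = np.linalg.norm(compare[:, None, :] - reference[None, :, :], axis=2)
    return d.min(axis=1)

def changed_points(reference, compare, threshold):
    """Indices of points that moved more than `threshold`."""
    return np.where(nn_distances(reference, compare) > threshold)[0]

ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
new = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.5]])  # one point moved 0.5 m
```

    Running this comparison automatically on each 30 min scan is what turns a TLS instrument into the near-real-time monitoring system described.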

  9. Robot Tracer with Visual Camera

    NASA Astrophysics Data System (ADS)

    Jabbar Lubis, Abdul; Dwi Lestari, Yuyun; Dafitri, Haida; Azanuddin

    2017-12-01

    A robot is a versatile tool that can take over human work functions. A robot is a device that can be reprogrammed according to user needs. A wireless network for remote monitoring can be used to build a robot whose movement can be monitored and checked against a blueprint, tracking the path the robot has chosen. This data is sent over a wireless network. For vision, the robot uses a high-resolution camera to help the operator control the robot and see the surrounding environment.

  10. Tools to improve planning, implementation, monitoring, and evaluation of complementary feeding programmes.

    PubMed

    Untoro, Juliawati; Childs, Rachel; Bose, Indira; Winichagoon, Pattanee; Rudert, Christiane; Hall, Andrew; de Pee, Saskia

    2017-10-01

    Adequate nutrient intake is a prerequisite for achieving good nutrition status. Suboptimal complementary feeding practices are a main risk factor for stunting. The need for systematic and user-friendly tools to guide the planning, implementation, monitoring, and evaluation of dietary interventions for children aged 6-23 months has been recognized. This paper describes five tools, namely, ProPAN, Optifood, Cost of the Diet, Fill the Nutrient Gap, and Monitoring Results for Equity System that can be used in different combinations to improve situation analysis, planning, implementation, monitoring, or evaluation approaches for complementary feeding in a particular context. ProPAN helps with development of strategies and activities designed to change the behaviours of the target population. Optifood provides guidance for developing food-based recommendations. The Cost of the Diet can provide insight on economic barriers to accessing a nutritious and balanced diet. The Fill the Nutrient Gap facilitates formulation of context-specific policies and programmatic approaches to improve nutrient intake, through a multistakeholder process that uses insights from linear programming and secondary data. The Monitoring Results for Equity System helps with analysis of gaps, constraints, and determinants of complementary feeding interventions and adoption of recommended practices especially in the most vulnerable and deprived populations. These tools, and support for their use, are readily available and can be used either alone and/or complementarily throughout the programme cycle to improve infant and young child-feeding programmes at subnational and national levels. © 2017 John Wiley & Sons Ltd.

  11. In Situ 3D Monitoring of Geometric Signatures in the Powder-Bed-Fusion Additive Manufacturing Process via Vision Sensing Methods

    PubMed Central

    Li, Zhongwei; Liu, Xingjian; Wen, Shifeng; He, Piyao; Zhong, Kai; Wei, Qingsong; Shi, Yusheng; Liu, Sheng

    2018-01-01

    Lack of monitoring of in situ process signatures is one of the challenges that has been restricting the improvement of Powder-Bed-Fusion Additive Manufacturing (PBF AM). Among various process signatures, the monitoring of geometric signatures is of high importance. This paper presents the use of vision sensing methods as a non-destructive in situ 3D measurement technique to monitor two main categories of geometric signatures: 3D surface topography and 3D contour data of the fusion area. To increase the efficiency and accuracy, an enhanced phase measuring profilometry (EPMP) is proposed to monitor the 3D surface topography of the powder bed and the fusion area reliably and rapidly. A slice-model-assisted contour detection method is developed to extract the contours of the fusion area. The performance of the techniques is demonstrated with selected measurements. Experimental results indicate that the proposed method can reveal irregularities caused by various defects and inspect the contour accuracy and surface quality. It holds the potential to be a powerful in situ 3D monitoring tool for manufacturing process optimization, closed-loop control, and data visualization. PMID:29649171

  12. Connecting World Heritage Nominations and Monitoring with the Support of the Silk Roads Cultural Heritage Resource Information System

    NASA Astrophysics Data System (ADS)

    Vileikis, O.; Dumont, B.; Serruys, E.; Van Balen, K.; Tigny, V.; De Maeyer, P.

    2013-07-01

    Serial transnational World Heritage nominations are challenging the way cultural heritage has been managed and evaluated in the past. Such nominations are unique in that they consist of multiple sites listed as one property, distributed across different countries and involving a large diversity of stakeholders in the process. As a result, there is a need for precise baseline information for monitoring, reporting and decision making. This type of nomination requires different methodologies and tools to improve the monitoring cycle from the beginning of the nomination through to the periodic reporting. The case study of the Silk Roads Cultural Heritage Resource Information System (CHRIS) illustrates the use of a Geographical Content Management System (Geo-CMS) supporting the serial transnational World Heritage nomination and the monitoring of the Silk Roads in the five Central Asian countries. The Silk Roads CHRIS is an initiative supported by UNESCO World Heritage Centre (WHC) and the Belgian Federal Science Policy Office (BELSPO), and developed by a consortium headed by the Raymond Lemaire International Centre for Conservation (RLICC) at the KULeuven. The Silk Roads CHRIS has been successfully assisting in the preparation of the nomination dossiers of the Republics of Kazakhstan, Tajikistan and Uzbekistan and will be used as a monitoring tool in the Central Asian countries.

  13. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    PubMed

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times to induce changes in water content. By analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is a highly promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
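
    The curve-resolution step factorizes a pixels-by-wavelengths matrix into component spectra and abundance maps. A minimal sketch of that idea, using scikit-learn's NMF as a generic non-negative bilinear factorization in place of the authors' MCR implementation (all data here are synthetic):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_pixels, n_wavelengths, n_components = 200, 50, 2

# Synthetic pure-component spectra (non-negative Gaussian bands)
x = np.linspace(0, 1, n_wavelengths)
spectra = np.stack([np.exp(-((x - 0.3) / 0.05) ** 2),
                    np.exp(-((x - 0.7) / 0.08) ** 2)])
conc = rng.random((n_pixels, n_components))         # per-pixel abundances
data = conc @ spectra + 0.01 * rng.random((n_pixels, n_wavelengths))

model = NMF(n_components=n_components, init="nndsvda", max_iter=500)
abundance_maps = model.fit_transform(data)          # pixels x components
resolved_spectra = model.components_                # components x wavelengths
print(abundance_maps.shape, resolved_spectra.shape)
```

    Reshaping `abundance_maps` back to the image grid gives per-component distribution maps analogous to the solid-form maps described in the abstract.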

  14. Cyclostationarity approach for monitoring chatter and tool wear in high speed milling

    NASA Astrophysics Data System (ADS)

    Lamraoui, M.; Thomas, M.; El Badaoui, M.

    2014-02-01

    Detection of chatter and tool wear is crucial in the machining process, and their monitoring is a key issue for (1) ensuring better surface quality, (2) increasing productivity and (3) protecting both the machine and the workpiece. This paper presents an investigation of chatter and tool wear using the cyclostationary method to process vibration signals acquired from high speed milling. Experimental cutting tests were performed on a slot milling operation of an aluminum alloy. The experimental set-up was designed for acquisition of accelerometer signals and encoder information. The encoder signal is used for re-sampling the accelerometer signals in the angular domain using a specific algorithm developed in the LASPI laboratory. Cyclostationary analysis of the accelerometer signals has been applied for monitoring chatter and tool wear in high speed milling. Cyclostationarity appears in the average properties of the signals (first order) and in their energetic properties (second order), and it generates spectral lines at cyclic frequencies in the spectral correlation. Angular power and kurtosis are used to analyze the chatter phenomenon. The formation of chatter is characterized by unstable, chaotic motion of the tool and strong anomalous fluctuations of cutting forces. Results show that stable machining generates only very few second-order cyclostationary components, while chatter is strongly correlated with second-order cyclostationary components. When machining in the unstable region, chatter results in flat angular kurtosis and flat angular power, like a pseudo-random (white) signal with a flat spectrum. Results reveal that the spectral correlation and the Wigner-Ville spectrum (or integrated Wigner-Ville) issued from second-order cyclostationarity are efficient indicators for the early diagnosis of faults in high speed machining, such as chatter, tool wear and bearing faults, compared with traditional stationary methods. The Wigner-Ville representation of the residual signal shows that the energy corresponding to the tooth passing decreases when the chatter phenomenon occurs. The effect of tool wear and of the number of broken teeth on the excitation of structure resonances also appears in the Wigner-Ville representation.
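
    The angular re-sampling that underpins this kind of analysis can be sketched as follows; this is a generic order-tracking illustration on synthetic data, not the LASPI algorithm:

```python
import numpy as np

def angular_resample(signal, shaft_angle, samples_per_rev, n_revs):
    """Resample a time-domain signal onto a uniform shaft-angle grid.

    shaft_angle: cumulative shaft angle (rad) at each time sample,
    e.g. integrated from an encoder; assumed monotonically increasing.
    """
    target = np.linspace(0, 2 * np.pi * n_revs, samples_per_rev * n_revs,
                         endpoint=False)
    return np.interp(target, shaft_angle, signal)

def synchronous_average(angular_signal, samples_per_rev):
    """First-order cyclostationary part: average over revolutions."""
    return angular_signal.reshape(-1, samples_per_rev).mean(axis=0)

# Synthetic run: tooth-passing vibration at 4 events/rev on an accelerating shaft
n = 20000
t = np.linspace(0, 2.0, n)
angle = 2 * np.pi * (5 * t + 1.0 * t ** 2)       # shaft speeds up over time
sig = np.cos(4 * angle) + 0.3 * np.random.default_rng(1).standard_normal(n)

ang_sig = angular_resample(sig, angle, samples_per_rev=256, n_revs=10)
mean_cycle = synchronous_average(ang_sig, 256)
residual = ang_sig - np.tile(mean_cycle, 10)      # second-order content
print(mean_cycle.shape, residual.std() < sig.std())
```

    The residual (signal minus synchronous average) is what second-order indicators such as the spectral correlation or angular kurtosis are computed from.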

  15. Validity and reliability of food security measures.

    PubMed

    Cafiero, Carlo; Melgar-Quiñonez, Hugo R; Ballard, Terri J; Kepple, Anne W

    2014-12-01

    This paper reviews some of the existing food security indicators, discussing the validity of the underlying concept and the expected reliability of measures under reasonably feasible conditions. The main objective of the paper is to raise awareness of the trade-offs between different qualities of possible food security measurement tools that must be taken into account when such tools are proposed for practical application, especially for use within an international monitoring framework. The hope is to provide a timely, useful contribution to the process leading to the definition of a food security goal and the associated monitoring framework within the post-2015 Development Agenda. © 2014 New York Academy of Sciences.

  16. Development of a golf-specific load monitoring tool: Content validity and feasibility.

    PubMed

    Williams, Scott B; Gastin, Paul B; Saw, Anna E; Robertson, Sam

    2018-05-01

    Athletes often record details of their training and competitions, supplemented by information such as environmental conditions, travel, and how they felt. However, it is not known how prevalent these practices are among golfers, or how valuable this process is perceived to be. The purpose of this study was to develop a golf-specific load monitoring tool (GLMT) and establish the content validity and feasibility of this tool amongst high-level golfers. In the first phase of development, 21 experts were surveyed to determine the suitability of items for inclusion in the GLMT. Of the 36 items, 21 received >78% agreement, the requirement for establishing content validity and inclusion in the GLMT. Total duration was the preferred metric for golf-specific activities, whilst rating of perceived exertion (RPE) was preferred for measuring physical training. In the second phase, feasibility of the tool was assessed by surveying 13 high-level male golfers following 28 days of daily GLMT use. All items included in the GLMT were deemed feasible to record, with all players participating in the feasibility study providing high to very high ratings. Golfers responded that they would consider using a load monitoring tool of this nature long term, provided it can be completed in less than five minutes per day.

  17. Novel texture-based descriptors for tool wear condition monitoring

    NASA Astrophysics Data System (ADS)

    Antić, Aco; Popović, Branislav; Krstanović, Lidija; Obradović, Ratko; Milošević, Mijodrag

    2018-01-01

    All state-of-the-art tool condition monitoring (TCM) systems for the tool wear recognition task, especially those that use vibration sensors, depend heavily on the choice of descriptors extracted from the sensor signals that carry information about the tool wear state. No post-processing technique can increase the recognition precision if those descriptors are not discriminative enough. In this work, we propose a tool wear monitoring strategy that relies on novel texture-based descriptors. We consider the modulus of the Short-Term Discrete Fourier Transform (STDFT) spectrum obtained from a given vibration sensor signal utterance as a 2D textured image, identifying the time scale of the STDFT as the first dimension and the frequency scale as the second dimension of the image. The obtained textured image is then divided into 2D texture patches, each covering a part of the frequency range of interest. After applying an appropriate filter bank, 2D textons are extracted for each predefined frequency band. By averaging in time, we extract from the textons of each band of interest information regarding the probability density function (PDF) in the form of lower-order moments, thus obtaining robust tool wear state descriptors. We validate the proposed features in experiments conducted on a real TCM system, obtaining high recognition accuracy.
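
    The descriptor pipeline can be illustrated in simplified form: treat the |STDFT| as a time-frequency image, split it into frequency bands, and summarize each band by lower-order moments. This is a hedged stand-in for the full filter-bank/texton step; all parameters and data are illustrative:

```python
import numpy as np

def stdft_band_moments(signal, frame_len, hop, n_bands):
    """Texture-style descriptors from a Short-Term DFT magnitude image.

    The |STDFT| is treated as a 2D image (time x frequency); the frequency
    axis is split into bands and each band is summarized by lower-order
    moments (mean, std, skewness) of its pixel values.
    """
    frames = np.lib.stride_tricks.sliding_window_view(signal, frame_len)[::hop]
    spec = np.abs(np.fft.rfft(frames * np.hanning(frame_len), axis=1))
    bands = np.array_split(spec, n_bands, axis=1)
    feats = []
    for b in bands:
        v = b.ravel()
        m, s = v.mean(), v.std()
        skew = ((v - m) ** 3).mean() / (s ** 3 + 1e-12)
        feats.extend([m, s, skew])
    return np.array(feats)

rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * 0.05 * np.arange(4096)) + 0.1 * rng.standard_normal(4096)
desc = stdft_band_moments(x, frame_len=256, hop=128, n_bands=4)
print(desc.shape)  # (12,): 4 bands x 3 moments
```

    A classifier trained on such per-band moment vectors would then map them to discrete wear states.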

  18. Cleanliness inspection tool for RSRM bond surfaces

    NASA Technical Reports Server (NTRS)

    Mattes, Robert A.

    1995-01-01

    Using optically stimulated electron emission (OSEE), Thiokol has monitored bond surfaces in process for contamination on the Redesigned Solid Rocket Motor (RSRM). This technique provides process control information to help assure bond surface quality and repeatability prior to bonding. This paper will describe OSEE theory of operation and the instrumentation implemented at Thiokol Corporation since 1987. Data from process hardware will be presented.

  19. Planning and Evaluation Framework: A Five Year Program for the Department of Libraries, Commonwealth of Kentucky.

    ERIC Educational Resources Information Center

    Kentucky State Dept. of Libraries, Frankfort.

    This document is the beginning of a process whose objective is to improve decisions between alternative choices in the development of statewide library services. Secondary functions are to develop the tools for providing information relevant to decisions, to measure and monitor services, and to aid in the communication process. The…

  20. Carbon footprint analysis as a tool for energy and environmental management in small and medium-sized enterprises

    NASA Astrophysics Data System (ADS)

    Giama, E.; Papadopoulos, A. M.

    2018-01-01

    The reduction of carbon emissions has become a top priority in the decision-making process for governments and companies, with the strict European legislative framework being a major driving force behind this effort. On the other hand, many companies face difficulties in estimating their footprint and in linking the results derived from environmental evaluation processes with an integrated energy management strategy that will eventually lead to energy-efficient and cost-effective solutions. The paper highlights the need for companies to establish integrated environmental management practices, with tools such as carbon footprint analysis to monitor the energy performance of production processes. Concepts and methods are analysed, and selected indicators are presented by means of benchmarking, monitoring and reporting of the results so that they can be used effectively by the companies. The study is based on data from more than 90 Greek small and medium enterprises, followed by a comprehensive discussion of cost-effective and realistic energy-saving measures.

  1. Hyperspectral imaging using near infrared spectroscopy to monitor coat thickness uniformity in the manufacture of a transdermal drug delivery system.

    PubMed

    Pavurala, Naresh; Xu, Xiaoming; Krishnaiah, Yellela S R

    2017-05-15

    Hyperspectral imaging using near-infrared spectroscopy (NIRS) integrates spectroscopy and conventional imaging to obtain both spectral and spatial information about materials. The non-invasive and rapid nature of hyperspectral imaging using NIRS makes it a valuable process analytical technology (PAT) tool for in-process monitoring and control of the manufacturing process for transdermal drug delivery systems (TDS). The focus of this investigation was to develop and validate the use of near-infrared (NIR) hyperspectral imaging to monitor coat thickness uniformity, a critical quality attribute (CQA) for TDS. Chemometric analysis was used to process the hyperspectral image, and a partial least squares (PLS) model was developed to predict the coat thickness of the TDS. The goodness of model fit and prediction were both 0.9933, indicating an excellent fit to the training data and good predictability. The prediction error (%PE) for internal and external validation samples was less than 5%, confirming the accuracy of the PLS model developed in the present study. The feasibility of hyperspectral imaging as a real-time process analytical tool for continuous processing was also investigated. When the PLS model was applied to detect deliberate variation in coating thickness, it was able to predict both small and large variations as well as identify coating defects such as non-uniform regions and the presence of air bubbles. Published by Elsevier B.V.

  2. Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.

    PubMed

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang

    2016-11-01

    Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types; both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification with a modified genetic algorithm for discrete multi-location, multi-type search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently, and with easily accessible tools, prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
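
    The linear-Gaussian data-worth logic behind such designs can be sketched with a toy greedy placement: pick sensors one at a time to minimize the posterior variance of a scalar prediction. This is a simplified stand-in for the PEST data-worth utilities plus genetic algorithm; all matrices are synthetic:

```python
import numpy as np

def greedy_placement(J, Cp, g, noise_var, k):
    """Greedily pick k sensor rows of J minimizing posterior variance of g @ theta.

    Linear-Gaussian update: posterior parameter covariance
        C = inv(inv(Cp) + J_s^T J_s / noise_var)
    where J_s holds the sensitivity rows of the selected sensors.
    """
    chosen, remaining = [], list(range(J.shape[0]))
    for _ in range(k):
        best, best_var = None, np.inf
        for i in remaining:
            rows = J[chosen + [i]]
            C = np.linalg.inv(np.linalg.inv(Cp) + rows.T @ rows / noise_var)
            var = g @ C @ g
            if var < best_var:
                best, best_var = i, var
        chosen.append(best)
        remaining.remove(best)
    return chosen, best_var

rng = np.random.default_rng(4)
J = rng.standard_normal((12, 3))       # sensitivities of 12 candidate sensors
Cp = np.eye(3)                         # prior parameter covariance
g = np.array([1.0, 0.5, -0.5])         # prediction sensitivity (e.g. travel time)
prior_var = g @ Cp @ g
sel, post_var = greedy_placement(J, Cp, g, noise_var=0.1, k=3)
print(sel, post_var < prior_var)
```

    Greedy selection ignores the data-worth interdependencies the paper targets; a genetic algorithm searching over whole sensor subsets, as the authors do, can account for them.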

  3. Recent Research applications at the Athens Neutron Monitor Station

    NASA Astrophysics Data System (ADS)

    Mavromichalaki, H.; Gerontidou, M.; Paschalis, P.; Papaioannou, A.; Paouris, E.; Papailiou, M.; Souvatzoglou, G.

    2015-08-01

    Ground-based neutron monitor measurements play a key role in the fields of space physics, solar-terrestrial relations, and space weather applications. The Athens cosmic ray group has developed several research applications, such as an optimized automated Ground Level Enhancement alert (GLE Alert Plus) and a web interface providing data from multiple neutron monitor stations (Multi-Station tool). These services are available via the Space Weather Portal operated by the European Space Agency (http://swe.ssa.esa.int). In addition, two simulation tools based on Geant4 have also been implemented: one for the simulation of cosmic ray showers in the atmosphere (DYASTIMA) and one for the simulation of the 6NM-64 neutron monitor. The contribution of these simulation tools to the calculation of the radiation dose received by air crews and passengers within the Earth's atmosphere and to neutron monitor studies is presented as well. Furthermore, the accurate calculation of the barometric coefficient and the primary data processing by filtering algorithms, such as the well-known Median Editor and the ANN Algorithm and Edge Editor developed by the Athens group, which contribute to the provision of high-quality neutron monitor data, are also discussed. Finally, a Space Weather Forecasting Center, which provides a three-day geomagnetic activity report on a daily basis, has been set up and has been operating for the last two years at the Athens Neutron Monitor Station.
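
    The barometric correction mentioned above follows the standard exponential form, and outlier filtering can be sketched with a running median. Both are illustrative sketches: the coefficient below is a typical mid-latitude figure, not the Athens station's published value, and the filter is a simplified stand-in for the Median Editor:

```python
import numpy as np

def pressure_correct(counts, pressure_hpa, beta=0.0072, p_ref=1013.25):
    """Correct a neutron monitor count rate for atmospheric pressure:
        N_corr = N * exp(beta * (P - P_ref))
    beta is the barometric coefficient (per hPa); ~0.72 %/hPa is typical."""
    return counts * np.exp(beta * (np.asarray(pressure_hpa) - p_ref))

def median_editor(series, window=5, k=3.0):
    """Replace points deviating more than k*MAD from the running median."""
    s = np.asarray(series, dtype=float)
    out = s.copy()
    half = window // 2
    for i in range(len(s)):
        w = s[max(0, i - half): i + half + 1]
        med = np.median(w)
        mad = np.median(np.abs(w - med)) + 1e-9
        if abs(s[i] - med) > k * mad:
            out[i] = med
    return out

rate = pressure_correct(95.0, 1003.25)       # 10 hPa below reference
clean = median_editor([100, 101, 250, 99, 100, 102, 101])
print(round(float(rate), 2), clean[2])
```

    The spike of 250 counts is replaced by the local median, while the surrounding ordinary fluctuations are left untouched.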

  4. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, David E.; Coble, Jamie B.; Jordan, David V.

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
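
    The multivariate pattern-recognition step can be illustrated with a generic PCA residual (Q-statistic) monitor on synthetic spectra. This is a sketch of the general technique, not the MIP Monitor's actual algorithm:

```python
import numpy as np

def fit_normal_model(normal_spectra, n_components=3):
    """Fit a PCA subspace to 'normal' gamma spectra."""
    mu = normal_spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(normal_spectra - mu, full_matrices=False)
    return mu, Vt[:n_components]

def spe_score(spectrum, mu, comps):
    """Squared prediction error (Q statistic): energy of a spectrum
    lying outside the 'normal' PCA subspace."""
    x = spectrum - mu
    resid = x - comps.T @ (comps @ x)
    return float((resid ** 2).sum())

rng = np.random.default_rng(5)
channels = 64
base = np.exp(-np.linspace(0, 4, channels))            # falling continuum
normal = base + 0.01 * rng.standard_normal((100, channels))
mu, comps = fit_normal_model(normal)

ok = spe_score(base + 0.01 * rng.standard_normal(channels), mu, comps)
anomalous = base.copy()
anomalous[20] += 0.5                                   # new gamma line appears
bad = spe_score(anomalous + 0.01 * rng.standard_normal(channels), mu, comps)
print(bad > ok)
```

    A spectrum matching normal conditions scores near the noise floor, while a new or shifted line drives the residual score up; an alarm threshold would be set from the distribution of normal scores.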

  5. Information Assurance Technology Analysis Center Information Assurance Tools Report Intrusion Detection

    DTIC Science & Technology

    1998-01-01

    [Search-result excerpt] Monitored statistics include central processing unit (CPU) usage, disk input/output (I/O), memory usage, user activity, and the number of attempted logins. Tool listing: EMERALD (Event Monitoring …), a commercial anomaly detection and system monitoring tool from SRI (porras@csl.sri.com, www.csl.sri.com/emerald/index.html); Gabriel, a commercial system monitoring tool. Once deployed with its sensors, it starts to protect the network with minimal configuration and maximum intelligence.

  6. Grid Stability Awareness System (GSAS) Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feuerborn, Scott; Ma, Jian; Black, Clifton

    The project team developed a software suite named Grid Stability Awareness System (GSAS) for power system near real-time stability monitoring and analysis based on synchrophasor measurements. The software suite consists of five analytical tools: an oscillation monitoring tool, a voltage stability monitoring tool, a transient instability monitoring tool, an angle difference monitoring tool, and an event detection tool. These tools have been integrated into one framework to provide power grid operators with both real-time or near real-time stability status of a power grid and historical information about system stability status. These tools are being considered for real-time use in the operation environment.

  7. Integrated piezoelectric actuators in deep drawing tools

    NASA Astrophysics Data System (ADS)

    Neugebauer, R.; Mainda, P.; Drossel, W.-G.; Kerschner, M.; Wolf, K.

    2011-04-01

    The production of car body panels is subject to process fluctuations; a produced panel may therefore be accurate or defective. To reduce the error rate, an intelligent deep drawing tool was developed at the Fraunhofer Institute for Machine Tools and Forming Technology IWU in cooperation with Audi and Volkswagen. Mechatronic components in a closed-loop control are the main differentiating factor between an intelligent and a conventional deep drawing tool. In combination with sensors for process monitoring, the intelligent tool uses piezoelectric actuators to actuate the deep drawing process. By enabling the use of sensors and actuators at the die, the forming tool is transformed into a smart structure. The interface between sensors and actuators is realized with a closed-loop control. This research presents the experimental results obtained with the piezoelectric actuator. For the analysis, a production-oriented forming tool meeting all automotive requirements was used. The actuators deployed are monolithic multilayer actuators from a piezo injector system. In order to achieve the required force, the actuators are combined in a cluster. The cluster is redundant and economical. In addition to the detailed assembly structures, this research highlights extensive analysis with the intelligent deep drawing tool.

  8. How To Better Track Effective School Indicators: The Control Chart Techniques.

    ERIC Educational Resources Information Center

    Coutts, Douglas

    1998-01-01

    Control charts are practical tools to monitor various school indicators (attendance rates, standardized test scores, grades, and graduation rates) by displaying data on the same scale over time. This article shows how principals can calculate the upper natural-process limit, lower natural-process limit, and upper control limit for attendance. (15…
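
    The limits named in this abstract come from the standard individuals (XmR) control-chart construction, which can be sketched as:

```python
import numpy as np

def xmr_limits(values):
    """Individuals (XmR) control-chart limits, the usual construction
    behind 'natural process limits' for indicators like attendance rates:
        UNPL = mean + 2.66 * mean moving range
        LNPL = mean - 2.66 * mean moving range
        UCL(mR) = 3.268 * mean moving range
    """
    x = np.asarray(values, dtype=float)
    mr = np.abs(np.diff(x)).mean()     # mean of successive absolute differences
    center = x.mean()
    return {"center": center,
            "UNPL": center + 2.66 * mr,
            "LNPL": center - 2.66 * mr,
            "mR_UCL": 3.268 * mr}

# Illustrative weekly attendance percentages
attendance = [94.1, 93.8, 94.6, 94.0, 93.5, 94.3, 94.2, 93.9]
lim = xmr_limits(attendance)
print(lim["LNPL"] < min(attendance), max(attendance) < lim["UNPL"])  # True True
```

    Points falling outside the natural process limits signal a change worth investigating rather than routine variation.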

  9. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic-algorithm-based search programs, which were written in C++ and used to demonstrate the capability of the GA algorithm in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
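
    A minimal genetic-algorithm search of the kind described can be sketched as follows (an illustrative real-coded GA, not the project's C++ code):

```python
import numpy as np

def genetic_search(fitness, bounds, pop_size=40, generations=60, seed=0):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend crossover, Gaussian mutation, with elitism."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, len(lo)))
    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        new = [pop[fit.argmax()].copy()]                 # keep the best (elitism)
        while len(new) < pop_size:
            i, j = rng.integers(pop_size, size=2)
            a = pop[i] if fit[i] > fit[j] else pop[j]    # tournament pick 1
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if fit[i] > fit[j] else pop[j]    # tournament pick 2
            w = rng.random(len(lo))
            child = w * a + (1 - w) * b                  # blend crossover
            child += 0.1 * rng.standard_normal(len(lo))  # Gaussian mutation
            new.append(np.clip(child, lo, hi))
        pop = np.array(new)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[fit.argmax()]

# Toy objective with maximum at (1, -2)
def f(x):
    return -((x[0] - 1) ** 2 + (x[1] + 2) ** 2)

best = genetic_search(f, (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(best)
```

    Tournament selection and elitism make the search robust to noisy fitness evaluations, which is why GAs suit the sparse, noisy datasets the abstract mentions.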

  10. Mechanical impedance measurements for improved cost-effective process monitoring

    NASA Astrophysics Data System (ADS)

    Clopet, Caroline R.; Pullen, Deborah A.; Badcock, Rodney A.; Ralph, Brian; Fernando, Gerard F.; Mahon, Steve W.

    1999-06-01

    The aerospace industry has seen considerable growth in composite usage over the past ten years, especially with the development of cost-effective manufacturing techniques such as Resin Transfer Molding and Resin Infusion under Flexible Tooling. The relatively high cost of raw material and conservative processing schedules have limited further growth in non-aerospace technologies. In-situ process monitoring has been explored for some time as a means of improving the cost efficiency of manufacturing, with dielectric spectroscopy and optical fiber sensors being the two primary techniques developed to date. A new emerging technique is discussed here, making use of piezoelectric wafers with the ability not only to sense aspects of resin flow but also to detect the change in properties of the resin as it cures. Experimental investigations to date have shown a correlation between mechanical impedance measurements and the mechanical properties of cured epoxy systems, with potential for full process monitoring.

  11. Advanced data management for optimising the operation of a full-scale WWTP.

    PubMed

    Beltrán, Sergio; Maiza, Mikel; de la Sota, Alejandro; Villanueva, José María; Ayesa, Eduardo

    2012-01-01

    The lack of appropriate data management tools is presently a limiting factor for a broader implementation and a more efficient use of sensors and analysers, monitoring systems and process controllers in wastewater treatment plants (WWTPs). This paper presents a technical solution for advanced data management of a full-scale WWTP. The solution is based on an efficient and intelligent use of the plant data by a standard centralisation of the heterogeneous data acquired from different sources, effective data processing to extract adequate information, and a straightforward connection to other emerging tools focused on the operational optimisation of the plant such as advanced monitoring and control or dynamic simulators. A pilot study of the advanced data manager tool was designed and implemented in the Galindo-Bilbao WWTP. The results of the pilot study showed its potential for agile and intelligent plant data management by generating new enriched information combining data from different plant sources, facilitating the connection of operational support systems, and developing automatic plots and trends of simulated results and actual data for plant performance and diagnosis.

  12. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    NASA Astrophysics Data System (ADS)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

    Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scrap, or to early tool replacements that waste still-serviceable tools. ISO 3685 provides the flank wear end-of-life criterion. Flank wear is also the nominal type of wear for the longest tool lifetimes under optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. In order to aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model in order to obtain results that are specific to a given level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature failed to provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.

  13. Interferometric analysis of polishing surface with a petal tool

    NASA Astrophysics Data System (ADS)

    Salas-Sánchez, Alfonso; Leal-Cabrera, Irce; Percino Zacarias, Elizabeth; Granados-Agustín, Fermín S.

    2011-09-01

    In this work, we describe phase-shift interferometric monitoring of a polishing process performed with a petal tool over a spherical surface to obtain a parabolic surface. In the process, we used a commercial polishing machine; the purpose of this work is to control the polishing time. For this analysis, we used a ZYGO Fizeau interferometer for optical shop testing, together with the Durango software from Diffraction International for data acquisition, simulation and evaluation of optical surfaces. We started the polishing process with a spherical surface of 15.46 cm diameter, 59.9 cm radius of curvature, and f/1.9.

  14. Monitoring chemical reactions by low-field benchtop NMR at 45 MHz: pros and cons.

    PubMed

    Silva Elipe, Maria Victoria; Milburn, Robert R

    2016-06-01

    Monitoring chemical reactions is the key to controlling chemical processes, an area where NMR can provide support. High-field NMR gives detailed structural information on chemical compounds and reactions; however, it is expensive and complex to operate. Conversely, low-field NMR instruments are simple and relatively inexpensive alternatives. While low-field NMR does not provide information as detailed as that of high-field instruments, as a result of the smaller chemical shift dispersion and complex second-order coupling, it remains of practical value as a process analytical technology (PAT) tool and is complementary to other established methods, such as ReactIR and Raman spectroscopy. We have tested a picoSpin-45 (currently under ThermoFisher Scientific) benchtop NMR instrument to monitor three types of reactions by 1D 1H NMR: a Fischer esterification, a Suzuki cross-coupling, and the formation of an oxime. The Fischer esterification is a relatively simple reaction run at high concentration and served as proof of concept. The Suzuki coupling is an example of a more complex, commonly used reaction involving overlapping signals. Finally, the oxime formation involved a reaction in two phases that cannot be monitored by other PAT tools. Here, we discuss the pros and cons of monitoring these reactions at a low field of 45 MHz by 1D 1H NMR. Copyright © 2015 John Wiley & Sons, Ltd.
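
    Quantitative reaction monitoring by 1H NMR typically reduces to normalizing peak integrals by proton counts and computing fractional conversion; a minimal sketch with illustrative numbers (the integrals and proton counts below are invented for the example):

```python
import numpy as np

def conversion_from_integrals(product_integral, reactant_integral,
                              n_h_product=3, n_h_reactant=3):
    """Fractional conversion from 1H NMR peak integrals, normalizing each
    integral by the number of protons giving rise to the peak.
    Proton counts here are illustrative placeholders."""
    p = product_integral / n_h_product
    r = reactant_integral / n_h_reactant
    return p / (p + r)

# Monitoring: integrals of a growing product methyl singlet vs the
# disappearing reactant methyl singlet at successive time points
product = np.array([0.0, 1.2, 2.9, 4.5, 5.4])
reactant = np.array([6.0, 4.8, 3.1, 1.5, 0.6])
conv = conversion_from_integrals(product, reactant)
print(np.round(conv, 2))
```

    Plotting `conv` against time gives the reaction-progress curve used to decide when a process has reached its endpoint.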

  15. Increasing Psychotherapists’ Adoption and Implementation of the Evidence-based Practice of Progress Monitoring

    PubMed Central

    Persons, Jacqueline B.; Koerner, Kelly; Eidelman, Polina; Thomas, Cannon; Liu, Howard

    2015-01-01

    Evidence-based practices (EBPs) reach consumers slowly because practitioners are slow to adopt and implement them. We hypothesized that giving psychotherapists a tool + training intervention that was designed to help the therapist integrate the EBP of progress monitoring into his or her usual way of working would be associated with adoption and sustained implementation of the particular progress monitoring tool we trained them to use (the Depression Anxiety Stress Scales on our Online Progress Tracking tool) and would generalize to all types of progress monitoring measures. To test these hypotheses, we developed an online progress monitoring tool and a course that trained psychotherapists to use it, and we assessed progress monitoring behavior in 26 psychotherapists before, during, immediately after, and 12 months after they received the tool and training. Immediately after receiving the tool + training intervention, participants showed statistically significant increases in use of the online tool and of all types of progress monitoring measures. Twelve months later, participants showed sustained use of any type of progress monitoring measure but not the online tool. PMID:26618237

  17. Recent advances in electronic nose techniques for monitoring of fermentation process.

    PubMed

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-12-01

    Microbial fermentation processes are often sensitive to even slight changes in conditions that may result in unacceptable end-product quality. Thus, monitoring of the process is critical for discovering unfavorable deviations as early as possible and taking appropriate measures. However, the use of traditional analytical techniques is often time-consuming and labor-intensive. In this sense, the most effective way of developing rapid, accurate and relatively economical methods for quality assurance in microbial fermentation is the use of novel chemical sensor systems. Electronic nose techniques have particular advantages for non-invasive monitoring of microbial fermentation. Therefore, in this review, we present an overview of the most important contributions dealing with quality control in microbial fermentation using electronic nose techniques. After a brief description of the fundamentals of the sensor techniques, some examples of potential monitoring applications are provided, including the implementation of control strategies and the combination with other monitoring tools (i.e. sensor fusion). Finally, on the basis of the review, the electronic nose techniques are critically discussed, and their strengths and weaknesses are highlighted. In addition, on the basis of the observed trends, we also outline the technical challenges and future outlook for electronic nose techniques.

  18. Monitoring of laser material processing using machine integrated low-coherence interferometry

    NASA Astrophysics Data System (ADS)

    Kunze, Rouwen; König, Niels; Schmitt, Robert

    2017-06-01

    Laser material processing has become an indispensable tool in modern production. With the availability of high-power pico- and femtosecond laser sources, laser material processing is advancing into applications that demand the highest accuracy, such as laser micro milling or laser drilling. In order to enable narrow tolerance windows, closed-loop monitoring of the geometrical properties of the processed workpiece is essential for achieving a robust manufacturing process. Low-coherence interferometry (LCI) is a high-precision measuring principle well known from surface metrology. In recent years, we demonstrated successful integrations of LCI into several different laser material processing methods. Within this paper, we give an overview of the different machine integration strategies, which always aim at a complete and ideally telecentric integration of the measurement device into the existing beam path of the processing laser. Thus, highly accurate depth measurements within machine coordinates and subsequent process control and quality assurance are possible. First products using this principle have already found their way to the market, which underlines the potential of this technology for the monitoring of laser material processing.
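    The depth measurement at the heart of LCI can be illustrated with a minimal sketch. In the spectral-domain variant of low-coherence interferometry, a single reflector produces a cosine fringe over wavenumber whose FFT peak position encodes depth; the wavenumber range and reflector depth below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def depth_from_spectral_interferogram(intensity, dk):
    """Estimate the dominant reflector depth from a spectral interferogram
    sampled at uniform wavenumber spacing dk (rad/m).

    The interference term for a reflector at depth z is proportional to
    cos(2*z*k), so the FFT over k peaks at bin m with z = m*pi/(N*dk)."""
    n = len(intensity)
    spectrum = np.abs(np.fft.rfft(intensity - intensity.mean()))
    m = int(np.argmax(spectrum[1:]) + 1)  # skip the DC bin
    return m * np.pi / (n * dk)

# Simulate a reflector 50 micrometres below the reference plane
z_true = 50e-6                        # metres (illustrative)
k = np.linspace(7.0e6, 8.0e6, 2048)   # wavenumbers, rad/m (near-IR range)
dk = k[1] - k[0]
signal = 1.0 + 0.5 * np.cos(2 * z_true * k)
z_est = depth_from_spectral_interferogram(signal, dk)
```

The depth resolution of this sketch is one FFT bin, pi/(N*dk), which is why a broad spectral range (large total dk span) is what buys precision in practice.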

  19. Implementation of transmission NIR as a PAT tool for monitoring drug transformation during HME processing.

    PubMed

    Islam, Muhammad T; Scoutaris, Nikolaos; Maniruzzaman, Mohammed; Moradiya, Hiren G; Halsey, Sheelagh A; Bradley, Michael S A; Chowdhry, Babur Z; Snowden, Martin J; Douroumis, Dennis

    2015-10-01

    The aim of the work reported herein was to implement process analytical technology (PAT) tools during hot melt extrusion (HME) in order to obtain a better understanding of the relationship between HME processing parameters and the extruded formulations. For the first time, two in-line NIR probes (transmission and reflectance) have been coupled with HME to monitor the extrusion of the water-insoluble drug indomethacin (IND) in the presence of Soluplus (SOL) or Kollidon VA64 hydrophilic polymers. In-line extrusion monitoring of sheets, produced via a specially designed die, was conducted at various drug/polymer ratios and processing parameters. Characterisation of the extruded transparent sheets was also undertaken by using DSC, XRPD and Raman mapping. Analysis of the experimental findings revealed the production of molecular solutions where IND is homogeneously blended (ascertained by Raman mapping) in the polymer matrices, as it acts as a plasticizer for both hydrophilic polymers. PCA of the recorded NIR signals showed that the screw speed used in HME affects the recorded spectra but not the homogeneity of the embedded drug in the polymer sheets. The IND/VA64 and IND/SOL extruded sheets displayed rapid dissolution rates, with 80% and 30% of the IND being released, respectively, within the first 20 min. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Application of process analytical technology for monitoring freeze-drying of an amorphous protein formulation: use of complementary tools for real-time product temperature measurements and endpoint detection.

    PubMed

    Schneid, Stefan C; Johnson, Robert E; Lewis, Lavinia M; Stärtzel, Peter; Gieseler, Henning

    2015-05-01

    Process analytical technology (PAT) and quality by design have gained importance in all areas of pharmaceutical development and manufacturing. One important method for monitoring of critical product attributes and process optimization in laboratory-scale freeze-drying is manometric temperature measurement (MTM). A drawback of this innovative technology is that problems are encountered when processing highly concentrated amorphous materials, particularly protein formulations. In this study, a model solution of bovine serum albumin and sucrose was lyophilized at both conservative and aggressive primary drying conditions. Different temperature sensors were employed to monitor product temperatures. The residual moisture content at the primary drying endpoints indicated by temperature sensors and batch PAT methods was quantified from extracted sample vials. The data from temperature probes were then used to recalculate critical product parameters, and the results were compared with MTM data. The endpoints indicated by the temperature sensors were not suitable for endpoint detection, in contrast to the endpoints from the batch methods. The accuracy of MTM ice vapor pressure (Pice) data was found to be influenced by water reabsorption. Recalculation of product resistance (Rp) and Pice values based on data from temperature sensors and weighed vials was possible. Overall, extensive information about critical product parameters could be obtained using data from complementary PAT tools. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
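    The recalculation of Rp from temperature-sensor and weighed-vial data can be sketched from the standard steady-state vial mass-transfer relation used in laboratory freeze-drying, dm/dt = Ap·(Pice − Pc)/Rp, with Pice obtained from the measured product temperature via the Pikal correlation for the vapor pressure of ice. The numerical values below are illustrative, drawn from the general freeze-drying literature rather than from this study:

```python
import math

def ice_vapor_pressure_torr(temp_k):
    """Vapor pressure of ice in Torr (Pikal correlation);
    gives ~4.58 Torr at the triple point, 273.16 K."""
    return 2.698e10 * math.exp(-6144.96 / temp_k)

def product_resistance(ap_cm2, p_ice_torr, p_chamber_torr, rate_g_per_h):
    """Area-normalized product resistance Rp (cm^2*Torr*h/g) from the
    steady-state sublimation relation dm/dt = Ap*(Pice - Pc)/Rp."""
    return ap_cm2 * (p_ice_torr - p_chamber_torr) / rate_g_per_h

# Illustrative values: product at -20 degC, vial inner area 3.8 cm^2,
# chamber at 60 mTorr, sublimation rate 0.30 g of ice per hour
p_ice = ice_vapor_pressure_torr(253.15)
rp = product_resistance(3.8, p_ice, 0.060, 0.30)
```

The sublimation rate here is exactly what gravimetric (weighed-vial) data supply, which is why the combination of temperature probes and weighed vials permits the recalculation described in the abstract.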

  1. Priorities for future research into asthma diagnostic tools: A PAN-EU consensus exercise from the European asthma research innovation partnership (EARIP).

    PubMed

    Garcia-Marcos, L; Edwards, J; Kennington, E; Aurora, P; Baraldi, E; Carraro, S; Gappa, M; Louis, R; Moreno-Galdo, A; Peroni, D G; Pijnenburg, M; Priftis, K N; Sanchez-Solis, M; Schuster, A; Walker, S

    2018-02-01

    The diagnosis of asthma is currently based on clinical history, physical examination and lung function, and to date there are no accurate objective tests either to confirm the diagnosis or to discriminate between different types of asthma. This consensus exercise reviews the state of the art in asthma diagnosis to identify opportunities for future investment based on the likelihood of their successful development, potential for widespread adoption and their perceived impact on asthma patients. Using a two-stage e-Delphi process and a summarizing workshop, a group of European asthma experts including health professionals, researchers, people with asthma and industry representatives ranked the potential impact of research investment in each technique or tool for asthma diagnosis and monitoring. After a systematic review of the literature, 21 statements were extracted and were the subject of the two-stage Delphi process. Eleven statements scored 3 or more and were further discussed and ranked in a face-to-face workshop. The three most important diagnostic/predictive tools were ranked as follows: "New biological markers of asthma (eg genomics, proteomics and metabolomics) as a tool for diagnosis and/or monitoring," "Prediction of future asthma in preschool children with reasonable accuracy" and "Tools to measure volatile organic compounds (VOCs) in exhaled breath." © 2018 John Wiley & Sons Ltd.

  2. A Systematic Approach to Capacity Strengthening of Laboratory Systems for Control of Neglected Tropical Diseases in Ghana, Kenya, Malawi and Sri Lanka

    PubMed Central

    Njelesani, Janet; Dacombe, Russell; Palmer, Tanith; Smith, Helen; Koudou, Benjamin; Bockarie, Moses; Bates, Imelda

    2014-01-01

    Background The lack of capacity in laboratory systems is a major barrier to achieving the aims of the London Declaration (2012) on neglected tropical diseases (NTDs). To counter this, capacity strengthening initiatives have been carried out in NTD laboratories worldwide. Many of these initiatives focus on individuals' skills or institutional processes and structures ignoring the crucial interactions between the laboratory and the wider national and international context. Furthermore, rigorous methods to assess these initiatives once they have been implemented are scarce. To address these gaps we developed a set of assessment and monitoring tools that can be used to determine the capacities required and achieved by laboratory systems at the individual, organizational, and national/international levels to support the control of NTDs. Methodology and principal findings We developed a set of qualitative and quantitative assessment and monitoring tools based on published evidence on optimal laboratory capacity. We implemented the tools with laboratory managers in Ghana, Malawi, Kenya, and Sri Lanka. Using the tools enabled us to identify strengths and gaps in the laboratory systems from the following perspectives: laboratory quality benchmarked against ISO 15189 standards, the potential for the laboratories to provide support to national and regional NTD control programmes, and the laboratory's position within relevant national and international networks and collaborations. Conclusion We have developed a set of mixed methods assessment and monitoring tools based on evidence derived from the components needed to strengthen the capacity of laboratory systems to control NTDs. Our tools help to systematically assess and monitor individual, organizational, and wider system level capacity of laboratory systems for NTD control and can be applied in different country contexts. PMID:24603407

  3. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data

    NASA Astrophysics Data System (ADS)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul

    2012-04-01

    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data that can be created by the system. This paper reports on progress in creating cyberinfrastructure tools that hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental conditions (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties, etc.). Sample data interrogation clients include those for the detection of faulty sensors and for automated modal parameter extraction.
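    A faulty-sensor interrogation client of the kind mentioned can, in its simplest form, flag channels whose output is flat-lined or physically implausible. The thresholds and channel layout below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def flag_faulty_channels(data, flat_std=1e-6, lo=-1e3, hi=1e3):
    """Flag sensor channels in a (channels x samples) array that look
    faulty: near-zero variance (stuck sensor) or readings outside a
    plausible physical range (spikes, dropouts coded as huge values)."""
    data = np.asarray(data, dtype=float)
    flat = data.std(axis=1) < flat_std
    out_of_range = (data < lo).any(axis=1) | (data > hi).any(axis=1)
    return flat | out_of_range

readings = np.array([
    [0.1, -0.2, 0.15, -0.1],   # healthy accelerometer channel
    [5.0, 5.0, 5.0, 5.0],      # stuck at a constant value
    [0.0, 2e6, 0.1, -0.1],     # spike far outside the physical range
])
faulty = flag_faulty_channels(readings)
```

In a database-centric architecture like the one described, such a check would run as a periodic client against newly uploaded records, marking suspect channels before they feed downstream analyses.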

  4. 1H NMR-based metabolic profiling for evaluating poppy seed rancidity and brewing.

    PubMed

    Jawień, Ewa; Ząbek, Adam; Deja, Stanisław; Łukaszewicz, Marcin; Młynarz, Piotr

    2015-12-01

    Poppy seeds are widely used in household and commercial confectionery. The aim of this study was to demonstrate the application of metabolic profiling for industrial monitoring of the molecular changes which occur during minced poppy seed rancidity and the brewing processes performed on raw seeds. Both forms of poppy seeds were obtained from a confectionery company. Proton nuclear magnetic resonance (1H NMR) was applied as the analytical method of choice, together with multivariate statistical data analysis. Metabolic fingerprinting was applied as a bioprocess control tool to monitor the trajectory of rancidity and the progression of brewing. Low molecular weight compounds were found to be statistically significant biomarkers of these bioprocesses. Changes in the concentrations of chemical compounds were explained relative to the biochemical processes and external conditions. The obtained results provide valuable and comprehensive information for a better understanding of the biology of rancidity and brewing, while demonstrating the potential of NMR spectroscopy combined with multivariate data analysis tools for quality control in food industries involved in the processing of oilseeds.

  5. Near infrared spectroscopy combined with multivariate analysis for monitoring the ethanol precipitation process of fraction I + II + III supernatant in human albumin separation

    NASA Astrophysics Data System (ADS)

    Li, Can; Wang, Fei; Zang, Lixuan; Zang, Hengchang; Alcalà, Manel; Nie, Lei; Wang, Mingyu; Li, Lian

    2017-03-01

    Nowadays, as a powerful process analytical tool, near infrared spectroscopy (NIRS) has been widely applied in process monitoring. In the present work, NIRS combined with multivariate analysis was used to monitor the ethanol precipitation process of fraction I + II + III (FI + II + III) supernatant in human albumin (HA) separation, to achieve qualitative and quantitative monitoring at the same time and assure the product's quality. First, a qualitative model was established by using principal component analysis (PCA) with samples from 6 of 8 normal batches, and evaluated with the remaining 2 normal batches and 3 abnormal batches. The results showed that the first principal component (PC1) score chart could be successfully used for fault detection and diagnosis. Then, two quantitative models were built with 6 of 8 normal batches to determine the content of total protein (TP) and HA separately by using a partial least squares regression (PLS-R) strategy, and the models were validated with the 2 remaining normal batches. The determination coefficient of validation (Rp2), root mean square error of cross validation (RMSECV), root mean square error of prediction (RMSEP) and ratio of performance deviation (RPD) were 0.975, 0.501 g/L, 0.465 g/L and 5.57 for TP, and 0.969, 0.530 g/L, 0.341 g/L and 5.47 for HA, respectively. The results showed that the established models could give rapid and accurate measurements of the content of TP and HA. The results of this study indicated that NIRS is an effective tool and could be successfully used for simultaneous qualitative and quantitative monitoring of the ethanol precipitation process of FI + II + III supernatant. This research has significant reference value for assuring the quality and improving the recovery ratio of HA at industrial scale by using NIRS.

  6. Near infrared spectroscopy combined with multivariate analysis for monitoring the ethanol precipitation process of fraction I+II+III supernatant in human albumin separation.

    PubMed

    Li, Can; Wang, Fei; Zang, Lixuan; Zang, Hengchang; Alcalà, Manel; Nie, Lei; Wang, Mingyu; Li, Lian

    2017-03-15

    Nowadays, as a powerful process analytical tool, near infrared spectroscopy (NIRS) has been widely applied in process monitoring. In the present work, NIRS combined with multivariate analysis was used to monitor the ethanol precipitation process of fraction I+II+III (FI+II+III) supernatant in human albumin (HA) separation, to achieve qualitative and quantitative monitoring at the same time and assure the product's quality. First, a qualitative model was established by using principal component analysis (PCA) with samples from 6 of 8 normal batches, and evaluated with the remaining 2 normal batches and 3 abnormal batches. The results showed that the first principal component (PC1) score chart could be successfully used for fault detection and diagnosis. Then, two quantitative models were built with 6 of 8 normal batches to determine the content of total protein (TP) and HA separately by using a partial least squares regression (PLS-R) strategy, and the models were validated with the 2 remaining normal batches. The determination coefficient of validation (Rp2), root mean square error of cross validation (RMSECV), root mean square error of prediction (RMSEP) and ratio of performance deviation (RPD) were 0.975, 0.501 g/L, 0.465 g/L and 5.57 for TP, and 0.969, 0.530 g/L, 0.341 g/L and 5.47 for HA, respectively. The results showed that the established models could give rapid and accurate measurements of the content of TP and HA. The results of this study indicated that NIRS is an effective tool and could be successfully used for simultaneous qualitative and quantitative monitoring of the ethanol precipitation process of FI+II+III supernatant. This research has significant reference value for assuring the quality and improving the recovery ratio of HA at industrial scale by using NIRS. Copyright © 2016 Elsevier B.V. All rights reserved.
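    The figures of merit quoted in this record follow directly from reference and predicted concentrations. A minimal sketch of RMSEP and RPD (the numbers below are synthetic, not from the study):

```python
import numpy as np

def rmsep(y_ref, y_pred):
    """Root mean square error of prediction."""
    y_ref, y_pred = np.asarray(y_ref, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_ref - y_pred) ** 2)))

def rpd(y_ref, y_pred):
    """Ratio of performance to deviation: standard deviation of the
    reference values divided by RMSEP. Values above ~3 are generally
    taken to indicate a model usable for quantitative work."""
    return float(np.std(np.asarray(y_ref, float), ddof=1) / rmsep(y_ref, y_pred))

# Synthetic validation set (g/L): each prediction off by 0.5 g/L
err = rmsep([10, 12, 14, 16], [10.5, 11.5, 14.5, 15.5])   # = 0.5
ratio = rpd([10, 12, 14, 16], [10.5, 11.5, 14.5, 15.5])   # ~ 5.16
```

By this criterion the reported RPD values of 5.57 (TP) and 5.47 (HA) sit comfortably in the usable range, which is consistent with the authors' conclusion.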

  7. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. From the total budget of $5,000, Tricia and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LERC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search. Wavelet processing is proposed to create a coarse-resolution representation of the data, providing two advantages in GA-based search: 1. We will have less data to begin with when forming search sub-spaces. 2. The search will be robust against noise, because at every level of wavelet-based decomposition the signal is separated into low-pass and high-pass components.
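    A GA-based search of the kind demonstrated can be sketched minimally as follows; the objective function, population size, and mutation scale are illustrative assumptions, not details from the project:

```python
import numpy as np

rng = np.random.default_rng(0)

def ga_maximize(fitness, bounds, pop=40, gens=60, mut=0.05):
    """Minimal real-coded genetic algorithm: tournament selection,
    arithmetic crossover, and Gaussian mutation over a 1-D search space."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=pop)
    for _ in range(gens):
        f = fitness(x)
        # tournament selection: keep the fitter of random pairs
        i, j = rng.integers(0, pop, (2, pop))
        parents = np.where(f[i] > f[j], x[i], x[j])
        # arithmetic crossover between shuffled parents
        mates = rng.permutation(parents)
        alpha = rng.uniform(0, 1, pop)
        x = alpha * parents + (1 - alpha) * mates
        # Gaussian mutation keeps diversity so noise does not trap the search
        x = np.clip(x + rng.normal(0, mut * (hi - lo), pop), lo, hi)
    return x[np.argmax(fitness(x))]

# Noisy objective with a single broad optimum at x = 2
def noisy(x):
    return -(x - 2.0) ** 2 + rng.normal(0, 0.05, np.shape(x))

best = ga_maximize(noisy, (-5.0, 5.0))
```

Because selection compares noisy fitness values, averaging pressure over many generations (rather than any single evaluation) is what drives the population toward the true optimum, which is the robustness property the abstract alludes to.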

  8. Evolution of an audit and monitoring tool into an infection prevention and control process.

    PubMed

    Denton, A; Topping, A; Humphreys, P

    2016-09-01

    In 2010, an infection prevention and control team in an acute hospital trust integrated an audit and monitoring tool (AMT) into the management regime for patients with Clostridium difficile infection (CDI). The aim of this study was to examine the mechanisms through which the implementation of the AMT influenced the care and management of patients with CDI. A constructivist grounded theory approach was used, employing semi-structured interviews with ward staff (N=8), infection prevention and control practitioners (IPCPs) (N=7) and matrons (N=8), and subsequently a theoretical sample of senior managers (N=4). All interviews were transcribed verbatim and analysed using a constant comparison approach until explanatory categories emerged. The AMT evolved into a daily review process (DRP) that became an essential aspect of the management of all patients with CDI. Participants recognized that the DRP had positively influenced the care received by patients with CDI. Two main explanatory themes emerged to offer a framework for understanding the influence of the DRP on care management: education and learning, and the development and maintenance of relationships. The use of auditing and monitoring tools as part of a daily review process may enable ward staff, matrons and IPCPs to improve patient outcomes and achieve the required levels of environmental hygiene if they act as a focal point for interaction, education and collaboration. The findings offer insights into the behavioural changes and improved patient outcomes that ensue from the implementation of a DRP. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  9. Heterogeneous Sensor Data Exploration and Sustainable Declarative Monitoring Architecture: Application to Smart Building

    NASA Astrophysics Data System (ADS)

    Servigne, S.; Gripay, Y.; Pinarer, O.; Samuel, J.; Ozgovde, A.; Jay, J.

    2016-09-01

    Concerning energy consumption and monitoring architectures, our first goal is to develop a sustainable declarative monitoring architecture for lower energy consumption, taking into account the monitoring system itself. Our second goal is to develop theoretical and practical tools to model, explore and exploit heterogeneous data from various sources in order to understand a phenomenon such as the energy consumption of a smart building versus inhabitants' social behaviours. We focus on a generic model for data acquisition campaigns based on the concept of a generic sensor. The concept of a generic sensor is centered on acquired data and on their inherent multi-dimensional structure, to support complex domain-specific or field-oriented analysis processes. We consider that a methodological breakthrough may pave the way to a deep understanding of voluminous and heterogeneous scientific data sets. Our use case concerns the energy efficiency of buildings, to understand the relationship between physical phenomena and user behaviors. The aim of this paper is to present our methodology and results concerning the architecture and user-centric tools.

  10. Managing Sustainability with the Support of Business Intelligence Methods and Tools

    NASA Astrophysics Data System (ADS)

    Petrini, Maira; Pozzebon, Marlei

    In this paper we explore the role of business intelligence (BI) in helping to support the management of sustainability in contemporary firms. The concepts of sustainability and corporate social responsibility (CSR) are among the most important themes to have emerged in the last decade at the global level. We suggest that BI methods and tools have an important but not yet well studied role to play in helping organizations implement and monitor sustainable and socially responsible business practices. Using grounded theory, the main contribution of our study is to propose a conceptual model that seeks to support the process of definition and monitoring of socio-environmental indicators and the relationship between their management and business strategy.

  11. Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools

    NASA Astrophysics Data System (ADS)

    Sánchez Pineda, A.

    2015-12-01

    We explore the potential of current web applications to create online interfaces that allow the visualization, interaction and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based analysis in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.

  12. Remote monitoring as a tool in condition assessment of a highway bridge

    NASA Astrophysics Data System (ADS)

    Tantele, Elia A.; Votsis, Renos A.; Onoufriou, Toula; Milis, Marios; Kareklas, George

    2016-08-01

    The deterioration of civil infrastructure and its subsequent maintenance is a significant problem for the responsible managing authorities. The ideal scenario is to detect deterioration and/or structural problems at an early stage so that the maintenance cost is kept low and the safety of the infrastructure remains undisputed. The current inspection regimes, implemented mostly via visual inspection, are planned at specific intervals but are not always executed on time due to shortages of expert personnel and funding. However, the introduction of technological advances into the assessment of infrastructure provides the tools to alleviate this problem. This study describes the assessment of the structural condition of a highway RC bridge using remote structural health monitoring. A monitoring plan was implemented focusing on strain measurements; as strain is a parameter influenced by environmental conditions, supplementary data are provided by temperature and wind sensors. The data are acquired using wired sensors (deployed at specific locations) which are connected to a wireless sensor unit installed at the bridge. This WSN application enables the transmission of the raw data from the field to the office for processing and evaluation. The processed data are then used to assess the condition of the bridge. This case study, which is part of an ongoing RPF research project, illustrates that remote monitoring can alleviate the problem of missed structural inspections. Additionally, it shows the potential of remote monitoring to form the main part of a fully automated smart procedure that obtains structural data, processes them and triggers an alarm when certain undesirable conditions are met.

  13. Manufacturing process applications team (MATEAM). [technology transfer in the areas of machine tools and robots

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The transfer of NASA technology to the industrial sector is reported. Presentations to the machine tool and robot industries and direct technology transfers of the Adams Manipulator arm, a-c motor control, and the bolt tension monitor are discussed. A listing of proposed RTOP programs with strong potential is included. A detailed description of the rotor technology available to industry is given.

  14. Development of a matrix-assisted laser desorption ionization mass spectrometric method for rapid process-monitoring of phthalocyanine compounds.

    PubMed

    Chen, Yi-Ting; Wang, Fu-Shing; Li, Zhendong; Li, Liang; Ling, Yong-Chien

    2012-07-29

    Phthalocyanines (PCs), an important class of chemicals widely used in many industrial sectors, are macrocyclic compounds possessing a heteroaromatic π-electron system with optical properties influenced by chemical structures and impurities or by-products introduced during the synthesis process. Analytical tools allowing for rapid monitoring of the synthesis processes are of significance for the development of new PCs with improved performance in many application areas. In this work, we report a matrix-assisted laser desorption/ionization (MALDI) time-of-flight mass spectrometry (TOFMS) method for rapid and convenient monitoring of PC synthesis reactions. For this class of compounds, intact molecular ions could be detected by MALDI using retinoic acid as matrix. It was shown that relative quantification results of two PC compounds could be generated by MALDI MS. This method was applied to monitor the bromination reactions of nickel- and copper-containing PCs. It was demonstrated that, compared to the traditional UV-visible method, the MALDI MS method offers the advantage of higher sensitivity while providing chemical species and relative quantification information on the reactants and products, which are crucial to process monitoring. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Easily configured real-time CPOE Pick Off Tool supporting focused clinical research and quality improvement.

    PubMed

    Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A

    2014-01-01

    Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
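    The query auto-generation step described here can be sketched in a few lines. The table and column names below are hypothetical stand-ins, not those of the actual tool, and the whitelisting mirrors the general practice of keeping user-assembled criteria from becoming injectable SQL:

```python
def build_order_query(criteria):
    """Build a parameterized SQL query from (column, operator, value)
    criteria, as a pick-off tool might auto-generate from a web form.
    Columns and operators are checked against whitelists so that only
    values travel to the database as parameters."""
    allowed_cols = {"order_type", "order_status", "entered_at", "patient_unit"}
    allowed_ops = {"=", "!=", "<", ">", "<=", ">=", "LIKE"}
    clauses, params = [], []
    for col, op, value in criteria:
        if col not in allowed_cols or op not in allowed_ops:
            raise ValueError(f"disallowed criterion: {col} {op}")
        clauses.append(f"{col} {op} %s")
        params.append(value)
    sql = "SELECT patient_id, order_id FROM orders"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

# e.g. the 'isolation orders' use case from the abstract
sql, params = build_order_query([
    ("order_type", "=", "isolation"),
    ("entered_at", ">", "2014-01-01"),
])
```

Executed at recurring intervals against the order-entry system's output, such a generated query is the "limited approach" the authors contrast with a full data warehouse.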

  16. Fiber‐optic distributed temperature sensing: A new tool for assessment and monitoring of hydrologic processes

    USGS Publications Warehouse

    Lane, John W.; Day-Lewis, Frederick D.; Johnson, Carole D.; Dawson, Cian B.; Nelms, David L.; Miller, Cheryl; Wheeler, Jerrod D.; Harvey, Charles F.; Karam, Hanan N.

    2008-01-01

    Fiber‐optic distributed temperature sensing (FO DTS) is an emerging technology for characterizing and monitoring a wide range of important earth processes. FO DTS utilizes laser light to measure temperature along the entire length of standard telecommunications optical fibers. The technology can measure temperature every meter over FO cables up to 30 kilometers (km) long. Commercially available systems can measure fiber temperature as often as 4 times per minute, with thermal precision ranging from 0.1 to 0.01 °C depending on measurement integration time. In 2006, the U.S. Geological Survey initiated a project to demonstrate and evaluate DTS as a technology to support hydrologic studies. This paper demonstrates the potential of the technology to assess and monitor hydrologic processes through case‐study examples of FO DTS monitoring of stream‐aquifer interaction on the Shenandoah River near Locke's Mill, Virginia, and on Fish Creek, near Jackson Hole, Wyoming, and estuary‐aquifer interaction on Waquoit Bay, Falmouth, Massachusetts. The ability to continuously observe temperature over large spatial scales with high spatial and temporal resolution provides a new opportunity to observe and monitor a wide range of hydrologic processes with application to other disciplines including hazards, climate‐change, and ecosystem monitoring.

  17. Image processing developments and applications for water quality monitoring and trophic state determination

    NASA Technical Reports Server (NTRS)

    Blackwell, R. J.

    1982-01-01

    The use of remote sensing data analysis for water quality monitoring is evaluated. Data analysis and image processing techniques are applied to LANDSAT remote sensing data to produce an effective operational tool for lake water quality surveying and monitoring. Digital image processing and analysis techniques were designed, developed, tested, and applied to LANDSAT multispectral scanner (MSS) data and conventional surface acquired data. Utilization of these techniques facilitates the surveying and monitoring of large numbers of lakes in an operational manner. Supervised multispectral classification, when used in conjunction with surface acquired water quality indicators, is used to characterize water body trophic status. Unsupervised multispectral classification, when interpreted by lake scientists familiar with a specific water body, yields classifications of equal validity with supervised methods and in a more cost-effective manner. Image data base technology is used to great advantage in characterizing other contributing effects to water quality. These effects include drainage basin configuration, terrain slope, soil, precipitation and land cover characteristics.

  18. Designs for Risk Evaluation and Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy's National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool comprises three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user's manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
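    The simulated annealing search described above can be sketched under simplifying assumptions. In the toy Python version below, the detection-time matrix, sensor budget, cooling schedule, and swap-one-location mutation are illustrative stand-ins, not DREAM's actual implementation, data formats, or parameters.

```python
import math
import random

random.seed(0)

# Hypothetical leak-simulation output: detection_time[scenario][location] is the
# time at which a sensor at that location first detects the leak (inf = never).
INF = float("inf")
detection_time = [
    [5.0, INF, 20.0, INF],
    [INF, 8.0, INF, 30.0],
    [12.0, INF, 6.0, INF],
]
N_LOCATIONS, BUDGET = 4, 2

def cost(scheme):
    """Average time to first detection across leak scenarios for a sensor set."""
    return sum(min(row[j] for j in scheme) for row in detection_time) / len(detection_time)

def anneal(steps=2000, temp0=10.0):
    scheme = set(random.sample(range(N_LOCATIONS), BUDGET))
    best = scheme.copy()
    for step in range(steps):
        temp = temp0 * (1 - step / steps) + 1e-9
        # Mutate: swap one monitored location for an unmonitored one.
        cand = scheme.copy()
        cand.remove(random.choice(list(cand)))
        cand.add(random.choice([j for j in range(N_LOCATIONS) if j not in cand]))
        delta = cost(cand) - cost(scheme)
        # Accept improvements always; accept worse schemes with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            scheme = cand
        if cost(scheme) < cost(best):
            best = scheme.copy()
    return best

best = anneal()
print(sorted(best), cost(best))
```

    With this toy matrix, locations 0 and 1 together cover all three leak scenarios with the smallest average detection time, so the search settles there rather than on any single-scenario optimum.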

  19. Friction Stir Welding of Metal Matrix Composites for use in aerospace structures

    NASA Astrophysics Data System (ADS)

    Prater, Tracie

    2014-01-01

    Friction Stir Welding (FSW) is a relatively young solid state joining technique developed at The Welding Institute (TWI) in 1991. The process was first used at NASA to weld the super lightweight external tank for the Space Shuttle. Today FSW is used to join structural components of the Delta IV, Atlas V, and Falcon 9 rockets as well as the Orion Crew Exploration Vehicle. A current focus of FSW research is to extend the process to new materials which are difficult to weld using conventional fusion techniques. Metal Matrix Composites (MMCs) consist of a metal alloy reinforced with ceramics and have a very high strength-to-weight ratio, a property which makes them attractive for use in aerospace and defense applications. MMCs have found use in the space shuttle orbiter's structural tubing, the Hubble Space Telescope's antenna mast, control surfaces and propulsion systems for aircraft, and tank armors. The size of MMC components is severely limited by difficulties encountered in joining these materials using fusion welding. Melting of the material results in formation of an undesirable phase (formed when molten aluminum reacts with the reinforcement) which leaves a strength-depleted region along the joint line. Since FSW occurs below the melting point of the workpiece material, this deleterious phase is absent in FSW MMC joints. FSW of MMCs is, however, plagued by rapid wear of the welding tool, a consequence of the large discrepancy in hardness between the steel tool and the reinforcement material. This work characterizes the effect of process parameters (spindle speed, traverse rate, and length of joint) on the wear process. Based on the results of these experiments, a phenomenological model of the wear process was constructed based on the rotating plug model for FSW. The effectiveness of harder tool materials (such as tungsten carbide, high-speed steel, and tools with diamond coatings) in combating abrasive wear is explored. 
In-process force, torque, and vibration signals are analyzed to assess the feasibility of on-line monitoring of tool shape changes as a result of wear (an advancement which would eliminate the need for off-line evaluation of tool condition during joining). Monitoring, controlling, and reducing tool wear in FSW of MMCs is essential to the implementation of these materials in structures (such as launch vehicles) where they would be of maximum benefit.

  20. Processing and Quality Monitoring for the ATLAS Tile Hadronic Calorimeter Data

    NASA Astrophysics Data System (ADS)

    Burghgrave, Blake; ATLAS Collaboration

    2017-10-01

    An overview is presented of Data Processing and Data Quality (DQ) Monitoring for the ATLAS Tile Hadronic Calorimeter. Calibration runs are monitored from a data quality perspective and used as a cross-check for physics runs. Data quality in physics runs is monitored extensively and continuously. Any problems are reported and immediately investigated. The DQ efficiency achieved was 99.6% in 2012 and 100% in 2015, after the detector maintenance in 2013-2014. Changes to detector status or calibrations are entered into the conditions database (DB) during a brief calibration loop between the end of a run and the beginning of bulk processing of the data collected during it. Bulk-processed data are reviewed and certified for the ATLAS Good Run List if no problem is detected. Experts maintain the tools used by DQ shifters and the calibration teams during normal operation, and prepare new conditions for data reprocessing and Monte Carlo (MC) production campaigns. Conditions data are stored in three databases: an online DB, an offline DB for data, and a special DB for Monte Carlo. Database updates can be performed through a custom-made web interface.

  1. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-Ray Fluorescence (XRF) analysis as a Process Analytical Technology (PAT) to evaluate a small particle coating process. The XRF analysis was used to monitor the coating level during small particle coating in an at-line manner. Small particle coating processes usually consist of multiple coating steps. This study used simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores; particles with this two-layer coating are sufficient to represent the small particle coating process. The results showed that the XRF signals from the first coating (layering) and the second coating (mask coating) reflected the extent of coating through different mechanisms. Furthermore, coating of particles of different sizes was also investigated to evaluate the size effect in these coating processes. From these results, it was concluded that XRF can be used as a PAT for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  2. The use of a quartz crystal microbalance as an analytical tool to monitor particle/surface and particle/particle interactions under dry ambient and pressurized conditions: a study using common inhaler components.

    PubMed

    Turner, N W; Bloxham, M; Piletsky, S A; Whitcombe, M J; Chianella, I

    2016-12-19

    Metered dose inhalers (MDI) and multidose powder inhalers (MDPI) are commonly used for the treatment of chronic obstructive pulmonary diseases and asthma. Currently, analytical tools to monitor particle/particle and particle/surface interactions within MDI and MDPI at the macro-scale do not exist. A simple tool capable of measuring such interactions would ultimately enable quality control of MDI and MDPI, producing remarkable benefits for the pharmaceutical industry and the users of inhalers. In this paper, we have investigated whether a quartz crystal microbalance (QCM) could become such a tool. A QCM was used to measure particle/particle and particle/surface interactions on the macro-scale by adding small amounts of MDPI components, in powder form, into a gas stream. The subsequent interactions with materials on the surface of the QCM sensor were analyzed. Following this, the sensor was used to measure fluticasone propionate, a typical MDI active ingredient, in a pressurized gas system to assess its interactions with different surfaces under conditions mimicking the manufacturing process. In both types of experiments the QCM was capable of discriminating interactions of different components and surfaces. The results have demonstrated that the QCM is a suitable platform for monitoring macro-scale interactions and could possibly become a tool for quality control of inhalers.
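    The mass-sensing principle behind a QCM can be made concrete with the Sauerbrey equation, which relates a measured frequency shift to deposited mass for a thin, rigid, uniform film. The crystal frequency, electrode area, and frequency shift in the sketch below are illustrative values, not numbers from the paper; the quartz material constants are standard.

```python
import math

def sauerbrey_mass(delta_f_hz, f0_hz=5e6, area_cm2=1.0):
    """Convert a QCM frequency shift (Hz) to deposited mass (grams) via the
    Sauerbrey equation; assumes a thin, rigid, uniformly distributed film."""
    rho_q = 2.648    # quartz density, g/cm^3
    mu_q = 2.947e11  # quartz shear modulus, g/(cm*s^2)
    # Mass sensitivity: delta_f = -c * delta_m, with c in Hz*cm^2/g.
    c = 2 * f0_hz**2 / (area_cm2 * math.sqrt(rho_q * mu_q))
    return -delta_f_hz / c  # a negative shift means mass was added

# Illustrative: a -10 Hz shift on a 5 MHz crystal, ~177 ng deposited.
m = sauerbrey_mass(-10.0)
print(f"{m * 1e9:.1f} ng")
```

    This sensitivity (roughly 17.7 ng/cm² per Hz for a 5 MHz crystal) is what lets small powder additions in a gas stream register as measurable frequency changes.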

  3. Final technical report. In-situ FT-IR monitoring of a black liquor recovery boiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James Markham; Joseph Cosgrove; David Marran

    1999-05-31

    This project developed and tested advanced Fourier transform infrared (FT-IR) instruments for process monitoring of black liquor recovery boilers. The state-of-the-art FT-IR instruments successfully operated in the harsh environment of a black liquor recovery boiler and provided a wealth of real-time process information. Concentrations of multiple gas species were simultaneously monitored in-situ across the combustion flow of the boiler and extractively at the stack. Sensitivity to changes of particulate fume and carryover levels in the process flow were also demonstrated. Boiler set-up and operation is a complex balance of conditions that influence the chemical and physical processes in the combustion flow. Operating parameters include black liquor flow rate, liquor temperature, nozzle pressure, primary air, secondary air, tertiary air, boiler excess oxygen and others. The in-process information provided by the FT-IR monitors can be used as a boiler control tool since species indicative of combustion efficiency (carbon monoxide, methane) and pollutant emissions (sulfur dioxide, hydrochloric acid and fume) were monitored in real-time and observed to fluctuate as operating conditions were varied. A high priority need of the U.S. industrial boiler market is improved measurement and control technology. The sensor technology demonstrated in this project is applicable to the needs of industry.

  4. How conduit models can be used to interpret volcano monitoring data

    NASA Astrophysics Data System (ADS)

    Thomas, M. E.; Neuberg, J. W.; Karl, S.; Collinson, A.; Pascal, K.

    2012-04-01

    During the last decade there have been major advances in the field of volcano monitoring, but to be able to take full advantage of these advances it is vital to link the monitoring data with the physical processes that give rise to the recorded signals. To obtain a better understanding of these physical processes it is necessary to understand the conditions of the system at depth. This can be achieved through numerical modelling. We present the results of conduit models representative of a silicic volcanic system and demonstrate how processes identified and interpreted from these models may manifest in the recorded monitoring data. Links are drawn to seismicity, deformation, and gas emissions. A key point is how these data complement each other, and through utilising conduit models we are able to interpret how these different data may be recorded in response to a particular process. This is an invaluable tool as it is far easier to draw firm conclusions on what is happening at a volcano if there are several different data sets that suggest the same processes are occurring. Some of these interpretations appear useful in forecasting potentially catastrophic changes in eruptive behaviour, such as a dome collapse leading to violent explosive behaviour, and the role of monitoring data in this capacity will also be addressed.

  5. Perspectives on Wellness Self-Monitoring Tools for Older Adults

    PubMed Central

    Huh, Jina; Le, Thai; Reeder, Blaine; Thompson, Hilaire J.; Demiris, George

    2013-01-01

    Purpose Our purpose was to understand different stakeholder perceptions about the use of self-monitoring tools, specifically in the area of older adults’ personal wellness. In conjunction with the advent of personal health records, tracking personal health using self-monitoring technologies shows promising patient support opportunities. While clinicians’ tools for monitoring of older adults have been explored, we know little about how older adults may self-monitor their wellness and health and how their health care providers would perceive such use. Methods We conducted three focus groups with health care providers (n=10) and four focus groups with community-dwelling older adults (n=31). Results Older adult participants found the concept of self-monitoring unfamiliar, which narrowed their interest in the use of wellness self-monitoring tools. On the other hand, health care provider participants showed open attitudes towards wellness monitoring tools for older adults and brainstormed about various stakeholders’ use cases. The two participant groups showed diverging perceptions in terms of perceived uses, stakeholder interests, information ownership and control, and sharing of wellness monitoring tools. Conclusions Our paper provides implications and solutions for how older adults’ wellness self-monitoring tools can enhance patient-health care provider interaction, patient education, and improvement in overall wellness. PMID:24041452

  6. Perspectives on wellness self-monitoring tools for older adults.

    PubMed

    Huh, Jina; Le, Thai; Reeder, Blaine; Thompson, Hilaire J; Demiris, George

    2013-11-01

    Our purpose was to understand different stakeholder perceptions about the use of self-monitoring tools, specifically in the area of older adults' personal wellness. In conjunction with the advent of personal health records, tracking personal health using self-monitoring technologies shows promising patient support opportunities. While clinicians' tools for monitoring of older adults have been explored, we know little about how older adults may self-monitor their wellness and health and how their health care providers would perceive such use. We conducted three focus groups with health care providers (n=10) and four focus groups with community-dwelling older adults (n=31). Older adult participants found the concept of self-monitoring unfamiliar, which narrowed their interest in the use of wellness self-monitoring tools. On the other hand, health care provider participants showed open attitudes toward wellness monitoring tools for older adults and brainstormed about various stakeholders' use cases. The two participant groups showed diverging perceptions in terms of perceived uses, stakeholder interests, information ownership and control, and sharing of wellness monitoring tools. Our paper provides implications and solutions for how older adults' wellness self-monitoring tools can enhance patient-health care provider interaction, patient education, and improvement in overall wellness. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. How chemistry supports cell biology: the chemical toolbox at your service.

    PubMed

    Wijdeven, Ruud H; Neefjes, Jacques; Ovaa, Huib

    2014-12-01

    Chemical biology is a young and rapidly developing scientific field. In this field, chemistry is inspired by biology to create various tools to monitor and modulate biochemical and cell biological processes. Chemical contributions such as small-molecule inhibitors and activity-based probes (ABPs) can provide new and unique insights into previously unexplored cellular processes. This review provides an overview of recent breakthroughs in chemical biology that are likely to have a significant impact on cell biology. We also discuss the application of several chemical tools in cell biology research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Tool wear modeling using abductive networks

    NASA Astrophysics Data System (ADS)

    Masory, Oren

    1992-09-01

    A tool wear model based on Abductive Networks, which consists of a network of `polynomial' nodes, is described. The model relates the cutting parameters, components of the cutting force, and machining time to flank wear. Thus, real-time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.
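    A single "polynomial node" of the kind the abstract describes can be sketched as a least-squares quadratic in two inputs, with candidate input pairs ranked by fit error, GMDH-style. The synthetic data, the assumed wear relation, and the candidate pairings below are invented for illustration; they are not the authors' model, network structure, or training data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: cutting speed, a cutting-force component, and
# machining time mapped to flank wear by an invented relation plus noise.
n = 200
speed = rng.uniform(100, 300, n)
force = rng.uniform(200, 800, n)
time = rng.uniform(1, 30, n)
wear = 1e-4 * force * time + 5e-4 * speed + rng.normal(0, 0.01, n)

def poly_node(x1, x2, y):
    """One polynomial node: y ~ a0 + a1*x1 + a2*x2 + a3*x1^2 + a4*x2^2 + a5*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, X @ coef

# Rank candidate input pairs by mean squared error, keeping the best node.
candidates = {("force", "time"): (force, time), ("speed", "force"): (speed, force)}
scores = {}
for name, (x1, x2) in candidates.items():
    coef, pred = poly_node(x1, x2, wear)
    scores[name] = float(np.mean((pred - wear) ** 2))
best = min(scores, key=scores.get)
print(best, scores[best])
```

    Because the invented wear relation contains a force-time product, the node whose cross term can capture it wins the ranking; in a full abductive network, surviving nodes' outputs would feed further layers of the same form.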

  9. Cluster Tool for In Situ Processing and Comprehensive Characterization of Thin Films at High Temperatures.

    PubMed

    Wenisch, Robert; Lungwitz, Frank; Hanf, Daniel; Heller, René; Zscharschuch, Jens; Hübner, René; von Borany, Johannes; Abrasonis, Gintautas; Gemming, Sibylle; Escobar-Galindo, Ramon; Krause, Matthias

    2018-06-13

    A new cluster tool for in situ real-time processing and depth-resolved compositional, structural and optical characterization of thin films at temperatures from -100 to 800 °C is described. The implemented techniques comprise magnetron sputtering, ion irradiation, Rutherford backscattering spectrometry, Raman spectroscopy, and spectroscopic ellipsometry. The capability of the cluster tool is demonstrated for a layer stack MgO/amorphous Si (∼60 nm)/Ag (∼30 nm), deposited at room temperature and crystallized with partial layer exchange by heating up to 650 °C. Its initial and final composition, stacking order, and structure were monitored in situ in real time and a reaction progress was defined as a function of time and temperature.

  10. MRI in the assessment and monitoring of multiple sclerosis: an update on best practice

    PubMed Central

    Kaunzner, Ulrike W.; Gauthier, Susan A.

    2017-01-01

    Magnetic resonance imaging (MRI) has developed into the most important tool for the diagnosis and monitoring of multiple sclerosis (MS). Its high sensitivity for the evaluation of inflammatory and neurodegenerative processes in the brain and spinal cord has made it the most commonly used technique for the evaluation of patients with MS. Moreover, MRI has become a powerful tool for treatment monitoring and safety assessment, as well as for the prognostication of disease progression. Clinically, the use of MRI has increased over the past couple of decades as a result of improved technology and increased availability that now extends well beyond academic centers. Consequently, there are numerous studies supporting the role of MRI in the management of patients with MS. The aim of this review is to summarize the latest insights into the utility of MRI in MS. PMID:28607577

  11. Biochip for Real-Time Monitoring of Hepatitis B Virus (HBV) by Combined Loop-Mediated Isothermal Amplification and Solution-Phase Electrochemical Detection

    NASA Astrophysics Data System (ADS)

    Tien, Bui Quang; Ngoc, Nguyen Thy; Loc, Nguyen Thai; Thu, Vu Thi; Lam, Tran Dai

    2017-06-01

    Accurate in situ diagnostic tests play a key role in patient management and control of most infectious diseases. To achieve this, use of handheld biochips that implement sample handling, sample analysis, and result readout together is an ideal approach. We present herein a fluid-handling biochip for real-time electrochemical monitoring of nucleic acid amplification based on loop-mediated isothermal amplification and real-time electrochemical detection on a microfluidic platform. Intercalation between amplifying DNA and free redox probe in solution phase was used to monitor the number of DNA copies. The whole diagnostic process is completed within 70 min. Our platform offers a fast and easy tool for quantification of viral pathogens in shorter time and with limited risk of all potential forms of cross-contamination. Such diagnostic tools have potential to make a huge difference to the lives of millions of people worldwide.

  12. DynAMo: A Modular Platform for Monitoring Process, Outcome, and Algorithm-Based Treatment Planning in Psychotherapy

    PubMed Central

    Laireiter, Anton Rupert

    2017-01-01

    Background In recent years, the assessment of mental disorders has become more and more personalized. Modern advancements such as Internet-enabled mobile phones and increased computing capacity make it possible to tap sources of information that have long been unavailable to mental health practitioners. Objective Software packages that combine algorithm-based treatment planning, process monitoring, and outcome monitoring are scarce. The objective of this study was to assess whether the DynAMo Web application can fill this gap by providing a software solution that can be used by both researchers to conduct state-of-the-art psychotherapy process research and clinicians to plan treatments and monitor psychotherapeutic processes. Methods In this paper, we report on the current state of a Web application that can be used for assessing the temporal structure of mental disorders using information on their temporal and synchronous associations. A treatment planning algorithm automatically interprets the data and delivers priority scores of symptoms to practitioners. The application is also capable of monitoring psychotherapeutic processes during therapy and of monitoring treatment outcomes. This application was developed using the R programming language (R Core Team, Vienna) and the Shiny Web application framework (RStudio, Inc, Boston). It is made entirely from open-source software packages and thus is easily extensible. Results The capabilities of the proposed application are demonstrated. Case illustrations are provided to exemplify its usefulness in clinical practice. Conclusions With the broad availability of Internet-enabled mobile phones and similar devices, collecting data on psychopathology and psychotherapeutic processes has become easier than ever. The proposed application is a valuable tool for capturing, processing, and visualizing these data. 
The combination of dynamic assessment and process- and outcome monitoring has the potential to improve the efficacy and effectiveness of psychotherapy. PMID:28729233

  13. Patient preference to use a questionnaire varies according to attributes.

    PubMed

    Kim, Na Yae; Richardson, Lyndsay; He, Weilin; Jones, Glenn

    2011-08-01

    Health care professionals may assume questionnaires are burdensome to patients, and this limits their use in clinical settings and promotes simplification. However, patient adherence may improve by optimizing questionnaire attributes and contexts. This cross-sectional survey used Contingent Valuation methods to directly elicit patient preference for conventional monitoring of symptoms versus adding a tool to monitoring. Under explicit consideration was the 10-question Edmonton Symptom Assessment System (ESAS). In the questionnaire, attributes of ESAS were sequentially altered to try to force preference reversal. A separate group of participants completed both questionnaire and interviews to explore questionnaire reliability, and extend validity. Overall, 24 of 43 participants preferred using ESAS. The attributes most important to preference were frequency, specificity, and complexity. Where preference is initially against ESAS, it may reverse by simplifying the tool and its administrative processes. Interviews in 10 additional participants supported reproducibility and validity of the questionnaire method. Preference for using tools increases when tools are made relevant and used more appropriately. Questionnaires completed by patients as screening tools or aids to communication may be under-utilized. Optimization of ESAS and similar tools may be guided by empirical findings, including those obtained from Contingent Valuation methodologies. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  14. Tools for automated acoustic monitoring within the R package monitoR

    USGS Publications Warehouse

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
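    The spectrogram cross-correlation detector monitoR offers can be illustrated in miniature, here in Python rather than R and not using monitoR's API: a template clipped from a known call is slid along a spectrogram and a normalized correlation score is computed at each time offset, with detections declared at score peaks. All data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spectrogram (frequency bins x time frames) with a synthetic 'call'
# pattern inserted at time frame 8; everything here is invented.
spec = rng.normal(0.0, 0.1, (8, 20))
call = np.array([[1.0, 0.0, 1.0, 0.0, 1.0],
                 [0.0, 1.0, 0.0, 1.0, 0.0],
                 [1.0, 0.0, 1.0, 0.0, 1.0]])
spec[2:5, 8:13] += call
template = call  # in practice the template is clipped from a known recording

def xcorr_scores(spec, template, rows=slice(2, 5)):
    """Normalized cross-correlation of the template at each time offset."""
    h, w = template.shape
    t = (template - template.mean()) / template.std()
    scores = []
    for j in range(spec.shape[1] - w + 1):
        win = spec[rows, j:j + w]
        s = (win - win.mean()) / (win.std() + 1e-12)
        scores.append(float((t * s).mean()))
    return scores

scores = xcorr_scores(spec, template)
peak = int(np.argmax(scores))
print(peak)  # time offset of the best match
```

    Thresholding these scores (monitoR calls this the score cutoff) turns the correlation trace into discrete detections; the binary point matching alternative scores template cells directly instead of correlating.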

  15. Bridging the gap between finance and clinical operations with activity-based cost management.

    PubMed

    Storfjell, J L; Jessup, S

    1996-12-01

    Activity-based cost management (ABCM) is an exciting management tool that links financial information with operations. By determining the costs of specific activities and processes, nurse managers can determine the true costs of services more accurately than with traditional cost accounting methods, and can then target processes for improvement and monitor them for change. The authors describe the ABCM process applied to nursing management situations.

  16. Quantum cascade laser based monitoring of CF2 radical concentration as a diagnostic tool of dielectric etching plasma processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hübner, M.; Lang, N.; Röpcke, J.

    2015-01-19

    Dielectric etching plasma processes for modern interlevel dielectrics are becoming more and more complex with the introduction of new ultra low-k dielectrics. One challenge is the minimization of sidewall damage while etching ultra low-k porous SiCOH with fluorocarbon plasmas. Optimization of this process requires a deeper understanding of the concentration of the CF2 radical, which acts as a precursor in the polymerization of the etch sample surfaces. In an industrial dielectric etching plasma reactor, the CF2 radical was measured in situ using a continuous wave quantum cascade laser (cw-QCL) around 1106.2 cm⁻¹. We measured Doppler-resolved ro-vibrational absorption lines and determined absolute densities using transitions in the ν3 fundamental band of CF2 with the aid of an improved simulation of the line strengths. We found that the CF2 radical concentration during the etching plasma process directly correlates with the layer structure of the etched wafer. Hence, this correlation can serve as a diagnostic tool for dielectric etching plasma processes. Applying QCL-based absorption spectroscopy opens up the way for advanced process monitoring and etch control in semiconductor manufacturing.
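    The step from a measured absorption line to an absolute density rests on the Beer-Lambert law: for an isolated line, the column-averaged number density is the integrated absorbance divided by the product of line strength and absorption path length. The numbers in the sketch below are illustrative only, not values from the paper.

```python
def number_density(integrated_absorbance_cm1, line_strength_cm_per_molec, path_cm):
    """Beer-Lambert for an isolated line: n = A_int / (S * L), giving the
    column-averaged number density in molecules/cm^3."""
    return integrated_absorbance_cm1 / (line_strength_cm_per_molec * path_cm)

# Illustrative numbers (not from the paper): a weak line, 50 cm path.
n = number_density(integrated_absorbance_cm1=1.0e-3,
                   line_strength_cm_per_molec=2.0e-19,
                   path_cm=50.0)
print(f"{n:.2e} molecules/cm^3")
```

    The "improved simulation of the line strengths" mentioned in the abstract supplies S for each transition; with S and the path length fixed, the measured integrated absorbance maps linearly to density, which is what makes real-time monitoring straightforward.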

  17. Integrating reliability and maintainability into a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Phillips, Clifton B.; Peterson, Robert R.

    1993-02-01

    This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry thought the study was important and provided the university access to innovative tools under a cooperative agreement. The current capability of reliability and maintainability tools, and how they fit into the design process, is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allows them to monitor and track performance while it is in operation.

  18. Guidance Protocol: Application of Nucleic Acid-Based Tools for Monitoring Monitored Natural Attenuation (MNA), Biostimulation, and Bioaugmentation at Chlorinated Solvent Sites

    DTIC Science & Technology

    2011-02-01

    Reductive dechlorination is a promising process for biodegradation of chlorinated solvents. The successful field evaluation and implementation of the...population. These specialized bacteria use the chlorinated ethenes as electron acceptors and gain energy for growth from the reductive...This guidance protocol addresses the use of MBTs to quantitatively assess the Dhc population at chlorinated ethene sites and aims at providing

  19. Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Horsburgh, J. S.

    2014-12-01

    Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. 
The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban Transitions (GAMUT). The variety of environmental sensors and the multi-watershed, multi-institutional nature of the network necessitate a well-planned and efficient workflow for acquiring, managing, and sharing sensor data, which should be useful for similar large-scale and long-term networks.
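The quality-control post-processing step described above can be sketched as a minimal flagging pass over a sensor series. The thresholds, function name, and data below are hypothetical illustrations, not the actual iUTAH Python tooling:

```python
def qc_flags(values, lo, hi, max_step):
    """Return one QC flag per reading: 'out_of_range' for values outside
    the physically plausible [lo, hi] band, 'spike' for abrupt jumps
    larger than max_step between consecutive readings, else 'ok'."""
    flags = []
    prev = None
    for v in values:
        if v < lo or v > hi:
            flags.append("out_of_range")
        elif prev is not None and abs(v - prev) > max_step:
            flags.append("spike")
        else:
            flags.append("ok")
        prev = v
    return flags

# Example: water temperature readings (deg C) at 15-minute intervals
temps = [12.1, 12.3, 12.2, 45.0, 12.4, -5.0]
print(qc_flags(temps, lo=0.0, hi=35.0, max_step=5.0))
```

In a production workflow, flagged readings would typically be held for review rather than deleted, preserving the raw record alongside the derived data product.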

  20. The Effect of Process Parameters and Tool Geometry on Thermal Field Development and Weld Formation in Friction Stir Welding of the Alloys AZ31 and AZ61

    NASA Astrophysics Data System (ADS)

    Zettler, R.; Blanco, A. C.; dos Santos, J. F.; Marya, S.

    An increase in the use of magnesium (Mg) in the car manufacturing industry has raised questions concerning its weldability. Friction Stir Welding (FSW) has the advantage of achieving metallic bonding below that of the melting point of the base material thus avoiding many of the metallurgical problems associated with the solidification process. The present study presents the results of a development program carried out to investigate the response of Mg alloys AZ31 and AZ61 to different FSW tool geometries and process parameters. Temperature development across the weld zone was monitored and the produced welds have been subjected to microstructural analysis and mechanical testing. Defect free welds have been produced with optimised FSW-tool and parameters. The micro structure of the welded joint resulted in similar ductility and hardness levels as compared to that of the base material. The results also demonstrated that tool geometry plays a fundamental role in the response of the investigated alloys to the FSW process.

  1. Research and guidelines for implementing Fatigue Risk Management Systems for the French regional airlines.

    PubMed

    Cabon, Philippe; Deharvengt, Stephane; Grau, Jean Yves; Maille, Nicolas; Berechet, Ion; Mollard, Régis

    2012-03-01

This paper describes research that aims to provide the overall scientific basis for implementation of a Fatigue Risk Management System (FRMS) for French regional airlines. The current research has evaluated the use of different tools and indicators that would be relevant candidates for integration into the FRMS. For the Fatigue Risk Management component, results show that biomathematical models of fatigue are useful tools to help an airline prevent fatigue related to roster design and to manage aircrew planning. The Fatigue Safety Assurance component includes two monitoring processes that have been evaluated during this research: systematic monitoring and focused monitoring. Systematic monitoring consists of the analysis of existing safety indicators such as Air Safety Reports (ASR) and Flight Data Monitoring (FDM). Results show a significant relationship between hours of work and the frequency of ASR. Results from the FDM analysis show that some events are significantly related to the fatigue risk associated with hours of work. Focused monitoring includes a website survey and specific in-flight observations and data collection. Sleep and fatigue measurements were collected from 115 aircrews over 12-day periods (including rest periods). Before morning duties, results show a significant sleep reduction of up to 40% of the aircrews' usual sleep needs, leading to a clear increase in fatigue during flights. From these results, specific guidelines are developed to help airlines implement the FRMS and to help airworthiness authorities oversee the implementation of the FRMS process. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Improving marine disease surveillance through sea temperature monitoring, outlooks and projections

    PubMed Central

    Maynard, Jeffrey; van Hooidonk, Ruben; Harvell, C. Drew; Eakin, C. Mark; Liu, Gang; Willis, Bette L.; Williams, Gareth J.; Dobson, Andrew; Heron, Scott F.; Glenn, Robert; Reardon, Kathleen; Shields, Jeffrey D.

    2016-01-01

    To forecast marine disease outbreaks as oceans warm requires new environmental surveillance tools. We describe an iterative process for developing these tools that combines research, development and deployment for suitable systems. The first step is to identify candidate host–pathogen systems. The 24 candidate systems we identified include sponges, corals, oysters, crustaceans, sea stars, fishes and sea grasses (among others). To illustrate the other steps, we present a case study of epizootic shell disease (ESD) in the American lobster. Increasing prevalence of ESD is a contributing factor to lobster fishery collapse in southern New England (SNE), raising concerns that disease prevalence will increase in the northern Gulf of Maine under climate change. The lowest maximum bottom temperature associated with ESD prevalence in SNE is 12°C. Our seasonal outlook for 2015 and long-term projections show bottom temperatures greater than or equal to 12°C may occur in this and coming years in the coastal bays of Maine. The tools presented will allow managers to target efforts to monitor the effects of ESD on fishery sustainability and will be iteratively refined. The approach and case example highlight that temperature-based surveillance tools can inform research, monitoring and management of emerging and continuing marine disease threats. PMID:26880840

  3. Improving marine disease surveillance through sea temperature monitoring, outlooks and projections.

    PubMed

    Maynard, Jeffrey; van Hooidonk, Ruben; Harvell, C Drew; Eakin, C Mark; Liu, Gang; Willis, Bette L; Williams, Gareth J; Groner, Maya L; Dobson, Andrew; Heron, Scott F; Glenn, Robert; Reardon, Kathleen; Shields, Jeffrey D

    2016-03-05

    To forecast marine disease outbreaks as oceans warm requires new environmental surveillance tools. We describe an iterative process for developing these tools that combines research, development and deployment for suitable systems. The first step is to identify candidate host-pathogen systems. The 24 candidate systems we identified include sponges, corals, oysters, crustaceans, sea stars, fishes and sea grasses (among others). To illustrate the other steps, we present a case study of epizootic shell disease (ESD) in the American lobster. Increasing prevalence of ESD is a contributing factor to lobster fishery collapse in southern New England (SNE), raising concerns that disease prevalence will increase in the northern Gulf of Maine under climate change. The lowest maximum bottom temperature associated with ESD prevalence in SNE is 12 °C. Our seasonal outlook for 2015 and long-term projections show bottom temperatures greater than or equal to 12 °C may occur in this and coming years in the coastal bays of Maine. The tools presented will allow managers to target efforts to monitor the effects of ESD on fishery sustainability and will be iteratively refined. The approach and case example highlight that temperature-based surveillance tools can inform research, monitoring and management of emerging and continuing marine disease threats. © 2016 The Authors.

  4. Generic Raman-based calibration models enabling real-time monitoring of cell culture bioreactors.

    PubMed

    Mehdizadeh, Hamidreza; Lauri, David; Karry, Krizia M; Moshgbar, Mojgan; Procopio-Melino, Renee; Drapeau, Denis

    2015-01-01

    Raman-based multivariate calibration models have been developed for real-time in situ monitoring of multiple process parameters within cell culture bioreactors. Developed models are generic, in the sense that they are applicable to various products, media, and cell lines based on Chinese Hamster Ovarian (CHO) host cells, and are scalable to large pilot and manufacturing scales. Several batches using different CHO-based cell lines and corresponding proprietary media and process conditions have been used to generate calibration datasets, and models have been validated using independent datasets from separate batch runs. All models have been validated to be generic and capable of predicting process parameters with acceptable accuracy. The developed models allow monitoring multiple key bioprocess metabolic variables, and hence can be utilized as an important enabling tool for Quality by Design approaches which are strongly supported by the U.S. Food and Drug Administration. © 2015 American Institute of Chemical Engineers.
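The validation scheme described above, fitting a calibration model on one set of batches and checking it against independent batch runs, can be sketched with a simple least-squares calibration standing in for the Raman multivariate models. All data, dimensions, and names below are synthetic assumptions, not the authors' datasets:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_calibration(spectra, conc):
    """Least-squares calibration: concentration as an affine function of
    channel intensities (a minimal stand-in for the multivariate models)."""
    A = np.column_stack([spectra, np.ones(len(spectra))])
    coef, *_ = np.linalg.lstsq(A, conc, rcond=None)
    return coef

def predict(coef, spectra):
    return np.column_stack([spectra, np.ones(len(spectra))]) @ coef

# Batch A (calibration set) and batch B (independent validation run)
true_b = np.array([2.0, -1.0, 0.5])
Xa = rng.normal(size=(30, 3)); ya = Xa @ true_b + 0.01 * rng.normal(size=30)
Xb = rng.normal(size=(10, 3)); yb = Xb @ true_b + 0.01 * rng.normal(size=10)

coef = fit_calibration(Xa, ya)
rmsep = float(np.sqrt(np.mean((predict(coef, Xb) - yb) ** 2)))
print(rmsep < 0.05)  # error on the independent batch stays near the noise floor
```

The key point mirrored here is that the model is judged on batches it never saw, which is what supports the claim of a generic, transferable calibration.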

  5. Actualities and Development of Heavy-Duty CNC Machine Tool Thermal Error Monitoring Technology

    NASA Astrophysics Data System (ADS)

    Zhou, Zu-De; Gui, Lin; Tan, Yue-Gang; Liu, Ming-Yao; Liu, Yi; Li, Rui-Ya

    2017-09-01

Thermal error monitoring technology is the key technological support for solving the thermal error problem of heavy-duty CNC (computer numerical control) machine tools. Many review articles introduce thermal error research on CNC machine tools, but they mainly focus on thermal issues in small and medium-sized machines and seldom cover thermal error monitoring technologies. This paper gives an overview of research on the thermal error of CNC machine tools and emphasizes the study of thermal error in heavy-duty CNC machine tools in three areas: the causes of thermal error in heavy-duty machines, temperature monitoring technology, and thermal deformation monitoring technology. A new optical measurement technology, fiber Bragg grating (FBG) distributed sensing, for heavy-duty CNC machine tools is introduced in detail; it forms an intelligent sensing and monitoring system for such machines. This paper fills a gap in the review literature, guiding development in this industrial field and opening up new areas of research on heavy-duty CNC machine tool thermal error.

  6. Thermographic Assessment of the HAZ Properties and Structure of Thermomechanically Treated Steel

    NASA Astrophysics Data System (ADS)

    Górka, Jacek; Janicki, Damian; Fidali, Marek; Jamrozik, Wojciech

    2017-12-01

Thermomechanically processed steels combine high mechanical properties with good weldability, a combination that makes them attractive for various industrial applications. When welded joints are created, a specified amount of heat is introduced into the welding area and a so-called heat-affected zone (HAZ) is formed. The key issue is to reduce the width of the HAZ, because material properties in the HAZ are worse than in the base material. In this paper, thermographic measurement of HAZ temperatures is presented as a potential tool for assuring weld quality through process monitoring and control. The main issue addressed was precise temperature measurement under the emissivity variation that occurs during a welding thermal cycle. A model of emissivity changes was elaborated and successfully applied. Additionally, the material in the HAZ was tested to reveal its properties and to connect changes in those properties with heating parameters. The obtained results prove that a correctly modeled emissivity allows temperature measurement to serve as a valuable tool for welding process monitoring.

  7. A Management Tool for Assessing Aquaculture Environmental Impacts in Chilean Patagonian Fjords: Integrating Hydrodynamic and Pellets Dispersion Models

    NASA Astrophysics Data System (ADS)

    Tironi, Antonio; Marin, Víctor H.; Campuzano, Francisco J.

    2010-05-01

This article introduces a management tool for salmon farming, focused on the local sustainability of salmon aquaculture in the Aysen Fjord, Chilean Patagonia. Based on Integrated Coastal Zone Management (ICZM) principles, the tool combines a large three-level nested hydrodynamic model, a particle tracking module, and a GIS application into an assessment tool for the dispersal of particulate waste from salmon farming activities. The model offers an open-source alternative for particulate waste modeling and evaluation, providing valuable information to local decision makers in the process of siting new facilities and monitoring stations.

  8. Monitoring the hatch time of individual chicken embryos.

    PubMed

    Romanini, C E B; Exadaktylos, V; Tong, Q; McGonnel, I; Demmers, T G M; Bergoug, H; Eterradossi, N; Roulston, N; Garain, P; Bahr, C; Berckmans, D

    2013-02-01

This study investigated variations in eggshell temperature (T(egg)) during the hatching process of broiler eggs. Temperature sensors monitored embryo temperature by registering T(egg) every minute. Measurements carried out on a sample of 40 focal eggs revealed temperature drops of 2 to 6°C during the last 3 d of incubation. Video cameras recorded the hatching process and served as the gold-standard reference for manually labeling the hatch times of chicks. Comparison between T(egg) drops and individual hatch times revealed time synchronization with a 99% correlation coefficient and an absolute average time difference of up to 25 min. Our findings suggest that attaching temperature sensors to eggshells is a precise tool for monitoring the hatch time of individual chicks. Individual hatch monitoring registers the biological age of chicks and provides an accurate and reliable means to count hatching results and manage the hatch window.
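The detection principle, a sustained drop in eggshell temperature marking the hatch moment, can be sketched as a simple sliding comparison. The threshold, window, and data are illustrative assumptions, not the study's actual algorithm:

```python
def detect_hatch(temps_c, drop_c=2.0, window=30):
    """Return the index (one sample per minute) of the first reading that
    sits at least drop_c degC below the reading `window` samples earlier,
    taken here as the hatch time; None if no such drop occurs."""
    for i in range(window, len(temps_c)):
        if temps_c[i - window] - temps_c[i] >= drop_c:
            return i
    return None

# Synthetic T(egg) trace: stable incubation temperature, then a drop at hatch
temps = [37.8] * 60 + [35.0] * 60
print(detect_hatch(temps))  # → 60
```

Comparing against a reading a full window earlier, rather than the immediately preceding sample, makes the detector robust to minute-to-minute sensor noise while still catching the multi-degree hatch drop reported in the abstract.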

  9. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition.

    PubMed

    Caggiano, Alessandra

    2018-03-09

Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interfaces, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring tool condition during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA identified a smaller number of features (k = 2), the principal component scores, obtained through linear projection of the original d features into a new space of reduced dimensionality k = 2 that is sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values.
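The dimensionality-reduction step can be illustrated with PCA computed via SVD, followed by a least-squares fit standing in for the neural network. The data are synthetic (two hidden wear-related factors behind d = 10 features), not the paper's signals, and the linear model is an admitted simplification of the ANN:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for d sensorial features extracted from force,
# acoustic emission and vibration signals over n turning passes
d, n = 10, 50
latent = rng.normal(size=(n, 2))                  # two underlying factors
X = latent @ rng.normal(size=(2, d)) + 0.01 * rng.normal(size=(n, d))
wear = 0.3 * latent[:, 0] - 0.1 * latent[:, 1]    # hypothetical VBmax-like target

# PCA via SVD of the mean-centred feature matrix
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                            # k = 2 principal component scores

# Least-squares fit from the two scores to wear (linear stand-in for the ANN)
A = np.column_stack([scores, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, wear, rcond=None)
corr = float(np.corrcoef(A @ coef, wear)[0, 1])
print(corr > 0.95)  # the two scores capture the wear-relevant variance
```

Because the synthetic features are driven by only two factors, the k = 2 scores retain essentially all predictive information, mirroring the paper's finding that two components sufficed.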

  10. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition

    PubMed Central

    2018-01-01

Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interfaces, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring tool condition during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA identified a smaller number of features (k = 2), the principal component scores, obtained through linear projection of the original d features into a new space of reduced dimensionality k = 2 that is sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values. PMID:29522443

  11. Unified Geophysical Cloud Platform (UGCP) for Seismic Monitoring and other Geophysical Applications.

    NASA Astrophysics Data System (ADS)

    Synytsky, R.; Starovoit, Y. O.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.

    2016-12-01

We present the Unified Geophysical Cloud Platform (UGCP, or UniGeoCloud), an innovative approach to geophysical data processing in the cloud with the ability to run any type of data-processing software in an isolated environment within a single cloud platform. We have developed a simple, quick installation method for several widely known open-source seismic packages (SeisComp3, Earthworm, Geotool, MSNoise) that requires no knowledge of system administration, configuration, or OS compatibility issues, sparing users the time otherwise lost to system configuration work. Installation is reduced to a mouse click on the selected software package in the cloud marketplace. The main objective of the developed capability was a set of software tools with which users can quickly design and install their own highly reliable, highly available virtual IT infrastructure for processing seismic (and, in future, other geophysical) data for either research or monitoring purposes. These tools provide access to any seismic station whose data are openly available over IP from networks affiliated with different institutions and organizations. Users can also set up their own network by selecting either regionally deployed stations or a worldwide network from the global map. Processing software, products, and research results can be monitored from anywhere on a variety of devices, from desktop computers to mobile gadgets. Current efforts of the development team are directed at the scalability, reliability, and sustainability (SRS) of the proposed solutions, allowing any user to run applications with confidence that no data will be lost and no monitoring or research software component will fail.
The system is also suitable for quick rollout of the NDC-in-Box software package developed for State Signatories and aimed at promoting the processing of data collected by the IMS network.

  12. Coastal environment: historical and continuous monitoring

    NASA Astrophysics Data System (ADS)

    Ivaldi, Roberta; Surace, Luciano

    2010-05-01

Monitoring is a tool that provides essential data for studying process dynamics. The formation and transformation of the coastal environment involve physical, chemical, geological and biological processes, so knowledge of littoral systems and the marine seafloor requires a multidisciplinary approach. Because phenomena are observed over short periods of time, high-quality data acquired with high accuracy and suitable processing procedures are required. This knowledge has increased considerably over the past 50 years, closely following significant progress in methods of investigation at sea and in the laboratory; in addition, seafloor exploration is deeply rooted in history. The coastal zone is a sector under particular scrutiny because of its position as a transition between continental and marine environments, where natural and human actions are closely connected. These activities are important over time for developing the technologies suited to this knowledge and for improving tools for protection, prevention, intervention and management. In this context the Istituto Idrografico della Marina (Hydrographic Institute of the Italian Navy, I.I.M.) is a precursor: since its foundation in 1872 it has contributed to monitoring activities related to charting and navigation, including hydrologic surveying, seafloor measurements and, in consequence, the landward limit, the shoreline. The coastal area is the sector most subject to change, from both natural and socio-economic causes. It is the most dynamic environment, shaped by marine (waves and currents) and continental (river and ice) actions, and its intended use changes continuously with the growth of industry, commerce and recreation and the need for new supporting structures.
The coast has more recently taken on a growing value, while processes such as erosion and retreat are evidence of a transformation that undermines and impoverishes the existing system. The constant monitoring activities of I.I.M. include the production of paper nautical charts and electronic navigational charts (ENC), together with other specialised nautical charts and publications to aid safe navigation; the conversion of the oldest data from analogue to digital form; and the careful preservation in its archives of all hydrographic survey information. This process follows internationally recognized standards, allowing continuous improvement of all acquired data as more advanced tools and technologies keep the cartography constantly updated both in content and in restitution. In this research the archive infrastructure is used for hydrographic data collection and processing to follow the secular variation and evolution of the shoreline and coastal seafloor. A key element in monitoring these changes, for both the sub-aerial and the submarine beach, is the determination of the shoreline and its restitution as the coastline, which already includes the definition of its complexity, over a sufficiently long time period. We present some examples of Italian littoral evolution with evident changes in coastal morphology in support of present monitoring.

  13. Manufacturing Challenges Associated with the Use of Metal Matrix Composites in Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Prater, Tracie

    2014-01-01

Metal Matrix Composites (MMCs) consist of a metal alloy reinforced with ceramic particles or fibers. These materials possess a very high strength to weight ratio, good resistance to impact and wear, and a number of other properties which make them attractive for use in aerospace and defense applications. MMCs have found use in the space shuttle orbiter's structural tubing, the Hubble Space Telescope's antenna mast, control surfaces and propulsion systems for aircraft, and tank armors. The size of MMC components is severely limited by difficulties encountered in joining these materials using fusion welding. Melting of the material results in formation of an undesirable phase (formed when molten aluminum reacts with the reinforcement) which leaves a strength-depleted region along the joint line. Friction Stir Welding (FSW) is a relatively nascent solid state joining technique developed at The Welding Institute (TWI) in 1991. The process was first used at NASA to weld the super lightweight external tank for the Space Shuttle. Today FSW is used to join structural components of the Delta IV, Atlas V, and Falcon 9 rockets as well as NASA's Orion Crew Exploration Vehicle and Space Launch System. A current focus of FSW research is to extend the process to new materials, such as MMCs, which are difficult to weld using conventional fusion techniques. Since Friction Stir Welding occurs below the melting point of the workpiece material, this deleterious phase is absent in FSW-ed MMC joints. FSW of MMCs is, however, plagued by rapid wear of the welding tool, a consequence of the large discrepancy in hardness between the steel tool and the reinforcement material. This chapter summarizes the challenges encountered when joining MMCs to themselves or to other materials in structures.
Specific attention is paid to the influence of process variables in Friction Stir Welding on tool wear: the work characterizes the effect of process parameters (spindle speed, traverse rate, and length of joint) on the wear process. A phenomenological model of the wear process was constructed based on the rotating plug model of Friction Stir Welding. The effectiveness of harder tool materials (such as Tungsten Carbide, high speed steel, and tools with diamond coatings) in combating abrasive wear is also explored. In-process force, torque, and vibration signals are analyzed to assess the feasibility of in situ monitoring of tool shape changes as a result of wear (an advancement which would eliminate the need for off-line evaluation of tool condition during joining). Monitoring, controlling, and reducing tool wear in FSW of MMCs is essential to the implementation of these materials in structures (such as launch vehicles) where they would be of maximum benefit. The work presented here is extendable to machining of MMCs, where wear of the tool is also a limiting factor.

  14. Physics-based process model approach for detecting discontinuity during friction stir welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrivastava, Amber; Pfefferkorn, Frank E.; Duffie, Neil A.

    2015-02-12

The goal of this work is to develop a method for detecting the creation of discontinuities during friction stir welding. This in situ weld monitoring method could significantly reduce the need for post-process inspection. A process force model and a discontinuity force model were created based on the state-of-the-art understanding of flow around a friction stir welding (FSW) tool. These models are used to predict the FSW forces and the size of discontinuities formed in the weld. Friction stir welds with and without discontinuities were created, and the differences in force dynamics were observed. In this paper, discontinuities were generated by reducing the tool rotation frequency and increasing the tool traverse speed in order to create "cold" welds. Experimental force data for welds with and without discontinuities compared favorably with the predicted forces. The model currently overpredicts the discontinuity size.
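The comparison between model-predicted and measured forces can be sketched as a residual threshold check. This is a generic illustration with invented numbers, not the authors' force models:

```python
def flag_discontinuities(measured, predicted, tol):
    """Return sample indices where the measured FSW process force deviates
    from the model prediction by more than tol, flagging a possible
    discontinuity (e.g. a void left by insufficient material flow)."""
    return [i for i, (m, p) in enumerate(zip(measured, predicted))
            if abs(m - p) > tol]

# Illustrative force traces (N): a dip mid-weld where a void might form
predicted = [2000.0] * 8
measured = [2010.0, 1995.0, 2005.0, 1700.0, 1680.0, 1990.0, 2008.0, 1997.0]
print(flag_discontinuities(measured, predicted, tol=100.0))  # → [3, 4]
```

In the paper's framing, the magnitude of the residual would further feed the discontinuity force model to estimate defect size, a step omitted from this sketch.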

  15. Online monitoring of fermentation processes via non-invasive low-field NMR.

    PubMed

    Kreyenschulte, Dirk; Paciok, Eva; Regestein, Lars; Blümich, Bernhard; Büchs, Jochen

    2015-09-01

For the development of biotechnological processes in academia as well as in industry, new techniques are required that enable online monitoring for process characterization and control. Nuclear magnetic resonance (NMR) spectroscopy is a promising analytical tool that has already found broad application in offline process analysis. Its use for online monitoring, however, is often constrained by the high complexity of custom-made NMR bioreactors and the considerable cost of high-field NMR instruments (>US$200,000). Therefore, low-field ¹H NMR was investigated in this study in a bypass system for real-time observation of fermentation processes. The new technique was validated with two microbial systems. For the yeast Hansenula polymorpha, glycerol consumption could be assessed accurately despite the presence of high amounts of complex constituents in the medium. During cultivation of the fungal strain Ustilago maydis, which is accompanied by the formation of several by-products, the concentrations of glucose and itaconic acid and the relative amount of glycolipids could be quantified. While low-field spectra have reduced spectral resolution compared to high-field NMR, the compact design combined with the high temporal resolution (15 s to 8 min) of spectra acquisition allowed online monitoring of the respective processes. Both applications clearly demonstrate that the investigated technique is well suited for reaction monitoring in opaque media while at the same time being highly robust and chemically specific. It can thus be concluded that low-field NMR spectroscopy has great potential for non-invasive online monitoring of biotechnological processes at research and practical industrial scales. © 2015 Wiley Periodicals, Inc.

  16. Beer fermentation: monitoring of process parameters by FT-NIR and multivariate data analysis.

    PubMed

    Grassi, Silvia; Amigo, José Manuel; Lyndgaard, Christian Bøge; Foschino, Roberto; Casiraghi, Ernestina

    2014-07-15

This work investigates the capability of Fourier-transform near-infrared (FT-NIR) spectroscopy to monitor and assess process parameters in beer fermentation under different operative conditions. For this purpose, the fermentation of wort with two different yeast strains and at different temperatures was monitored for nine days by FT-NIR. To correlate the collected spectra with °Brix, pH and biomass, different multivariate data methodologies were applied. Principal component analysis (PCA), partial least squares (PLS) and locally weighted regression (LWR) were used to assess the relationship between FT-NIR spectra and the abovementioned process parameters that define beer fermentation. The accuracy and robustness of the obtained results clearly show the suitability of FT-NIR spectroscopy, combined with multivariate data analysis, as a quality control tool in the beer fermentation process. In particular, FT-NIR spectroscopy combined with LWR proved to be a suitable quantitative method for implementation in beer production. Copyright © 2014 Elsevier Ltd. All rights reserved.
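Of the chemometric methods listed, locally weighted regression is the least standard, so a minimal sketch may help: each calibration spectrum is weighted by its similarity to the query spectrum before solving a weighted least-squares fit. The three-channel synthetic "spectra" and the Gaussian similarity kernel are assumptions for illustration, not the paper's exact setup:

```python
import numpy as np

def lwr_predict(X, y, x0, tau=1.0):
    """Locally weighted regression: weight each calibration spectrum by its
    similarity to the query spectrum x0, then solve the weighted
    least-squares normal equations A'WA b = A'Wy for an affine model."""
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * tau ** 2))
    A = np.column_stack([X, np.ones(len(X))])
    Aw = A * w[:, None]                    # row-weighted design matrix (WA)
    beta = np.linalg.solve(A.T @ Aw, A.T @ (w * y))
    return float(np.append(x0, 1.0) @ beta)

rng = np.random.default_rng(0)
X = rng.uniform(size=(20, 3))              # 20 calibration "spectra"
y = X @ np.array([1.0, -0.5, 2.0]) + 0.1   # e.g. degrees Brix, exactly linear
x0 = np.array([0.5, 0.5, 0.5])
print(round(lwr_predict(X, y, x0), 3))     # → 1.35
```

The local weighting is what lets LWR track the mildly nonlinear spectrum-to-parameter relationships that arise over a nine-day fermentation, where a single global linear model can drift.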

  17. Monitoring the process of pulmonary melanoma metastasis using large area and label-free nonlinear optical microscopy

    NASA Astrophysics Data System (ADS)

    Hua, Daozhu; Qi, Shuhong; Li, Hui; Zhang, Zhihong; Fu, Ling

    2012-06-01

    We performed large area nonlinear optical microscopy (NOM) for label-free monitoring of the process of pulmonary melanoma metastasis ex vivo with subcellular resolution in C57BL/6 mice. Multiphoton autofluorescence (MAF) and second harmonic generation (SHG) images of lung tissue were obtained in a volume of ~2.2 mm×2.2 mm×30 μm. Qualitative differences in morphologic features and quantitative measurements of pathological lung tissues at different time points were characterized. We find that, combined with morphological features, the quantitative parameters, such as the intensity ratios of MAF and SHG between pathological and normal tissue and the MAF-to-SHG index versus depth, clearly show the tissue physiological changes during the process of pulmonary melanoma metastasis. Our results demonstrate that large area NOM succeeds in monitoring the process of pulmonary melanoma metastasis and can provide a powerful tool for research in tumor pathophysiology and therapy evaluation.

  18. Development of materials for the rapid manufacture of die cast tooling

    NASA Astrophysics Data System (ADS)

    Hardro, Peter Jason

    The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process. These rapidly produced tools would be superior to those made by traditional tooling production methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling could be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced, and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected, and die casting dies were produced both from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. 
The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely reduced tooling cost, shortened tooling creation time, and reduced man-hours for tool creation. However, identifying the appropriate time to use RP tooling appears to be the most important factor in achieving successful implementation.

  19. FFI: A software tool for ecological monitoring

    Treesearch

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  20. Study of Gallium Arsenide Etching in a DC Discharge in Low-Pressure HCl-Containing Mixtures

    NASA Astrophysics Data System (ADS)

    Dunaev, A. V.; Murin, D. B.

    2018-04-01

    Halogen-containing plasmas are often used to form topological structures on semiconductor surfaces; therefore, spectral monitoring of the etching process is an important diagnostic tool in modern electronics. In this work, the emission spectra of gas discharges in mixtures of hydrogen chloride with argon, chlorine, and hydrogen in the presence of a semiconducting gallium arsenide plate were studied. Spectral lines and bands of the GaAs etching products appropriate for monitoring the etching rate were determined. It is shown that the emission intensity of the etching products is proportional to the GaAs etching rate in plasmas of HCl mixtures with Ar and Cl2, which makes it possible to monitor the etching process in real time by means of spectral methods.

  1. Etna_NETVIS: A dedicated tool for automatically pre-processing high frequency data useful to extract geometrical parameters and track the evolution of the lava field

    NASA Astrophysics Data System (ADS)

    Marsella, Maria; Junior Valentino D'Aranno, Peppe; De Bonis, Roberto; Nardinocchi, Carla; Scifoni, Silvia; Scutti, Marianna; Sonnessa, Alberico; Wahbeh, Wissam; Biale, Emilio; Coltelli, Mauro; Pecora, Emilio; Prestifilippo, Michele; Proietti, Cristina

    2016-04-01

    In volcanic areas, where access to the most critical zones for direct surveys can be difficult, digital photogrammetry techniques are rarely employed, although in many cases they have proved to have remarkable potential, such as the possibility of following the evolution of volcanic processes (fracturing, vent positions, lava fields, lava front positions) and deformation processes (inflation/deflation and instability phenomena induced by volcanic activity). These results can be obtained, in the framework of standard surveillance activities, by acquiring multi-temporal datasets including Digital Orthophotos (DO) and Digital Elevation Models (DEM) to be used for implementing a quantitative and comparative analysis. The frequency of the surveys can be intensified during emergency phases to implement quasi real-time monitoring for supporting civil protection actions. The high level of accuracy and the short time required for image processing make digital photogrammetry a suitable tool for controlling the evolution of volcanic processes, which are usually characterized by large and rapid mass displacements. In order to optimize and extend the existing permanent ground NEtwork of Thermal and VIsible Sensors located on Mt. Etna (Etna_NETVIS) and to improve the observation of the most active areas, an approach for monitoring syn-eruptive surface processes was implemented. A dedicated tool for automatically pre-processing high-frequency data, useful for extracting geometrical parameters as well as tracking the evolution of the lava field, was developed and tested in both simulated and real scenarios. The tool extracts a coherent multi-temporal dataset of orthophotos useful for evaluating the active flow area and estimating effusion rates. Furthermore, Etna_NETVIS data were used to downscale the information derived from satellite data and/or to integrate the satellite datasets in case of incomplete coverage or missing acquisitions. 
This work was developed in the framework of the EU-FP7 project "MED-SUV" (MEDiterranean SUpersite Volcanoes).

  2. Failure analysis in the identification of synergies between cleaning monitoring methods.

    PubMed

    Whiteley, Greg S; Derry, Chris; Glasbey, Trevor

    2015-02-01

    The four monitoring methods used to manage the quality assurance of cleaning outcomes within health care settings are visual inspection, microbial recovery, fluorescent marker assessment, and rapid ATP bioluminometry. These methods each generate different types of information, presenting a challenge to the successful integration of monitoring results. A systematic approach to safety and quality control can be used to interrogate the known qualities of cleaning monitoring methods and provide a prospective management tool for infection control professionals. We investigated the use of failure mode and effects analysis (FMEA) for measuring the failure risk arising from each cleaning monitoring method. FMEA uses existing data in a structured risk assessment tool that identifies weaknesses in products or processes. Our FMEA approach used the literature and a small experienced team to construct a series of analyses to investigate the cleaning monitoring methods in a way that minimized identified failure risks. FMEA applied to each of the cleaning monitoring methods revealed failure modes for each. The combined use of cleaning monitoring methods in sequence is preferable to their use in isolation. When these four cleaning monitoring methods are used in combination in a logical sequence, the failure modes noted for any one can be complemented by the strengths of the alternatives, thereby circumventing the risk of failure of any individual cleaning monitoring method. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
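    The core arithmetic of an FMEA ranking can be sketched in a few lines. The severity, occurrence, and detection ratings below are hypothetical placeholders invented for the example, not values from the study; only the Risk Priority Number formula (RPN = S × O × D) is standard FMEA practice.

    ```python
    # Illustrative FMEA scoring for the four cleaning-monitoring methods.
    # S = severity, O = occurrence, D = detection difficulty, each on a 1-10
    # scale; all ratings here are hypothetical placeholders.
    methods = {
        "visual inspection":  {"S": 7, "O": 8, "D": 9},
        "microbial recovery": {"S": 6, "O": 4, "D": 5},
        "fluorescent marker": {"S": 5, "O": 5, "D": 4},
        "ATP bioluminometry": {"S": 5, "O": 4, "D": 3},
    }

    # Risk Priority Number: RPN = S x O x D (standard FMEA formula).
    rpn = {name: r["S"] * r["O"] * r["D"] for name, r in methods.items()}

    # Rank failure modes from highest to lowest risk.
    for name, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
        print(f"{name}: RPN = {value}")
    ```

    Ranking methods by RPN is what lets the weaknesses of a high-risk method be paired, in sequence, with the strengths of the alternatives, as the abstract argues.
    
    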

  3. Process Diagnostics and Monitoring Using the Multipole Resonance Probe (MRP)

    NASA Astrophysics Data System (ADS)

    Harhausen, J.; Awakowicz, P.; Brinkmann, R. P.; Foest, R.; Lapke, M.; Musch, T.; Mussenbrock, T.; Oberrath, J.; Ohl, A.; Rolfes, I.; Schulz, Ch.; Storch, R.; Styrnoll, T.

    2011-10-01

    In this contribution we present the application of the MRP in an industrial plasma ion assisted deposition (PIAD) chamber (Leybold optics SYRUS-pro). The MRP is a novel plasma diagnostic which is suitable for an industrial environment - which means that the proposed method is robust, calibration free, and economical, and can be used for ideal and reactive plasmas alike. In order to employ the MRP as a process diagnostic we mounted the probe on a manipulator to obtain spatially resolved information on the electron density and temperature. As a monitoring tool the MRP is installed at a fixed position. Even during the deposition process it provides stable measurement results while other diagnostic methods, e.g. the Langmuir probe, may suffer from dielectric coatings. Funded by the German Ministry for Education and Research (BMBF, Fkz. 13N10462).

  4. In line NIR quantification of film thickness on pharmaceutical pellets during a fluid bed coating process.

    PubMed

    Lee, Min-Jeong; Seo, Da-Young; Lee, Hea-Eun; Wang, In-Chun; Kim, Woo-Sik; Jeong, Myung-Yung; Choi, Guang J

    2011-01-17

    Along with the risk-based approach, process analytical technology (PAT) has emerged as one of the key elements in fully implementing QbD (quality-by-design). Near-infrared (NIR) spectroscopy has been extensively applied as an in-line/on-line analytical tool in the biomedical and chemical industries. In this study, the film thickness on pharmaceutical pellets was quantified using in-line NIR spectroscopy during a fluid-bed coating process. Precise monitoring of coating thickness and its prediction with a suitable control strategy are crucial to the quality assurance of solid dosage forms, including their dissolution characteristics. Pellets of a test formulation were manufactured and coated in a fluid bed by spraying a hydroxypropyl methylcellulose (HPMC) coating solution. NIR spectra were acquired via a fiber-optic probe during the coating process, followed by multivariate analysis utilizing partial least squares (PLS) calibration models. The actual coating thickness of the pellets was measured by two separate methods, confocal laser scanning microscopy (CLSM) and laser diffraction particle size analysis (LD-PSA). Both characterization methods gave excellent correlation results, and all determination coefficient (R²) values exceeded 0.995. In addition, a 70 min prediction coating experiment demonstrated that the end-point can be accurately designated via NIR in-line monitoring with appropriate calibration models. In conclusion, our approach combining in-line NIR monitoring with CLSM and LD-PSA can be applied as an effective PAT tool for fluid-bed pellet coating processes. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Data-driven multi-scale multi-physics models to derive process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-05-01

    Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.

  6. Data-driven multi-scale multi-physics models to derive process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-01-01

    Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to experimentally measure, numerical modeling is a powerful tool to understand the underlying physical mechanisms. This paper presents our latest work in this regard based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods—used in the process-structure, structure-properties and the design phase that connects them—would allow for a design loop for AM processing and materials. We hope this article will provide a road map to enable AM fundamental understanding for the monitoring and advanced diagnostics of AM processing.

  7. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission (AE) signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves identifying the instrumentation transfer functions and multiplying the AE signal spectrum by the inverse of these system functions, has been applied. In this way, changes in AE signal characteristics can be better interpreted as resulting solely from changes in the state of the process. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of AE signals generated during stretching became more distinctive and can be used more effectively as tools for process monitoring.
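    The deconvolution idea described in this record can be sketched with a synthetic example: a known instrument impulse response colors a "true" signal, and dividing the recorded spectrum by the instrument transfer function recovers it. The signal and filter below are invented for the sketch; a real application would estimate the transfer function by system identification and regularize the division where the response is near zero.

    ```python
    import numpy as np

    # Synthetic AE burst (the "true" process signal) and an assumed
    # instrumentation impulse response h; both are illustrative.
    n_samp = 200
    x = np.exp(-0.05 * np.arange(n_samp)) * np.sin(0.3 * np.arange(n_samp))
    h = np.array([1.0, 0.5, 0.25])   # coloring filter of the measurement chain

    # Recorded signal = process signal convolved with the instrument response.
    y = np.convolve(x, h)

    # Frequency-domain deconvolution: multiply the recorded spectrum by the
    # inverse of the (known) instrument transfer function, then invert the FFT.
    # This h has no spectral zeros, so no regularization is needed here.
    n = len(y)
    H = np.fft.fft(h, n)
    X_hat = np.fft.fft(y, n) / H
    x_hat = np.real(np.fft.ifft(X_hat))[:n_samp]

    print(bool(np.allclose(x_hat, x, atol=1e-9)))
    ```

    Zero-padding both FFTs to the full convolution length makes the circular convolution implied by the FFT match the linear convolution of the recording, so the recovery is exact up to floating-point error.
    
    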

  8. MATLAB tools for improved characterization and quantification of volcanic incandescence in Webcam imagery; applications at Kilauea Volcano, Hawai'i

    USGS Publications Warehouse

    Patrick, Matthew R.; Kauahikaua, James P.; Antolik, Loren

    2010-01-01

    Webcams are now standard tools for volcano monitoring and are used at observatories in Alaska, the Cascades, Kamchatka, Hawai'i, Italy, and Japan, among other locations. Webcam images allow invaluable documentation of activity and provide a powerful comparative tool for interpreting other monitoring data streams, such as seismicity and deformation. Automated image processing can improve the time efficiency and rigor of webcam image interpretation and potentially extract more information on eruptive activity. For instance, Lovick and others (2008) provided a suite of processing tools that performed such tasks as noise reduction, eliminating uninteresting images from an image collection, and detecting incandescence, with an application to dome activity at Mount St. Helens during 2007. In this paper, we present two very simple automated approaches for improved characterization and quantification of volcanic incandescence in webcam images at Kilauea Volcano, Hawai`i. The techniques are implemented in MATLAB (version 2009b, Copyright: The MathWorks, Inc.) to take advantage of the ease of matrix operations. Incandescence is a useful indicator of the location and extent of active lava flows and also a potentially powerful proxy for activity levels at open vents. We apply our techniques to a period covering both summit and east rift zone activity at Kilauea during 2008-2009 and compare the results to complementary datasets (seismicity, tilt) to demonstrate their integrative potential. A great strength of this study is the demonstrated success of these tools in an operational setting at the Hawaiian Volcano Observatory (HVO) over the course of more than a year. Although applied only to webcam images here, the techniques could be applied to any type of sequential images, such as time-lapse photography. 
We expect that these tools are applicable to many other volcano monitoring scenarios, and the two MATLAB scripts, as they are implemented at HVO, are included in the appendixes. These scripts would require minor to moderate modifications for use elsewhere, primarily to customize directory navigation. If the user has some familiarity with MATLAB, or programming in general, these modifications should be easy. Although we originally anticipated needing the Image Processing Toolbox, the scripts in the appendixes do not require it. Thus, only the base installation of MATLAB is needed. Because fairly basic MATLAB functions are used, we expect that the script can be run successfully by versions earlier than 2009b.
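    The incandescence-detection step this record describes amounts to thresholding pixel intensities and summarizing the bright region per frame. The sketch below uses Python with a synthetic frame rather than the MATLAB scripts in the appendixes; the frame data, region, and threshold value are all invented for the example and would be tuned per camera and scene.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-in for a grayscale night-time webcam frame (0-255):
    # dark background with one small bright (incandescent) region.
    frame = rng.integers(0, 40, size=(240, 320)).astype(np.uint8)
    frame[100:120, 150:180] = rng.integers(220, 256, size=(20, 30))

    # Fixed-threshold detection of incandescent pixels (threshold illustrative).
    THRESHOLD = 200
    mask = frame >= THRESHOLD

    # Two simple per-frame metrics: incandescent pixel count and its fraction
    # of the frame, a crude proxy for activity level over an image sequence.
    pixel_count = int(mask.sum())
    fraction = pixel_count / mask.size
    print(pixel_count, round(fraction, 5))
    ```

    Running this over a time-ordered image collection yields an incandescence time series that can be compared against seismicity or tilt, in the spirit of the integrative analysis the paper describes.
    
    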

  9. Towards a better control of the wastewater treatment process: excitation-emission matrix fluorescence spectroscopy of dissolved organic matter as a predictive tool of soluble BOD5 in influents of six Parisian wastewater treatment plants.

    PubMed

    Goffin, Angélique; Guérin, Sabrina; Rocher, Vincent; Varrault, Gilles

    2018-03-01

    The online monitoring of dissolved organic matter (DOM) in raw sewage water is expected to improve control of wastewater treatment processes. Fluorescence spectroscopy offers one possibility for both online and real-time monitoring of DOM, especially as regards assessing DOM biodegradability. In this study, three-dimensional fluorescence spectroscopy combined with parallel factor analysis (PARAFAC) has been investigated as a predictive tool for the soluble 5-day biological oxygen demand (BOD5) of raw sewage water. Six PARAFAC components were highlighted in 69 raw sewage water samples: C2, C5, and C6 related to humic-like compounds, along with C1, C3, and C4 related to protein-like compounds. Since the PARAFAC methodology is not available for online monitoring, a peak-picking approach based on the maximum excitation-emission (Ex-Em) localization of the PARAFAC components identified in this study has been used. A good predictive model of soluble BOD5 using fluorescence spectroscopy parameters was obtained (r² = 0.846, adjusted r² = 0.839, p < 0.0001). This model is quite straightforward, easy to automate, and applicable to the operational field of wastewater treatment for online monitoring purposes.
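    The peak-picking regression step can be sketched as an ordinary least-squares calibration from peak intensities to BOD5. Everything below is synthetic: the two "peaks" stand in for intensities at the Ex-Em maxima of protein-like PARAFAC components, and the coefficients and noise level are invented for the example, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical peak-picked fluorescence intensities at the Ex-Em maxima
    # of two protein-like components (arbitrary units), 69 samples as in the study.
    n = 69
    peaks = rng.uniform(10, 100, size=(n, 2))

    # Simulated soluble BOD5 with a linear dependence on the two peaks
    # (coefficients invented for the sketch).
    bod5 = 1.8 * peaks[:, 0] + 0.6 * peaks[:, 1] + 5.0 + rng.normal(0, 2.0, n)

    # Least-squares calibration: BOD5 ~ b0 + b1*peak1 + b2*peak2.
    A = np.column_stack([np.ones(n), peaks])
    coef, *_ = np.linalg.lstsq(A, bod5, rcond=None)

    # Coefficient of determination r^2 of the fitted model.
    pred = A @ coef
    r2 = 1 - np.sum((bod5 - pred) ** 2) / np.sum((bod5 - bod5.mean()) ** 2)
    print(round(float(r2), 3))
    ```

    Such a model is what makes the approach easy to automate: once calibrated, predicting BOD5 online requires only reading a few fixed Ex-Em intensities per sample.
    
    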

  10. THE ROLE OF RAMAN SPECTROSCOPY IN THE ANALYTICAL CHEMISTRY OF POTABLE WATER

    EPA Science Inventory

    Advances in instrumentation are making Raman spectroscopy the tool of choice for an increasing number of chemical applications. For example, many recalcitrant industrial-process monitoring problems have been solved in recent years with in-line Raman spectrometers. Raman is attr...

  11. Preliminary Evaluation of a Diagnostic Tool for Prosthetics

    DTIC Science & Technology

    2017-10-01

    volume change. Processing algorithms for data from the activity monitors were modified to run more efficiently so that large datasets could be...

  12. Hydrological modeling in forested systems

    Treesearch

    H.E. Golden; G.R. Evenson; S. Tian; Devendra Amatya; Ge Sun

    2015-01-01

    Characterizing and quantifying interactions among components of the forest hydrological cycle is complex and usually requires a combination of field monitoring and modelling approaches (Weiler and McDonnell, 2004; National Research Council, 2008). Models are important tools for testing hypotheses, understanding hydrological processes and synthesizing experimental data...

  13. THE ROLE OF RAMAN SPECTROSCOPY IN THE ANALYTICAL CHEMISTRY OF POTABLE WATER

    EPA Science Inventory

    Advances in instrumentation are making Raman spectroscopy the tool of choice for an increasing number of chemical applications. For example, many recalcitrant industrial process monitoring problems have been solved in recent years with in-line Raman spectrometers. Raman is attr...

  14. On-line coupling of a miniaturized bioreactor with capillary electrophoresis, via a membrane interface, for monitoring the production of organic acids by microorganisms.

    PubMed

    Ehala, S; Vassiljeva, I; Kuldvee, R; Vilu, R; Kaljurand, M

    2001-09-01

    Capillary electrophoresis (CE) can be a valuable tool for on-line monitoring of bioprocesses. Production of organic acids by phosphorus-solubilizing bacteria and fermentation of UHT milk were monitored and controlled by use of a membrane-interfaced dialysis device and a home-made microsampler for a capillary electrophoresis unit. Use of this specially designed sampling device enabled rapid consecutive injections without interruption of the high voltage. No additional sample preparation was required. The time resolution of monitoring in this particular work was approximately 2 h, but could be reduced to 2 min. Analytes were detected at low μg mL⁻¹ levels with a reproducibility of approximately 10%. To demonstrate the potential of CE in processes of biotechnological interest, results from monitoring phosphate solubilization by bacteria were submitted to qualitative and quantitative analysis. Fermentation experiments on UHT milk showed that monitoring of the processes by CE can provide good resolution of complex mixtures, although for more specific, detailed characterization the identification of individual substances is needed.

  15. From scientific understanding to operational utility: New concepts and tools for monitoring space weather effects on satellites

    NASA Astrophysics Data System (ADS)

    Green, J. C.; Rodriguez, J. V.; Denig, W. F.; Redmon, R. J.; Blake, J. B.; Mazur, J. E.; Fennell, J. F.; O'Brien, T. P.; Guild, T. B.; Claudepierre, S. G.; Singer, H. J.; Onsager, T. G.; Wilkinson, D. C.

    2013-12-01

    NOAA space weather sensors have monitored the near Earth space radiation environment for more than three decades providing one of the only long-term records of these energetic particles that can disable satellites and pose a threat to astronauts. These data have demonstrated their value for operations for decades, but they are also invaluable for scientific discovery. Here we describe the development of new NOAA tools for assessing radiation impacts to satellites and astronauts working in space. In particular, we discuss the new system implemented for processing and delivering near real time particle radiation data from the POES/MetOp satellites. We also describe the development of new radiation belt indices from the POES/MetOp data that capture significant global changes in the environment needed for operational decision making. Lastly, we investigate the physical processes responsible for dramatic changes of the inner proton belt region and the potential consequences these new belts may have for satellite operations.

  16. Software for Remote Monitoring of Space-Station Payloads

    NASA Technical Reports Server (NTRS)

    Schneider, Michelle; Lippincott, Jeff; Chubb, Steve; Whitaker, Jimmy; Gillis, Robert; Sellers, Donna; Sims, Chris; Rice, James

    2003-01-01

    Telescience Resource Kit (TReK) is a suite of application programs that enable geographically dispersed users to monitor scientific payloads aboard the International Space Station (ISS). TReK provides local ground support services that can simultaneously receive, process, record, playback, and display data from multiple sources. TReK also provides interfaces to use the remote services provided by the Payload Operations Integration Center which manages all ISS payloads. An application programming interface (API) allows for payload users to gain access to all data processed by TReK and allows payload-specific tools and programs to be built or integrated with TReK. Used in conjunction with other ISS-provided tools, TReK provides the ability to integrate payloads with the operational ground system early in the lifecycle. This reduces the potential for operational problems and provides "cradle-to-grave" end-to-end operations. TReK contains user guides and self-paced tutorials along with training applications to allow the user to become familiar with the system.

  17. A Tool for Automatic Verification of Real-Time Expert Systems

    NASA Technical Reports Server (NTRS)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  18. Using robotics construction kits as metacognitive tools: a research in an Italian primary school.

    PubMed

    La Paglia, Filippo; Caci, Barbara; La Barbera, Daniele; Cardaci, Maurizio

    2010-01-01

    The present paper is aimed at analyzing the process of building and programming robots as a metacognitive tool. Quantitative data and qualitative observations from a research performed in a sample of children attending an Italian primary school are described in this work. Results showed that robotics activities may be intended as a new metacognitive environment that allows children to monitor themselves and control their learning actions in an autonomous and self-centered way.

  19. The technique of entropy optimization in motor current signature analysis and its application in the fault diagnosis of gear transmission

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoguang; Liang, Lin; Liu, Fei; Xu, Guanghua; Luo, Ailing; Zhang, Sicong

    2012-05-01

    Nowadays, Motor Current Signature Analysis (MCSA) is widely used in the fault diagnosis and condition monitoring of machine tools. However, because the current signal has a low SNR (signal-to-noise ratio) and its feature frequencies are often dense and overlapping, it is difficult to identify the feature frequencies of machine tools in the complex current spectrum using traditional signal processing methods such as the FFT. In studying MCSA, it was found that entropy, which is associated with the probability distribution of any random variable, is important for frequency identification and therefore plays an important role in signal processing. To solve the problem of feature frequencies being difficult to identify, an entropy optimization technique based on the motor current signal is presented in this paper for extracting the typical feature frequencies of machine tools; it can effectively suppress disturbances. Simulated current signals were generated in MATLAB, and a real current signal was obtained from a complex gearbox at an iron works in Luxembourg. For diagnosis, MCSA is combined with entropy optimization. Both simulated and experimental results show that this technique is efficient, accurate, and reliable enough to extract the feature frequencies of the current signal, providing a new strategy for the fault diagnosis and condition monitoring of machine tools.
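    The quantity at the heart of such techniques can be illustrated with a spectral-entropy computation: treating the normalized power spectrum as a probability distribution, a signal whose energy is concentrated at a few feature frequencies has low entropy, while broadband noise has high entropy. This sketch shows only that computation on synthetic signals, not the full optimization procedure of the paper; the signals and function name are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def spectral_entropy(signal):
        """Shannon entropy (in nats) of the normalized power spectrum."""
        psd = np.abs(np.fft.rfft(signal)) ** 2
        p = psd / psd.sum()              # treat the spectrum as a distribution
        p = p[p > 0]                     # avoid log(0)
        return float(-np.sum(p * np.log(p)))

    t = np.arange(2048)
    # A clean "feature frequency" concentrates spectral energy -> low entropy;
    # broadband noise spreads it across all bins -> high entropy.
    tone = np.sin(2 * np.pi * 0.1 * t)
    noise = rng.normal(size=t.size)

    print(spectral_entropy(tone) < spectral_entropy(noise))
    ```

    An entropy-based optimization would then tune its processing parameters to minimize a criterion of this kind, sharpening the dense, overlapping peaks of the current spectrum until the feature frequencies stand out.
    
    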

  20. Bringing the CMS distributed computing system into scalable operations

    NASA Astrophysics Data System (ADS)

    Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.

    2010-04-01

    Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning, and scale testing of the data and workload management tools, the various computing workflows, and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years, with increasing scale and complexity, have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis (including the sites and the workload and data management tools), validating the distributed production system through functionality, reliability, and scale tests, helping sites commission, configure, and optimize their networking and storage through scale-testing of data transfers and data processing, and improving the efficiency of data access across the CMS computing system, from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing, as well as the improvements accomplished towards efficient, reliable, and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers, with the aim of stressing the experiment and Grid data management and workload management systems; site commissioning procedures and tools to monitor and improve site availability and reliability; and activities targeted at the commissioning of the distributed production, user analysis, and monitoring systems.

  1. Bioaccumulation Using Surrogate Samplers (Bass): Evaluation Of A Passive Sampler As An Alternative Monitoring Tool For Environmental Contaminants At The Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paller, M.; Knox, A.; Kuhne, W.

    2015-10-15

    DOE sites conduct traditional environmental monitoring programs that require collecting, processing, and analyzing water, sediment, and fish samples. However, recently developed passive sampling technologies, such as Diffusive Gradients in Thin films (DGT), may measure the chemical phases that are available and toxic to organisms (the bioavailable fraction), thereby producing more accurate and economical results than traditional methods. Our laboratory study showed that dissolved copper concentrations measured by DGT probes were strongly correlated with the uptake of copper by Lumbriculus variegatus, an aquatic worm, and with concentrations of copper measured by conventional methods. Dissolved copper concentrations in DGT probes increased with time of exposure, paralleling the increase in copper with time that occurred in Lumbriculus. Additional studies with a combination of seven dissolved metals showed similar results. These findings support the use of DGT as a biomimetic monitoring tool and provide a basis for refinement of these methods for cost-effective environmental monitoring at DOE sites.

  2. Serial Interface through Stream Protocol on EPICS Platform for Distributed Control and Monitoring

    NASA Astrophysics Data System (ADS)

    Das Gupta, Arnab; Srivastava, Amit K.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    Remote operation of equipment and devices is implemented in distributed systems for the control and proper monitoring of process values. For such remote operations, the Experimental Physics and Industrial Control System (EPICS) is one of the important software tools for the control and monitoring of a wide range of scientific parameters. A hardware interface was developed for the EPICS software so that different equipment, such as data converters, power supplies, and pump controllers, could be operated remotely through a stream protocol. EPICS base was set up on Windows as well as Linux operating systems for control and monitoring, while EPICS modules such as asyn and StreamDevice were used to interface the equipment over standard RS-232/RS-485 protocols. StreamDevice communicates with the serial line through an interface to asyn drivers. The graphical user interface and alarm handling were implemented with the Motif Editor and Display Manager (MEDM) and the Alarm Handler (ALH) command-line channel access utility tools. This paper describes the developed application, which was tested with different equipment and devices serially interfaced to PCs on a distributed network.
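
    As a rough illustration of the StreamDevice approach described above, a protocol file pairs output command strings with input format converters, and an EPICS record binds a protocol entry to a serial port. The file name, device name, and command strings below are hypothetical, not taken from the paper.

```
# power_supply.proto -- hypothetical StreamDevice protocol file;
# the SCPI-style command strings are illustrative only.
Terminator = CR LF;

get_voltage {
    out "MEAS:VOLT?";
    in "%f";
}

set_voltage {
    out "VOLT %f";
}

# Corresponding EPICS record (in a .db file), polling the device
# once per second through the asyn port PS1_PORT:
record(ai, "PS1:Voltage") {
    field(DTYP, "stream")
    field(INP,  "@power_supply.proto get_voltage PS1_PORT")
    field(SCAN, "1 second")
}
```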

  3. A knowledge authoring tool for clinical decision support.

    PubMed

    Dunsmuir, Dustin; Daniels, Jeremy; Brouse, Christopher; Ford, Simon; Ansermino, J Mark

    2008-06-01

    Anesthesiologists in the operating room are unable to constantly monitor all data generated by physiological monitors, and they are further distracted by clinical and educational tasks. An expert system would ideally provide assistance to the anesthesiologist in this data-rich environment. Clinical monitoring expert systems have not been widely adopted, as traditional methods of knowledge encoding require both expert medical and programming skills, making knowledge acquisition difficult. A software application was developed for use as a knowledge authoring tool for physiological monitoring. This application enables clinicians to create knowledge rules without the need for a knowledge engineer or programmer. These rules are designed to provide the clinician with clinical diagnoses, explanations, and treatment advice for optimal patient care in real time. By intelligently combining data from physiological monitors and demographic data sources, the expert system can use these rules to assist in monitoring the patient. The knowledge authoring process is simplified by limiting connective relationships between rules. The application is designed to allow open collaboration between communities of clinicians to build a library of rules for clinical use. This design provides clinicians with a system for parameter surveillance and expert advice with a transparent pathway of reasoning. A usability evaluation demonstrated that anesthesiologists can rapidly develop useful rules for use in a predefined clinical scenario.
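
    A minimal sketch of the kind of clinician-authored rule such a tool supports, assuming a hypothetical rule format: the parameter names, thresholds, and advice strings are illustrative, not the paper's actual knowledge representation.

```python
# Hypothetical rule format: each rule pairs a condition over combined
# monitor/demographic data with advice text shown to the clinician.

def make_rule(name, condition, advice):
    """Build a rule from a name, a predicate over the data, and advice."""
    return {"name": name, "condition": condition, "advice": advice}

def evaluate(rules, data):
    """Return the advice of every rule whose condition fires on the data."""
    return [r["advice"] for r in rules if r["condition"](data)]

rules = [
    make_rule("hypotension",
              lambda d: d["sys_bp"] < 90 and d["age"] >= 18,
              "Possible hypotension: check fluid status."),
    make_rule("tachycardia",
              lambda d: d["heart_rate"] > 100,
              "Tachycardia: review depth of anaesthesia."),
]

print(evaluate(rules, {"sys_bp": 85, "age": 40, "heart_rate": 110}))
```

    Limiting rules to such flat condition-advice pairs (rather than arbitrary rule chaining) is one way to keep authoring accessible to non-programmers, in the spirit of the paper's restricted connective relationships.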

  4. Freeze-drying process monitoring using a cold plasma ionization device.

    PubMed

    Mayeresse, Y; Veillon, R; Sibille, P H; Nomine, C

    2007-01-01

    A cold plasma ionization device has been designed to monitor freeze-drying processes in situ by monitoring lyophilization chamber moisture content. This plasma device, which consists of a probe that can be mounted directly on the lyophilization chamber, depends upon the ionization of nitrogen and water molecules using a radiofrequency generator and spectrometric signal collection. The study performed on this probe shows that it is steam sterilizable, simple to integrate, reproducible, and sensitive. The limitations include suitable positioning in the lyophilization chamber, calibration, and signal integration. Sensitivity was evaluated in relation to the quantity of vials and the probe positioning, and correlation with existing methods, such as microbalance, was established. These tests verified signal reproducibility through three freeze-drying cycles. Scaling-up studies demonstrated a similar product signature for the same product using pilot-scale and larger-scale equipment. On an industrial scale, the method efficiently monitored the freeze-drying cycle, but in a larger industrial freeze-dryer the signal was slightly modified. This was mainly due to the positioning of the plasma device, in relation to the vapor flow pathway, which is not necessarily homogeneous within the freeze-drying chamber. The plasma tool is a relevant method for monitoring freeze-drying processes and may in the future allow the verification of current thermodynamic freeze-drying models. This plasma technique may ultimately represent a process analytical technology (PAT) approach for the freeze-drying process.

  5. Using the scanning electron microscope on the production line to assure quality semiconductors

    NASA Technical Reports Server (NTRS)

    Adolphsen, J. W.; Anstead, R. J.

    1972-01-01

    The use of the scanning electron microscope to detect metallization defects introduced during batch processing of semiconductor devices is discussed. A method of determining metallization integrity was developed which culminates in a procurement specification using the scanning microscope on the production line as a quality control tool. Batch process control of the metallization operation is monitored early in the manufacturing cycle.

  6. Management of the water balance and quality in mining areas

    NASA Astrophysics Data System (ADS)

    Pasanen, Antti; Krogerus, Kirsti; Mroueh, Ulla-Maija; Turunen, Kaisa; Backnäs, Soile; Vento, Tiia; Veijalainen, Noora; Hentinen, Kimmo; Korkealaakso, Juhani

    2015-04-01

    Although mining companies have long been conscious of water-related risks, they still face environmental management problems. These problems mainly emerge because mine sites' water balances were not adequately assessed at the mine planning stage. A more consistent approach is required to help mining companies identify risks and opportunities related to the management of water resources at all stages of mining. This approach requires that the water cycle of a mine site be interconnected with the general hydrological water cycle. In addition to knowledge of hydrological conditions, control of the water balance in the mining processes requires knowledge of those processes, the ability to adjust process parameters to variable hydrological conditions, adoption of suitable water management tools and systems, systematic monitoring of water quantities and quality, adequate capacity in the water management infrastructure to handle variable water flows, best practices for assessing the dispersion, mixing and dilution of mine water and pollutant loading in receiving water bodies, and dewatering and separation of water from tailings and precipitates. The WaterSmart project aims to improve awareness of actual water quantities and balances in mine areas, in order to improve the forecasting and management of water volumes. The study is executed through hydrogeological and hydrological surveys and on-line monitoring procedures. One aim is to exploit on-line water quantity and quality monitoring for better management of water balances, with the target of developing practical, end-user-specific on-line input and output procedures. The second objective is to develop mathematical models to calculate combined water balances, including surface, ground and process waters. WSFS, the Hydrological Modelling and Forecasting System of SYKE, is being modified for mining areas.
    New modelling tools are being developed on spreadsheet and system dynamics platforms to systematically integrate all water balance components (groundwater, surface water, infiltration, precipitation, mine water facilities and operations, etc.) into overall dynamic mine-site considerations. After the surface and groundwater models (e.g. FEFLOW and WSFS) are coupled with each other, they are compared with GoldSim. The third objective is to integrate the monitoring and modelling tools into the mine management system and process control. Modelling and predictive process control can prevent flood situations, ensure water adequacy, and enable controlled mine water treatment. The project will develop a constantly updated water balance management system covering both natural and process waters.
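
    At its core, a combined water balance of the kind described above reduces to a storage update per time step: storage plus inflows minus outflows. The sketch below is a deliberately lumped, single-storage toy model; the flow terms and their values are illustrative assumptions, not WaterSmart components.

```python
# Minimal lumped water balance for one storage (e.g. a pit lake),
# with all flows in m^3/day. Values are synthetic.

def step(storage, precip, runoff_in, process_in, evap, discharge):
    """One daily update: storage plus inflows minus outflows, floored at 0."""
    return max(0.0, storage + precip + runoff_in + process_in - evap - discharge)

def simulate(storage, days):
    """Apply the daily balance over a list of per-day flow dictionaries."""
    history = [storage]
    for d in days:
        storage = step(storage, **d)
        history.append(storage)
    return history

days = [
    {"precip": 500, "runoff_in": 200, "process_in": 300, "evap": 150, "discharge": 400},
    {"precip": 0,   "runoff_in": 50,  "process_in": 300, "evap": 200, "discharge": 400},
]
print(simulate(10000.0, days))  # → [10000.0, 10450.0, 10200.0]
```

    A real mine-site model distributes this balance over many coupled storages (tailings ponds, clarification basins, groundwater cells), which is where system dynamics platforms and coupled surface/groundwater models come in.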

  7. Angular approach combined to mechanical model for tool breakage detection by eddy current sensors

    NASA Astrophysics Data System (ADS)

    Ritou, M.; Garnier, S.; Furet, B.; Hascoet, J. Y.

    2014-02-01

    The paper presents a new complete approach to Tool Condition Monitoring (TCM) in milling. The aim is the early detection of small damage so that catastrophic tool failures are prevented. A versatile in-process monitoring system is introduced for reliability reasons. The tool condition is determined from estimates of the radial eccentricity of the teeth, and an adequate criterion is proposed combining a mechanical model of milling with an angular approach. Then, a new solution is proposed for estimating cutting forces using eddy current sensors implemented close to the spindle nose. Signals are analysed in the angular domain, notably by a synchronous averaging technique, and phase shifts induced by changes of machining direction are compensated. Results are compared with cutting forces measured with a dynamometer table. The proposed method is implemented in an industrial case of a pocket machining operation, in which one of the cutting edges was slightly damaged during machining, as shown by a direct measurement of the tool. A control chart is established with the estimates of cutter eccentricity obtained during machining from the eddy current sensor signals. The efficiency and reliability of the method are demonstrated by successful detection of the damage.
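
    The synchronous averaging mentioned above can be sketched simply: once the signal is resampled to a fixed number of samples per spindle revolution, averaging across revolutions reinforces rotation-synchronous content (such as per-tooth effects) and attenuates asynchronous noise. The signal values below are synthetic, not the paper's sensor data.

```python
# Angular-domain synchronous averaging: average the signal over complete
# revolutions, assuming a fixed number of samples per revolution.

def synchronous_average(signal, samples_per_rev):
    """Mean waveform over all complete revolutions of the signal."""
    n_revs = len(signal) // samples_per_rev
    avg = [0.0] * samples_per_rev
    for r in range(n_revs):
        for i in range(samples_per_rev):
            avg[i] += signal[r * samples_per_rev + i] / n_revs
    return avg

# Two revolutions of a 4-sample tooth pattern plus small perturbations
# that partially cancel across revolutions.
sig = [1.0, 2.0, 3.0, 2.0,  1.2, 1.8, 3.2, 1.8]
print(synchronous_average(sig, 4))
```

    In the paper's setting the averaged waveform would then feed the eccentricity estimate; compensating the phase shift from machining-direction changes amounts to rotating the angular origin before averaging.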

  8. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  9. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  10. Machine Learning: A Crucial Tool for Sensor Design

    PubMed Central

    Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.

    2009-01-01

    Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps: data pre-treatment, feature extraction and dimension reduction, and system modeling, this paper provides a review of the methods that are widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
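
    The three steps the review describes can be illustrated end to end with deliberately simple stand-ins: mean-centring for pre-treatment, keeping the highest-variance feature for dimension reduction, and a nearest-centroid classifier for system modelling. The sensor readings and class labels below are synthetic assumptions.

```python
# Toy three-step pipeline: (1) pre-treatment, (2) dimension reduction,
# (3) modelling. Real sensor systems use far richer methods at each step.

def mean_centre(rows):
    """Step 1: pre-treatment -- subtract each feature's mean."""
    n = len(rows[0])
    means = [sum(r[j] for r in rows) / len(rows) for j in range(n)]
    return [[r[j] - means[j] for j in range(n)] for r in rows]

def best_feature(rows):
    """Step 2: dimension reduction -- keep the highest-variance feature
    (rows are already centred, so variance ~ mean of squares)."""
    n = len(rows[0])
    return max(range(n), key=lambda j: sum(r[j] ** 2 for r in rows))

def fit_centroids(xs, labels):
    """Step 3: modelling -- per-class centroids of the chosen feature."""
    groups = {}
    for x, y in zip(xs, labels):
        groups.setdefault(y, []).append(x)
    return {y: sum(v) / len(v) for y, v in groups.items()}

def classify(centroids, x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda y: abs(centroids[y] - x))

rows = [[0.1, 1.0], [0.2, 1.2], [0.1, 5.0], [0.2, 5.2]]   # synthetic readings
labels = ["clean", "clean", "polluted", "polluted"]
centred = mean_centre(rows)
j = best_feature(centred)              # index of the retained feature
model = fit_centroids([r[j] for r in centred], labels)
print(j, classify(model, 2.0))
```

    Each stand-in can be swapped for the methods the review surveys (e.g. PCA for step 2, neural networks or SVMs for step 3) without changing the overall pipeline shape.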

  11. Tool Wear Feature Extraction Based on Hilbert Marginal Spectrum

    NASA Astrophysics Data System (ADS)

    Guan, Shan; Song, Weijie; Pang, Hongyang

    2017-09-01

    In the metal cutting process, the signal contains a wealth of tool-wear state information. A tool-wear signal analysis and feature extraction method based on the Hilbert marginal spectrum is proposed. First, the tool-wear signal was decomposed by the empirical mode decomposition algorithm, and the intrinsic mode functions containing the main information were screened out by the correlation coefficient and the variance contribution rate. Second, the Hilbert transform was applied to the main intrinsic mode functions, yielding the Hilbert time-frequency spectrum and the Hilbert marginal spectrum. Finally, amplitude-domain indexes were extracted from the Hilbert marginal spectrum and used to construct the recognition feature vector of the tool-wear state. The results show that the extracted features can effectively characterize different wear states of the tool, which provides a basis for monitoring tool-wear condition.
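
    The screening step can be sketched on its own: given a signal and its decomposed components, keep only components whose correlation coefficient with the signal and whose variance contribution rate both pass thresholds. The EMD itself is omitted here, and the components and thresholds are synthetic assumptions.

```python
# Component screening by correlation coefficient and variance
# contribution rate, applied to synthetic decomposition components.

def corr(a, b):
    """Pearson correlation coefficient between two sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def variance_contribution(comps):
    """Each component's share of the summed component variances."""
    def var(c):
        m = sum(c) / len(c)
        return sum((x - m) ** 2 for x in c) / len(c)
    total = sum(var(c) for c in comps)
    return [var(c) / total for c in comps]

def screen(signal, comps, c_min=0.25, v_min=0.05):
    """Indices of components passing both screening criteria."""
    vc = variance_contribution(comps)
    return [i for i, c in enumerate(comps)
            if abs(corr(signal, c)) >= c_min and vc[i] >= v_min]

imf1 = [1.0, -1.0] * 4                  # high-frequency component
imf2 = [3.0, 3.0, -3.0, -3.0] * 2       # dominant low-frequency component
res = [0.01, -0.01] * 4                 # negligible residue
signal = [a + b + c for a, b, c in zip(imf1, imf2, res)]
print(screen(signal, [imf1, imf2, res]))  # residue fails the variance test
```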

  12. In-line and Real-time Monitoring of Resonant Acoustic Mixing by Near-infrared Spectroscopy Combined with Chemometric Technology for Process Analytical Technology Applications in Pharmaceutical Powder Blending Systems.

    PubMed

    Tanaka, Ryoma; Takahashi, Naoyuki; Nakamura, Yasuaki; Hattori, Yusuke; Ashizawa, Kazuhide; Otsuka, Makoto

    2017-01-01

    Resonant Acoustic® Mixing (RAM) technology performs high-speed mixing by vibration through the control of acceleration and frequency. In recent years, real-time process monitoring and prediction have become of increasing interest, and process analytical technology (PAT) systems are increasingly being introduced into actual manufacturing processes. This study examined the application of PAT combining RAM, near-infrared spectroscopy, and chemometric technology as a set of PAT tools for introduction into actual pharmaceutical powder blending processes. Content uniformity was assessed with a robust partial least squares regression (PLSR) model constructed to manage the RAM configuration parameters and the changing concentrations of the components. As a result, in-line real-time prediction of active pharmaceutical ingredients and other additives was successfully demonstrated using chemometric technology. This system is expected to be applicable to the RAM method for quality risk management.

  13. Veno-occlusive disease nurse management: development of a dynamic monitoring tool by the GITMO nursing group.

    PubMed

    Botti, Stefano; Orlando, Laura; Gargiulo, Gianpaolo; Cecco, Valentina De; Banfi, Marina; Duranti, Lorenzo; Samarani, Emanuela; Netti, Maria Giovanna; Deiana, Marco; Galuppini, Vera; Pignatelli, Adriana Concetta; Ceresoli, Rosanna; Vedovetto, Alessio; Rostagno, Elena; Bambaci, Marilena; Dellaversana, Cristina; Luminari, Stefano; Bonifazi, Francesca

    2016-01-01

    Veno-occlusive disease (VOD) is a complication arising from the toxicity of conditioning regimens that has a significant impact on the survival of patients who undergo stem cell transplantation. There are several known risk factors for developing VOD, and their assessment before the start of conditioning regimens could improve the quality of care. Equally important are early identification of signs and symptoms ascribable to VOD, rapid diagnosis, and timely adjustment of support therapy and treatment. Nurses have a fundamental role at the stages of assessment and monitoring for signs and symptoms; therefore, they should have documented skills and training. The literature defines nurses' areas of competence in managing VOD, but in actual clinical practice this is less clear. Moreover, there is an intrinsic difficulty in managing VOD due to its rapid and often dramatic evolution, together with a lack of care tools to guide nurses. Through a complex evidence-based process, the Gruppo Italiano per il Trapianto di Midollo Osseo (GITMO), cellule staminali emopoietiche e terapia cellulare nursing board has developed an operational flowchart and a dynamic monitoring tool applicable to haematopoietic stem cell transplantation patients, whether they develop this complication or not.

  15. Non-Destructive Monitoring of Charge-Discharge Cycles on Lithium Ion Batteries using 7Li Stray-Field Imaging

    PubMed Central

    Tang, Joel A.; Dugar, Sneha; Zhong, Guiming; Dalal, Naresh S.; Zheng, Jim P.; Yang, Yong; Fu, Riqiang

    2013-01-01

    Magnetic resonance imaging provides a noninvasive method for in situ monitoring of the electrochemical processes involved in charge/discharge cycling of batteries. Determining how the electrochemical processes become irreversible, ultimately resulting in degraded battery performance, will aid in developing new battery materials and designing better batteries. Here we introduce an alternative in situ diagnostic tool to monitor these electrochemical processes. By utilizing a very large field gradient in the fringe field of a magnet, the stray-field imaging (STRAFI) technique significantly improves image resolution. These STRAFI images enable real-time monitoring of the electrodes at the micron level. It is demonstrated with two prototype half-cells, graphite∥Li and LiFePO4∥Li, that high-resolution 7Li STRAFI profiles allow one to visualize in situ Li-ion transfer between the electrodes during charge/discharge cycling, as well as the formation and evolution of irreversible microstructures of the Li components, and in particular reveal a non-uniform Li-ion distribution in the graphite. PMID:24005580

  16. MatSeis and the GNEM R&E regional seismic analysis tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chael, Eric Paul; Hart, Darren M.; Young, Christopher John

    2003-08-01

    To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The Matlab-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of some Knowledge Base products for which existing applications were either not available or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year. MatSeis was begun in 1996 and is now a fairly mature product. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g. filtering, 3-component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of Matlab coding and the tremendous number of available functions make MatSeis/Matlab an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F statistic stream). 
    Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999, and the tools vary in their level of maturity. All rely on MatSeis to provide necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.

  17. [Assessment of a software application tool for managing nursing care processes in the period 2005-2010].

    PubMed

    Medina-Valverde, M José; Rodríguez-Borrego, M Aurora; Luque-Alcaraz, Olga; de la Torre-Barbero, M José; Parra-Perea, Julia; Moros-Molina, M del Pilar

    2012-01-01

    To identify problems and critical points in the software application, the implementation of the software tool "Azahar", used to manage nursing care processes, was assessed. The monitored population consisted of nurses who used the tool in the hospital and those who benefited from it in primary care. Each group was selected randomly, and group size was determined by data saturation. A qualitative approach was employed, using in-depth interviews and group discussion as data collection techniques. The nurses considered the most beneficial and useful applications of the tool to be the initial assessment and the continuity-of-care release forms, as well as the recording of all data on the nursing process to ensure quality. The disadvantages and weaknesses identified were associated with the continuous variability of their daily care. The nurses associated an increase in workload with the impossibility of entering records into the computer, forcing them to keep paper records and thus duplicating the recording process. Likewise, they consider that the operating system of the software should be improved in terms of simplicity and functionality. The simplicity of the tool and the adjustment of workloads would favour its use and, as a result, continuity of care. Copyright © 2010 Elsevier España, S.L. All rights reserved.

  18. Digital Image Support in the ROADNet Real-time Monitoring Platform

    NASA Astrophysics Data System (ADS)

    Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.

    2004-12-01

    The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.

  19. The Mobile Monitoring of fugitive methane emissions from natural gas consumer industries

    EPA Science Inventory

    Natural gas is used as a feedstock for major industrial processes, such as ammonia and fertilizer production. However, fugitive methane emissions from many major end-use sectors of the natural gas supply chain have not been quantified yet. This presentation introduces new tools ...

  20. Applicability of near-infrared spectroscopy in the monitoring of film coating and curing process of the prolonged release coated pellets.

    PubMed

    Korasa, Klemen; Hudovornik, Grega; Vrečer, Franc

    2016-10-10

    Although process analytical technology (PAT) guidance was introduced to the pharmaceutical industry just a decade ago, this innovative approach has already become an important part of efficient pharmaceutical development, manufacturing, and quality assurance. PAT tools are especially important in technologically complex operations which require strict control of critical process parameters and have a significant effect on final product quality. Manufacturing of prolonged-release film-coated pellets is definitely one such process. The aim of the present work was to study the applicability of an at-line near-infrared spectroscopy (NIR) approach to monitoring the pellet film coating and curing steps. Film-coated pellets were manufactured by coating active-ingredient-containing pellets with a film coating based on polymethacrylate polymers (Eudragit® RS/RL). NIR proved to be a useful tool for monitoring the curing process, since it was able to determine the extent of curing and hence predict the drug release rate using a partial least squares (PLS) model. However, this approach also showed a number of limitations, such as low reliability and high susceptibility to pellet moisture content, and was thus not able to predict drug release from pellets with high moisture content. On the other hand, at-line NIR was capable of predicting the thickness of the Eudragit® RS/RL film coating over a wide range (up to 40 μm) with good accuracy, even in pellets with high moisture content. In summary, the present study demonstrated the high applicability of at-line NIR to monitoring prolonged-release pellet production. These findings may contribute to more efficient and reliable PAT solutions in the manufacturing of prolonged-release dosage forms. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if simulations have been identified as one solution for business-performance-related decision support. Such support is especially useful when it can be generated automatically by Business Process Management (BPM) environments from existing business process models and from performance parameters monitored on executed business process instances. Currently, some available BPM environments offer basic performance prediction capabilities; however, these are normally too limited to be generally useful for performance-related decision support at the business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into existing BPM environments. The approach abstracts from process modelling techniques, which enables automatic decision support spanning processes across numerous BPM environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.

  2. Visualization of multiple influences on ocellar flight control in giant honeybees with the data-mining tool Viscovery SOMine.

    PubMed

    Kastberger, G; Kranner, G

    2000-02-01

    Viscovery SOMine is a software tool for advanced analysis and monitoring of numerical data sets. It was developed for professional use in business, industry, and science, and supports dependency analysis, deviation detection, unsupervised clustering, nonlinear regression, data association, pattern recognition, and animated monitoring. Based on the concept of self-organizing maps (SOMs), it employs a robust variant of unsupervised neural networks, namely Kohonen's Batch-SOM, further enhanced with a new scaling technique for speeding up the learning process. This tool provides a powerful means of analyzing complex data sets without prior statistical knowledge. The data representation contained in the trained SOM can be systematically converted for use in a spectrum of visualization techniques, such as evaluating dependencies between components, investigating geometric properties of the data distribution, searching for clusters, or monitoring new data. We have used this software tool to analyze and visualize multiple influences of the ocellar system on free-flight behavior in giant honeybees. Occlusion of the ocelli affects orienting reactivities in relation to the flight target, the level of disturbance, and the position of the bee in the flight chamber; it induces phototaxis and makes orienting imprecise and dependent on motivational settings. Ocelli permit the adjustment of orienting strategies to environmental demands by enforcing abilities such as centering or flight kinetics and by providing independent control of posture and flight course.
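The Batch-SOM that this tool builds on can be sketched in a few lines of NumPy. Grid size, the neighborhood schedule, and the toy two-cluster data are illustrative assumptions; the commercial tool's proprietary scaling speed-up is not reproduced here.

```python
# Hedged sketch: a minimal 1-D batch self-organizing map (Kohonen Batch-SOM).
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated 2-D clusters, to be organized on a 1-D chain of 6 units.
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                  rng.normal(3.0, 0.1, (50, 2))])

n_units = 6
grid = np.arange(n_units)                      # unit positions on the 1-D grid
weights = rng.normal(size=(n_units, 2))        # randomly initialized prototypes

for epoch in range(20):
    sigma = max(0.5, 2.0 * 0.9 ** epoch)       # shrinking neighborhood radius
    # Best-matching unit (BMU) for every data point.
    bmu = np.argmin(((data[:, None, :] - weights) ** 2).sum(-1), axis=1)
    # Batch update: each unit moves to the neighborhood-weighted mean of the data.
    h = np.exp(-((grid[bmu][:, None] - grid) ** 2) / (2.0 * sigma ** 2))
    weights = (h.T @ data) / h.sum(axis=0)[:, None]

# Units at opposite ends of the chain now represent the two clusters.
print(weights.round(2))
```

The batch update avoids the per-sample learning rate of the online SOM, which is one reason it suits monitoring pipelines that retrain on whole data sets.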

  3. The “Common Solutions” Strategy of the Experiment Support group at CERN for the LHC Experiments

    NASA Astrophysics Data System (ADS)

    Girone, M.; Andreeva, J.; Barreiro Megino, F. H.; Campana, S.; Cinquilli, M.; Di Girolamo, A.; Dimou, M.; Giordano, D.; Karavakis, E.; Kenyon, M. J.; Kokozkiewicz, L.; Lanciotti, E.; Litmaath, M.; Magini, N.; Negri, G.; Roiser, S.; Saiz, P.; Saiz Santos, M. D.; Schovancova, J.; Sciabà, A.; Spiga, D.; Trentadue, R.; Tuckett, D.; Valassi, A.; Van der Ster, D. C.; Shiers, J. D.

    2012-12-01

    After two years of LHC data taking, processing, and analysis, and with numerous changes in computing technology, a number of aspects of the experiments' computing, as well as WLCG deployment and operations, need to evolve. As part of the activities of the Experiment Support group in CERN's IT department, and reinforced by effort from the EGI-InSPIRE project, we present work aimed at common solutions across all LHC experiments. Such solutions allow us not only to optimize development manpower but also to lower long-term maintenance and support costs. The main areas cover Distributed Data Management, Data Analysis, Monitoring, and the LCG Persistency Framework. Specific tools have been developed, including the HammerCloud framework; automated services for data placement, data cleaning, and data integrity (such as the data popularity service for CMS, the common Victor cleaning agent for ATLAS and CMS, and tools for catalogue/storage consistency); the Dashboard Monitoring framework (job monitoring, data management monitoring, File Transfer monitoring); and the Site Status Board. This talk focuses primarily on the strategic aspects of providing such common solutions, how this relates to the overall goals of long-term sustainability, and the relationship to the various WLCG Technical Evolution Groups. The success of the service components has given us confidence in the process and has developed the trust of the stakeholders. We are now attempting to expand the development of common solutions into the more critical workflows. The first is a feasibility study of common analysis workflow execution elements between ATLAS and CMS. We look forward to additional common development in the future.

  4. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

    Currently, near infrared spectroscopy (NIRS) is considered an efficient tool for implementing process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection was introduced. The design of the process analytical system was described in detail, including the selection of monitored processes and the testing mode, and the potential risks to be avoided. Moreover, the development of the related technologies was also presented, which covered the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' research and working experience, several issues in the application of NIRS to process monitoring and control in TCM production were then raised, and some potential solutions were also discussed. The issues include building a technical team for the process analytical system, designing the process analytical system in the manufacture of TCM products, standardizing the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, prospects for the application of NIRS in the TCM industry were put forward. Copyright© by the Chinese Pharmaceutical Association.

  5. Development of a Dmt Monitor for Statistical Tracking of Gravitational-Wave Burst Triggers Generated from the Omega Pipeline

    NASA Astrophysics Data System (ADS)

    Li, Jun-Wei; Cao, Jun-Wei

    2010-04-01

    One challenge in large-scale scientific data analysis is to monitor data in real time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, offering good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers generated from a specific GW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information as a reference for trigger post-processing and interferometer maintenance.

  6. Runtime Performance Monitoring Tool for RTEMS System Software

    NASA Astrophysics Data System (ADS)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase and during in-orbit operation as well. The implemented target agent is lightweight and has small overhead using the SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under way.

  7. Validity evidence as a key marker of quality of technical skill assessment in OTL-HNS.

    PubMed

    Labbé, Mathilde; Young, Meredith; Nguyen, Lily H P

    2018-01-13

    Quality monitoring of assessment practices should be a priority in all residency programs. Validity evidence is one of the main hallmarks of assessment quality and should be collected to support the interpretation and use of assessment data. Our objective was to identify, synthesize, and present the validity evidence reported in support of different technical skill assessment tools in otolaryngology-head and neck surgery (OTL-HNS). We performed a secondary analysis of data generated through a systematic review of all published tools for assessing technical skills in OTL-HNS (n = 16). For each tool, we coded validity evidence according to the five types of evidence described in the American Educational Research Association's interpretation of Messick's validity framework. Descriptive statistical analyses were conducted. All 16 tools included in our analysis were supported by validity evidence on internal structure and relationships to other variables. Eleven articles presented evidence supporting content. Response process was discussed in only one article, and no study reported evidence exploring consequences. We present the validity evidence reported for 16 rater-based tools that could be used for work-based assessment of OTL-HNS residents in the operating room. The articles included in our review were consistently deficient in evidence for response process and consequences. Rater-based assessment tools that support high-stakes decisions impacting learners and programs should include several sources of validity evidence. Thus, any assessment should be used with careful consideration of the context-specific validity evidence supporting score interpretation, and we encourage deliberate, continual monitoring of assessment quality. NA. Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.

  8. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated using principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Partial least squares (PLS) regression was then employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content of these samples measured by a Kjeldahl test. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.
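The PCA step of such a monitoring scheme can be sketched as below. The spectra are synthetic; the fluorophore peak positions and the single "deviating batch" are illustrative assumptions, not data from the study.

```python
# Hedged sketch: using PCA scores to spot batch-to-batch variability in
# fluorescence spectra. All spectra here are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
wavelengths = np.linspace(300, 500, 100)
peak = np.exp(-((wavelengths - 350) ** 2) / 200)    # nominal emission peak

normal_batches = peak + rng.normal(scale=0.01, size=(20, 100))
shifted = np.exp(-((wavelengths - 370) ** 2) / 200)  # assumed process deviation
deviating_batch = shifted + rng.normal(scale=0.01, size=(1, 100))

spectra = np.vstack([normal_batches, deviating_batch])
scores = PCA(n_components=2).fit_transform(spectra)

# The deviating batch stands out on the first principal component.
print("deviating-batch |PC1| :", round(abs(scores[-1, 0]), 2))
print("max normal-batch |PC1|:", round(abs(scores[:-1, 0]).max(), 2))
```

A score plot of PC1 versus PC2 is the usual way such deviations are inspected visually before a quantitative PLS model is built.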

  9. Practice Evaluation Strategies Among Social Workers: Why an Evidence-Informed Dual-Process Theory Still Matters.

    PubMed

    Davis, Thomas D

    2017-01-01

    Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.

  10. Machine assisted histogram classification

    NASA Astrophysics Data System (ADS)

    Benyó, B.; Gaspar, C.; Somogyi, P.

    2010-04-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually, using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.
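The core idea of automated histogram monitoring can be sketched with a much simpler stand-in than the graph-based clustering and machine learning pipeline described above: flag any histogram whose shape is too far from a reference. The bin counts, the distance measure, and the threshold are illustrative assumptions.

```python
# Hedged sketch: flagging anomalous monitoring histograms by chi-square
# distance to a reference shape. Counts and threshold are illustrative.
import numpy as np

rng = np.random.default_rng(3)
reference = np.array([5, 20, 50, 20, 5], dtype=float)   # expected hit distribution
good = [rng.poisson(reference) for _ in range(10)]      # normal monitoring histograms
bad = rng.poisson([50.0, 20.0, 5.0, 20.0, 50.0])        # a shape anomaly

def chi2_distance(h, ref):
    """Symmetric chi-square distance between two normalized histograms."""
    h, ref = h / h.sum(), ref / ref.sum()
    return 0.5 * np.sum((h - ref) ** 2 / (h + ref + 1e-12))

threshold = 0.1   # assumed alert level
flags = [chi2_distance(h, reference) > threshold for h in good + [bad]]
print(flags)
```

A real system would learn the reference and threshold from labelled history rather than fix them by hand, which is where the clustering and machine learning come in.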

  11. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. 
Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
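The low-cost online quality monitoring idea above can be sketched as follows: run the approximate kernel everywhere, but recompute a small sampled fraction precisely and track the observed error. The "approximate" kernel (a subsampled mean) and the 10% sampling rate are illustrative assumptions, not the thesis's actual tools.

```python
# Hedged sketch: online quality monitoring of an approximate computation by
# sampled comparison against the precise result.
import random

def precise_mean(xs):
    return sum(xs) / len(xs)

def approximate_mean(xs):
    # Approximation: use every 4th element to save work (and, in an
    # energy-aware setting, energy).
    sub = xs[::4]
    return sum(sub) / len(sub)

random.seed(4)
errors = []
for _ in range(200):
    xs = [random.gauss(10, 1) for _ in range(64)]
    approx = approximate_mean(xs)
    if random.random() < 0.1:                 # check only ~10% of calls
        errors.append(abs(approx - precise_mean(xs)))

mean_err = sum(errors) / len(errors)
print(f"estimated mean absolute error: {mean_err:.3f}")
```

Because only a fraction of calls pay for the precise recomputation, the monitor's own cost stays below the savings of the approximation, which is the design constraint the thesis emphasizes.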

  12. Long Term Geoelectrical Monitoring of Deep-water Horizon Oil Spill in the Gulf Coast

    NASA Astrophysics Data System (ADS)

    Heenan, J. W.; Ntarlagiannis, D.; Slater, L. D.; Atekwana, E. A.; Ross, C.; Nolan, J. T.; Atekwana, E. A.

    2011-12-01

    In the aftermath of the catastrophic Deepwater Horizon (DWH) spill in the Gulf Coast, opportunities exist to study the evolution of fresh crude oil contamination in beach sediments and marshes. Grand Terre 1 Island, off the coast of Grand Isle in southern Louisiana, is an uninhabited barrier island, heavily impacted by the DWH spill and ideal for undisturbed long-term monitoring of crude oil degradation processes. A 10-channel Syscal-Pro resistivity/IP instrument (IRIS Instruments, France) is the heart of the fully autonomous geoelectrical monitoring system; the system, which is housed in a weatherproof container, relies solely on solar power, is controlled by an energy-efficient PC, and can be accessed remotely via web tools. The monitoring scheme involves collecting twice-daily resistivity measurements from surface and shallow boreholes, from January 2011 to the present; environmental parameters, such as temperature (T), are continuously recorded at several depths. During regular field trips we perform larger-scale geophysical surveys and geochemical measurements (pH, DO, T, fluid C) to support the continuous geophysical monitoring. The contaminated layer on site is a visually distinctive layer of crude oil, isolated by cleaner sands above and below, which is identified by a clear and obvious resistive anomaly in preliminary surveys. Early results show a decrease in the average resistance values of each dataset over time. Further processing of the data yields a linearly shaped resistive anomaly, which coincides with the location of the oil layer. The changes in subsurface resistivity appear to be focused within this anomaly. Filtering the data by the time of day they were collected, morning or evening, reveals a diurnal variation. While both time frames follow the same overall trend, the measurements in the morning are slightly more resistive than those in the evening.
This indicates that there are environmental factors, such as temperature, that need to be accounted for when analyzing the data for evidence of biological processes. These preliminary findings indicate changes in the subsurface properties of the contaminated area and suggest that geoelectrical methods are sensitive to contamination evolution processes. Such geophysical data, constrained by geochemical and microbiological information, have the potential to be used as a long term monitoring tool for biological and geochemical processes in the subsurface.

  13. Wireless acceleration sensor of moving elements for condition monitoring of mechanisms

    NASA Astrophysics Data System (ADS)

    Sinitsin, Vladimir V.; Shestakov, Aleksandr L.

    2017-09-01

    Comprehensive analysis of the angular and linear accelerations of moving elements (shafts, gears) allows an increase in the quality of the condition monitoring of mechanisms. However, existing tools and methods measure either linear or angular acceleration with postprocessing. This paper suggests a new construction design of an angular acceleration sensor for moving elements. The sensor is mounted on a moving element and, among other things, the data transfer and electric power supply are carried out wirelessly. In addition, the authors introduce a method for processing the received information which makes it possible to divide the measured acceleration into the angular and linear components. The design has been validated by the results of laboratory tests of an experimental model of the sensor. The study has shown that this method provides a definite separation of the measured acceleration into linear and angular components, even in noise. This research contributes an advance in the range of methods and tools for condition monitoring of mechanisms.
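One generic way to separate a tangential measurement into linear and angular components, sketched below, uses readings at two known radii: each sensor sees a_lin + r·alpha, giving two equations in two unknowns. This is an assumed illustration, not necessarily the authors' construction; the radii and the simulated motion are made up.

```python
# Hedged sketch: separating measured acceleration into linear and angular
# components from two sensors at known radii on a rotating element.
import numpy as np

r1, r2 = 0.05, 0.10            # assumed sensor radii on the shaft [m]
alpha_true = 12.0              # angular acceleration [rad/s^2]
a_lin_true = 0.8               # linear (translational) acceleration [m/s^2]

# Each sensor measures linear plus tangential (r * alpha) acceleration,
# with a little measurement noise.
rng = np.random.default_rng(5)
a1 = a_lin_true + r1 * alpha_true + rng.normal(scale=0.01)
a2 = a_lin_true + r2 * alpha_true + rng.normal(scale=0.01)

# Two equations, two unknowns: [1 r1; 1 r2] @ [a_lin, alpha] = [a1, a2].
A = np.array([[1.0, r1], [1.0, r2]])
a_lin, alpha = np.linalg.solve(A, np.array([a1, a2]))
print(f"a_lin = {a_lin:.2f} m/s^2, alpha = {alpha:.1f} rad/s^2")
```

The separation degrades as r1 approaches r2 (the system becomes ill-conditioned), which is why sensor placement matters in such designs.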

  14. Adaptive noise cancelling and time-frequency techniques for rail surface defect detection

    NASA Astrophysics Data System (ADS)

    Liang, B.; Iwnicki, S.; Ball, A.; Young, A. E.

    2015-03-01

    Adaptive noise cancelling (ANC) is a technique that is very effective at removing additive noise from contaminated signals. It has been widely used in the fields of telecommunication, radar, and sonar signal processing. However, it was seldom used for the surveillance and diagnosis of mechanical systems before the late 1990s. As a promising technique, it has gradually been exploited for condition monitoring and fault diagnosis. Time-frequency analysis is another useful tool for condition monitoring and fault diagnosis, as it retains both time and frequency information simultaneously. This paper presents an application of ANC and time-frequency analysis to railway wheel flat and rail surface defect detection. The experimental results from a scaled roller test rig show that this approach can significantly reduce unwanted interference and extract weak signals from strong background noise. The combination of ANC and time-frequency analysis may thus provide a useful tool for condition monitoring and fault diagnosis of railway vehicles.
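The classic ANC scheme uses a reference noise sensor and a least-mean-squares (LMS) filter that learns to predict, and subtract, the noise component of the primary sensor. The sketch below is a minimal LMS canceller; the signal model, noise path, filter length, and step size are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch: adaptive noise cancelling with an LMS filter. The "defect"
# signal, the noise path, and all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(6)
n = 5000
t = np.arange(n)
signal = np.sin(2 * np.pi * 0.01 * t)              # weak periodic "defect" signal
noise_ref = rng.normal(size=n)                     # reference noise sensor
# Primary sensor: signal plus a filtered version of the reference noise.
noise_in_primary = 0.9 * noise_ref + 0.4 * np.roll(noise_ref, 1)
primary = signal + noise_in_primary

taps, mu = 4, 0.01
w = np.zeros(taps)
output = np.zeros(n)
for i in range(taps, n):
    x = noise_ref[i - taps + 1:i + 1][::-1]        # most recent reference samples
    y = w @ x                                      # estimate of noise in primary
    e = primary[i] - y                             # error = cleaned signal
    w += 2 * mu * e * x                            # LMS weight update
    output[i] = e

# After convergence, the residual tracks the buried sine, not the noise.
resid_err = np.mean((output[2000:] - signal[2000:]) ** 2)
print("residual mean-square error:", round(resid_err, 4))
```

Because the desired signal is uncorrelated with the reference, the filter converges toward cancelling only the noise path, which is the key property ANC relies on.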

  15. Structural Health Monitoring with Fiber Bragg Grating and Piezo Arrays

    NASA Technical Reports Server (NTRS)

    Black, Richard J.; Faridian, Ferey; Moslehi, Behzad; Sotoudeh, Vahid

    2012-01-01

    Structural health monitoring (SHM) is one of the most important tools available for the maintenance, safety, and integrity of aerospace structural systems. Lightweight, electromagnetic-interference- immune, fiber-optic sensor-based SHM will play an increasing role in more secure air transportation systems. Manufacturers and maintenance personnel have pressing needs for significantly improving safety and reliability while providing for lower inspection and maintenance costs. Undetected or untreated damage may grow and lead to catastrophic structural failure. Damage can originate from the strain/stress history of the material, imperfections of domain boundaries in metals, delamination in multi-layer materials, or the impact of machine tools in the manufacturing process. Damage can likewise develop during service life from wear and tear, or under extraordinary circumstances such as with unusual forces, temperature cycling, or impact of flying objects. Monitoring and early detection are key to preventing a catastrophic failure of structures, especially when these are expected to perform near their limit conditions.

  16. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years. They provide useful and important information for detecting ongoing changes of the environment, but for end-users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project providing simple access, analysis, and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine, as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can use either the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Beyond simple data access, users can also run further time series analyses, such as trend calculations, breakpoint detection, or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting for the time series data integrated by the user is provided by an OGC Sensor Observation Service coupled with an OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., a precipitation value higher than x millimeters per day, the occurrence of a MODIS fire point, or the detection of a time series anomaly). 
Datasets integrated in the SOS service are updated in near real time based on the linked data providers mentioned above. An alert is automatically pushed to the user if new data meet the conditions of the registered filter expression. This monitoring service is available on the web portal with alerting by email, and within the mobile app with alerting by email and push notification.
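The filter-expression alerting described above boils down to evaluating a registered condition against each new observation. The sketch below shows the idea as a plain threshold check over a daily precipitation series; the dates, values, threshold, and function names are illustrative assumptions, not the EOM's actual API.

```python
# Hedged sketch: filter-expression alerting over a daily time series.
daily_precip_mm = {"2014-06-01": 2.5, "2014-06-02": 0.0,
                   "2014-06-03": 41.0, "2014-06-04": 12.3}

def check_alerts(series, threshold_mm):
    """Return the dates whose value exceeds the registered threshold."""
    return [day for day, value in sorted(series.items()) if value > threshold_mm]

alerts = check_alerts(daily_precip_mm, threshold_mm=30.0)
print(alerts)
```

In the EOM this check runs server-side on near-real-time updates, with matches pushed to the user by email or mobile notification rather than returned as a list.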

  17. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for the selection of informative characteristics from the signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by the characterization of different parameters defining the states of a turning process, such as chip form, tool wear, and the onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by combining characteristics from multiple sensors and by applying chaotic characteristics.
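A minimal version of the spectral-characteristic extraction described above can be sketched as follows. The simulated "chatter" tone, sampling rate, and the two chosen characteristics are illustrative assumptions, not the paper's actual feature set.

```python
# Hedged sketch: extracting simple spectral characteristics from a sensor
# signal to characterize a process state (e.g., the onset of chatter).
import numpy as np

rng = np.random.default_rng(7)
fs = 1000.0                                   # assumed sampling rate [Hz]
t = np.arange(0, 1, 1 / fs)
# Simulated sensor signal: broadband noise plus an emerging chatter tone.
signal = rng.normal(scale=0.5, size=t.size) + 1.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Two simple characteristics: dominant frequency and its share of total power.
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
share = spectrum.max() / spectrum.sum()
print(f"dominant frequency: {dominant:.0f} Hz, power share: {share:.2f}")
```

Characteristics like these would then feed the adaptive empirical modeler, alongside nonlinear (chaotic) measures such as correlation dimension when they add discriminative power.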

  18. Essentials of LIDAR multiangle data processing methodology for smoke polluted atmospheres

    Treesearch

    V. A. Kovalev; A. Petkov; C. Wold; S. Urbanski; W. M. Hao

    2009-01-01

    Mobile scanning lidar is the most appropriate tool for monitoring wildfire smoke-plume dynamics and optical properties. Lidar is the only remote sensing instrument capable of obtaining detailed three-dimensional range-resolved information for smoke distributions and optical properties over ranges of 10+ km at different wavelengths simultaneously.

  19. Evaluating a Technology Supported Interactive Response System during the Laboratory Section of a Histology Course

    ERIC Educational Resources Information Center

    Rinaldi, Vera D.; Lorr, Nancy A.; Williams, Kimberly

    2017-01-01

    Monitoring of student learning through systematic formative assessment is important for adjusting pedagogical strategies. However, traditional formative assessments, such as quizzes and written assignments, may not be sufficiently timely for making adjustments to a learning process. Technology supported formative assessment tools assess student…

  20. A Management Tool for Reallocating College Resources

    ERIC Educational Resources Information Center

    Dellow, Donald; Losinger, Regina

    2004-01-01

    Community college administrators frequently need to deal with program enrollment shifts resulting from economic and demographic shifts. Reallocating resources between areas of the college to adjust to those enrollment shifts can become a difficult process if there isn't a careful monitoring of enrollment patterns and program costs. The authors…

  1. DynAMo: A Modular Platform for Monitoring Process, Outcome, and Algorithm-Based Treatment Planning in Psychotherapy.

    PubMed

    Kaiser, Tim; Laireiter, Anton Rupert

    2017-07-20

    In recent years, the assessment of mental disorders has become more and more personalized. Modern advancements such as Internet-enabled mobile phones and increased computing capacity make it possible to tap sources of information that have long been unavailable to mental health practitioners. Software packages that combine algorithm-based treatment planning, process monitoring, and outcome monitoring are scarce. The objective of this study was to assess whether the DynAMo Web application can fill this gap by providing a software solution that can be used both by researchers, to conduct state-of-the-art psychotherapy process research, and by clinicians, to plan treatments and monitor psychotherapeutic processes. In this paper, we report on the current state of a Web application that can be used for assessing the temporal structure of mental disorders using information on their temporal and synchronous associations. A treatment planning algorithm automatically interprets the data and delivers priority scores of symptoms to practitioners. The application is also capable of monitoring psychotherapeutic processes during therapy and of monitoring treatment outcomes. The application was developed using the R programming language (R Core Team, Vienna) and the Shiny Web application framework (RStudio, Inc, Boston). It is made entirely from open-source software packages and is thus easily extensible. The capabilities of the proposed application are demonstrated, and case illustrations are provided to exemplify its usefulness in clinical practice. With the broad availability of Internet-enabled mobile phones and similar devices, collecting data on psychopathology and psychotherapeutic processes has become easier than ever. The proposed application is a valuable tool for capturing, processing, and visualizing these data. The combination of dynamic assessment and process and outcome monitoring has the potential to improve the efficacy and effectiveness of psychotherapy. 
©Tim Kaiser, Anton Rupert Laireiter. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 20.07.2017.

  2. Alarms Philosophy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Karen S; Kasemir, Kay

    2009-01-01

    An effective alarm system consists of a mechanism to monitor control points and generate alarm notifications, tools for operators to view, hear, acknowledge, and handle alarms, and a good configuration. Despite the availability of numerous fully featured tools, accelerator alarm systems continue to be disappointing to operations, frequently to the point of alarms being permanently silenced or totally ignored. This is often due to configurations that produce an excessive number of alarms or fail to communicate the required operator response. Most accelerator controls systems do a good job of monitoring specified points and generating notifications when parameters exceed predefined limits. In some cases, improved tools can help, but more often, poor configuration is the root cause of ineffective alarm systems. At SNS, we have invested considerable effort in generating appropriate configurations using a rigorous set of rules based on best practices in the industrial process controls community. This paper will discuss our alarm configuration philosophy and operator response to our new system.

  3. Strategy proposed by Electricite de France in the development of automatic tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castaing, C.; Cazin, B.

    1995-03-01

    The strategy proposed by EDF for the development of means to limit individual and collective dosimetry is recent. It follows in the steps of a policy that consisted of developing remote operation means for inspection and maintenance activities on the reactor, pool bottoms, steam generators (SGs), and reactor building valves; these are target activities because of their high dosimetric cost. One of the main duties of the UTO (Technical Support Department) within EDF is the maintenance of Pressurized Water Reactors in French Nuclear Power Plant operations (consisting of 54 units) and the development and monitoring of specialized tools. To achieve this, the UTO has started a national think-tank on the implementation of the ALARA process in its field of activity and created an ALARA Committee responsible for running and monitoring it, as well as a policy for developing tools. This point will be illustrated in the second part, on reactor vessel heads.

  4. Improving the All-Hazards Homeland Security Enterprise Through the Use of an Emergency Management Intelligence Model

    DTIC Science & Technology

    2013-09-01

    Office of the Inspector General; OSINT Open Source Intelligence; PPD Presidential Policy Directive; SIGINT Signals Intelligence; SLFC State/Local Fusion...Geospatial Intelligence (GEOINT) from Geographic Information Systems (GIS), and Open Source Intelligence (OSINT) from Social Media. GIS is widely...and monitor make it a feasible tool to capitalize on for OSINT. A formalized EM intelligence process would help expedite the processing of such

  5. Launching GUPPI: the Green Bank Ultimate Pulsar Processing Instrument

    NASA Astrophysics Data System (ADS)

    DuPlain, Ron; Ransom, Scott; Demorest, Paul; Brandt, Patrick; Ford, John; Shelton, Amy L.

    2008-08-01

    The National Radio Astronomy Observatory (NRAO) is launching the Green Bank Ultimate Pulsar Processing Instrument (GUPPI), a prototype flexible digital signal processor designed for pulsar observations with the Robert C. Byrd Green Bank Telescope (GBT). GUPPI uses field programmable gate array (FPGA) hardware and design tools developed by the Center for Astronomy Signal Processing and Electronics Research (CASPER) at the University of California, Berkeley. The NRAO has been concurrently developing GUPPI software and hardware using minimal software resources. The software handles instrument monitor and control, data acquisition, and hardware interfacing. GUPPI is currently an expert-only spectrometer, but supports future integration with the full GBT production system. The NRAO was able to take advantage of the unique flexibility of the CASPER FPGA hardware platform, develop hardware and software in parallel, and build a suite of software tools for monitoring, controlling, and acquiring data with a new instrument over a short timeline of just a few months. The NRAO interacts regularly with CASPER and its users, and GUPPI stands as an example of what reconfigurable computing and open-source development can do for radio astronomy. GUPPI is modular for portability, and the NRAO provides the results of development as an open-source resource.

  6. A New Paradigm for Tissue Diagnostics: Tools and Techniques to Standardize Tissue Collection, Transport, and Fixation.

    PubMed

    Bauer, Daniel R; Otter, Michael; Chafin, David R

    2018-01-01

    Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.

  7. An OSEE Based Portable Surface Contamination Monitor

    NASA Technical Reports Server (NTRS)

    Perey, Daniel F.

    1997-01-01

    Many industrial and aerospace processes involving the joining of materials require sufficient surface cleanliness to ensure proper bonding. Processes as diverse as painting, welding, or the soldering of electronic circuits will be compromised if prior inspection and removal of surface contaminants is inadequate. As process requirements become more stringent and the number of different materials and identified contaminants increases, various instruments and techniques have been developed for improved inspection. One such technique, based on the principle of Optically Stimulated Electron Emission (OSEE), has been explored for a number of years as a tool for surface contamination monitoring. Some of the benefits of OSEE are that it is non-contacting, requires little operator training, and has very high contamination sensitivity. This paper describes the development of a portable OSEE-based surface contamination monitor. The instrument is suitable for both hand-held and robotic inspections with either manual or automated control of instrument operation. In addition, instrument output data is visually displayed to the operator and may be output to an external computer for archiving or analysis.

  8. An Overview of the Runtime Verification Tool Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
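    The automata-based checking described above can be illustrated with a toy monitor. This is a hedged sketch of the general idea only: JPAX's actual specifications are written in Maude's temporal logic, and the event names and the response-style property below are invented for illustration.

```python
def monitor_response(events, trigger="request", response="grant"):
    """Toy trace monitor for a response property: every `trigger` event
    must eventually be followed by a `response` event before the trace ends.
    Consumes the event stream emitted by the instrumented program."""
    pending = 0
    for e in events:
        if e == trigger:
            pending += 1
        elif e == response and pending > 0:
            pending -= 1
    return pending == 0  # True iff the property held on this finite trace

print(monitor_response(["request", "work", "grant"]))     # True
print(monitor_response(["request", "grant", "request"]))  # False
```

    A real monitor of this kind runs as an event-driven process fed by the instrumented bytecode's execution trace, as the abstract describes.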

  9. Resilient Monitoring Systems: Architecture, Design, and Application to Boiler/Turbine Plant

    DOE PAGES

    Garcia, Humberto E.; Lin, Wen-Chiao; Meerkov, Semyon M.; ...

    2014-11-01

    Resilient monitoring systems, considered in this paper, are sensor networks that degrade gracefully under malicious attacks on their sensors, causing them to project misleading information. The goal of this work is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools. The performance of the overall system is evaluated using a simplified boiler/turbine plant. The measure of resiliency is quantified using Kullback-Leibler divergence, and is shown to be sufficiently high in all scenarios considered.

  10. Resilient monitoring systems: architecture, design, and application to boiler/turbine plant.

    PubMed

    Garcia, Humberto E; Lin, Wen-Chiao; Meerkov, Semyon M; Ravichandran, Maruthi T

    2014-11-01

    Resilient monitoring systems, considered in this paper, are sensor networks that degrade gracefully under malicious attacks on their sensors, causing them to project misleading information. The goal of this paper is to design, analyze, and evaluate the performance of a resilient monitoring system intended to monitor plant conditions (normal or anomalous). The architecture developed consists of four layers: data quality assessment, process variable assessment, plant condition assessment, and sensor network adaptation. Each of these layers is analyzed by either analytical or numerical tools. The performance of the overall system is evaluated using a simplified boiler/turbine plant. The measure of resiliency is quantified based on the Kullback-Leibler divergence and shown to be sufficiently high in all scenarios considered.
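    The Kullback-Leibler divergence used above as the resiliency measure compares two discrete distributions over plant conditions. A minimal sketch follows; the two example distributions (true condition vs. the assessment produced under a sensor attack) are invented for illustration.

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in nats for discrete distributions on the same support.
    Terms with p_i = 0 contribute zero by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

true_condition = [0.9, 0.1]   # P(normal), P(anomalous) -- illustrative
assessed = [0.8, 0.2]         # assessment under a hypothetical attack
print(kl_divergence(true_condition, assessed))
```

    A small divergence between the true and assessed condition distributions corresponds to high resiliency in the sense used by the paper.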

  11. Image edge detection based tool condition monitoring with morphological component analysis.

    PubMed

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in monitored images. Image edge detection is a fundamental tool for obtaining image features. The approach extracts the tool edge with morphological component analysis. Through the decomposition of the original tool wear image, the approach reduces the influence of texture and noise on edge measurement. Based on sparse representation of the target image and edge detection, the approach can accurately extract the tool wear edge with a continuous and complete contour, and is convenient for characterizing tool conditions. Compared to well-established algorithms in the literature, this approach improves the integrity and connectivity of edges, and the results have shown that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  12. The construction of control chart for PM10 functional data

    NASA Astrophysics Data System (ADS)

    Shaadan, Norshahida; Jemain, Abdul Aziz; Deni, Sayang Mohd

    2014-06-01

    In this paper, a statistical procedure to construct a control chart for monitoring air quality (PM10) using functional data is proposed. A set of daily indices that represent the daily PM10 curves was obtained using Functional Principal Component Analysis (FPCA). By means of an iterative charting procedure, a reference data set that represented a stable PM10 process was obtained. These data were then used as a reference for monitoring future data. The procedure was applied to a seven-year (2004-2010) period of recorded data from the Klang air quality monitoring station located in the Klang Valley region of Peninsular Malaysia. The study showed that the control chart provided a useful visualization tool for monitoring air quality and was capable of detecting abnormalities in the process system. As in the case of the Klang station, the results showed that, with reference to 2004-2008, the air quality (PM10) in 2010 was better than that in 2009.
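    The charting step above can be sketched in a simplified form: a scalar index per day (here a stand-in for an FPCA score), control limits estimated from the stable reference period, and flagging of future days that fall outside the limits. All numbers and the three-sigma limit choice are illustrative assumptions, not the paper's data.

```python
import statistics

def control_limits(reference_scores, k=3.0):
    """Mean +/- k standard deviations of the stable reference indices."""
    m = statistics.fmean(reference_scores)
    s = statistics.stdev(reference_scores)
    return m - k * s, m + k * s

def out_of_control(scores, limits):
    """Indices of monitored days falling outside the control limits."""
    low, high = limits
    return [i for i, x in enumerate(scores) if not (low <= x <= high)]

reference = [50, 52, 48, 51, 49, 50, 53, 47]    # stable-period daily indices
limits = control_limits(reference)
print(out_of_control([49, 51, 75, 50], limits))  # day 2 flagged
```

    The iterative part of the paper's procedure repeats this: out-of-control reference points are removed and the limits re-estimated until a stable reference set remains.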

  13. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  14. Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques

    NASA Technical Reports Server (NTRS)

    Calanche, Bruno J.

    1994-01-01

    The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL flight projects. In particular, SPAT Telecommunications' role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the ground-generated CRC flags on received spacecraft telemetry minor frames and the NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 x 10^-8. This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
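    The per-segment BER determination described above can be sketched crudely: treating each error-flagged frame as containing at least one bit error gives an upper-bound style estimate per link segment. The frame size and counts below are illustrative assumptions, not TOPEX/Poseidon values.

```python
def estimate_ber(error_frames, total_frames, bits_per_frame):
    """Crude BER estimate for one link segment:
    flagged frames / total bits transferred over that segment."""
    return error_frames / (total_frames * bits_per_frame)

# e.g. 3 CRC-flagged minor frames out of one million received,
# at a hypothetical 8960 bits per minor frame
print(estimate_ber(3, 1_000_000, 8960))
```

    Computing this separately from the spacecraft-side CRC flags and the NASCOM block poly error flags is what lets the separate link segments be characterized and faults isolated to one of them.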

  15. Application of GNSS Methods for Monitoring Offshore Platform Deformation

    NASA Astrophysics Data System (ADS)

    Myint, Khin Cho; Nasir Matori, Abd; Gohari, Adel

    2018-03-01

    Global Navigation Satellite System (GNSS) has become a powerful tool for high-precision deformation monitoring applications, including monitoring of deformation and subsidence of offshore platforms due to factors such as shallow gas phenomena. GNSS provides technical interoperability and compatibility between various satellite navigation systems, such as the modernized GPS, Galileo, and the reconstructed GLONASS, for use by civilian users. It is known that excessive deformation affects a platform structurally, causing loss of production and reducing the efficiency of the machinery on board the platform. GNSS has been proven to be one of the most precise positioning methods, whereby users can achieve centimeter-level accuracy for a given position from carrier phase measurement processing of GPS signals. This research is aimed at using the GNSS technique, one of the most standard methods, to monitor the deformation of offshore platforms. Therefore, station modeling, which accounts for spatially correlated errors and hence speeds up the ambiguity resolution process, is employed. It was found that GNSS combines high accuracy in monitoring offshore platform deformation with the practicality of survey operations.

  16. Implementation of an Electronic Data Collection Tool to Monitor Nursing-Sensitive Indicators in a Large Academic Health Sciences Centre.

    PubMed

    Backman, Chantal; Vanderloo, Saskia; Momtahan, Kathy; d'Entremont, Barb; Freeman, Lisa; Kachuik, Lynn; Rossy, Dianne; Mille, Toba; Mojaverian, Naghmeh; Lemire-Rodger, Ginette; Forster, Alan

    2015-09-01

    Monitoring the quality of nursing care is essential to identify patients at risk, measure adherence to hospital policies and evaluate the effectiveness of best practice interventions. However, monitoring nursing-sensitive indicators (NSI) is a challenge. Prevalence surveys are one method used by some organizations to monitor NSI, which are patient outcomes that are directly affected by the quantity or quality of nursing care that the patient receives. The aim of this paper is to describe the development of an innovative electronic data collection tool to monitor NSI. In the preliminary development work, we designed a mobile computing application with pre-populated patient census information to collect the nursing quality data. In subsequent phases, we refined this process by designing an electronic trigger using The Ottawa Hospital's Patient Safety Learning System, which automatically generated a case report form for each inpatient based on the hospital's daily patient census on the day of the prevalence survey. Both of these electronic data collection tools were accessible on tablet computers, which substantially reduced data collection, analysis and reporting time compared to previous paper-based methods. The electronic trigger provided improved completeness of the data. This work leveraged the use of tablet computers combined with a web-based application for patient data collection at point of care. Overall, the electronic methods improved data completeness and timeliness compared to traditional paper-based methods. This initiative has resulted in the ability to collect and report on NSI organization-wide to advance decision-making support and identify quality improvement opportunities within the organization. Copyright © 2015 Longwoods Publishing.

  17. Batch process fault detection and identification based on discriminant global preserving kernel slow feature analysis.

    PubMed

    Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping

    2018-05-16

    As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on the modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), by closely integrating discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve the global and local geometrical structure information of observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variable after a fault is detected, which is derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
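    The distance-based monitoring statistic described above can be illustrated in a stripped-down form. This sketch replaces the DGKSFA feature extraction with identity features for brevity; the feature vectors and control limit are invented, so it shows only the fault-detection decision rule, not the paper's method.

```python
import math

def monitoring_statistic(x, normal_mean):
    """Distance between a test sample's feature vector and the mean
    feature vector of normal operating data."""
    return math.dist(x, normal_mean)

normal_mean = [0.0, 0.0, 0.0]   # mean features of normal data (illustrative)
control_limit = 1.5             # e.g. set from normal-data quantiles
test_sample = [0.2, 2.1, -0.3]
stat = monitoring_statistic(test_sample, normal_mean)
print(stat > control_limit)     # fault declared when the limit is exceeded
```

    In the paper the distance is computed between kernel feature vectors produced by DGKSFA, and the contribution plot is then used to trace which process variable drove the excursion.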

  18. GEOGLAM Crop Assessment Tool: Adapting from global agricultural monitoring to food security monitoring

    NASA Astrophysics Data System (ADS)

    Humber, M. L.; Becker-Reshef, I.; Nordling, J.; Barker, B.; McGaughey, K.

    2014-12-01

    The GEOGLAM Crop Monitor's Crop Assessment Tool was released in August 2013 in support of the GEOGLAM Crop Monitor's objective to develop transparent, timely crop condition assessments in primary agricultural production areas, highlighting potential hotspots of stress/bumper crops. The Crop Assessment Tool allows users to view satellite derived products, best available crop masks, and crop calendars (created in collaboration with GEOGLAM Crop Monitor partners), then in turn submit crop assessment entries detailing the crop's condition, drivers, impacts, trends, and other information. Although the Crop Assessment Tool was originally intended to collect data on major crop production at the global scale, the types of data collected are also relevant to the food security and rangelands monitoring communities. In line with the GEOGLAM Countries at Risk philosophy of "foster[ing] the coordination of product delivery and capacity building efforts for national and regional organizations, and the development of harmonized methods and tools", a modified version of the Crop Assessment Tool is being developed for the USAID Famine Early Warning Systems Network (FEWS NET). As a member of the Countries at Risk component of GEOGLAM, FEWS NET provides agricultural monitoring, timely food security assessments, and early warnings of potential significant food shortages focusing specifically on countries at risk of food security emergencies. While the FEWS NET adaptation of the Crop Assessment Tool focuses on crop production in the context of food security rather than large scale production, the data collected is nearly identical to the data collected by the Crop Monitor. If combined, the countries monitored by FEWS NET and GEOGLAM Crop Monitor would encompass over 90 countries representing the most important regions for crop production and food security.

  19. Demonstrating Functional Equivalence of Pilot and Production Scale Freeze-Drying of BCG

    PubMed Central

    ten Have, R.; Reubsaet, K.; van Herpen, P.; Kersten, G.; Amorij, J.-P.

    2016-01-01

    Process analytical technology (PAT) tools were used to monitor freeze-drying of Bacille Calmette-Guérin (BCG) at pilot and production scale. Among the evaluated PAT tools is the novel use of the vacuum valve open/close frequency for determining the endpoint of primary drying at production scale. The duration of primary drying, the BCG survival rate, and the residual moisture content (RMC) were evaluated using two different freeze-drying protocols and were found to be independent of the freeze-dryer scale, evidencing functional equivalence. The absence of an effect of the freeze-dryer scale on the process underlines the feasibility of the pilot-scale freeze-dryer for further BCG freeze-drying process optimization, which may be carried out using a medium without BCG. PMID:26981867

  20. Demonstrating Functional Equivalence of Pilot and Production Scale Freeze-Drying of BCG.

    PubMed

    Ten Have, R; Reubsaet, K; van Herpen, P; Kersten, G; Amorij, J-P

    2016-01-01

    Process analytical technology (PAT) tools were used to monitor freeze-drying of Bacille Calmette-Guérin (BCG) at pilot and production scale. Among the evaluated PAT tools is the novel use of the vacuum valve open/close frequency for determining the endpoint of primary drying at production scale. The duration of primary drying, the BCG survival rate, and the residual moisture content (RMC) were evaluated using two different freeze-drying protocols and were found to be independent of the freeze-dryer scale, evidencing functional equivalence. The absence of an effect of the freeze-dryer scale on the process underlines the feasibility of the pilot-scale freeze-dryer for further BCG freeze-drying process optimization, which may be carried out using a medium without BCG.
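    The endpoint determination from valve open/close frequency can be sketched as a simple rule: as sublimation tails off, the vapor load drops and the valve cycles less often, so the endpoint is called when the frequency stays below a threshold for several consecutive readings. The threshold, hold count, and frequency series below are illustrative assumptions, not values from the study.

```python
def primary_drying_endpoint(valve_freq_per_hour, threshold, hold=3):
    """Return the index at which the endpoint is declared: the first
    reading completing `hold` consecutive readings below `threshold`.
    Returns None if the endpoint is never reached."""
    run = 0
    for i, f in enumerate(valve_freq_per_hour):
        run = run + 1 if f < threshold else 0
        if run >= hold:
            return i
    return None

freqs = [12, 11, 12, 9, 5, 2, 1, 1, 1]   # hourly valve open/close counts
print(primary_drying_endpoint(freqs, threshold=3))
```

    Requiring a sustained run of low readings, rather than a single one, guards against declaring the endpoint on a transient dip.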

  1. CIM at GE's factory of the future

    NASA Astrophysics Data System (ADS)

    Waldman, H.

    Functional features of a highly automated aircraft component batch processing factory are described. The system has processing, working, and methodology components. A rotating parts operation installed 20 years ago features a high density of numerically controlled machines, and is connected to a hierarchical network of data communications and apparatus for moving the rotating parts and tools of engines. Designs produced at one location in the country are sent by telephone link to other sites for development of manufacturing plans, tooling, numerical control programs, and process instructions for the rotating parts. Direct numerical control is implemented at the work stations, which have instructions stored on tape for back-up in case the host computer goes down. Each machine is automatically monitored at 48 points, and notice of failure can originate from any point in the system.

  2. Image Navigation and Registration Performance Assessment Evaluation Tools for GOES-R ABI and GLM

    NASA Technical Reports Server (NTRS)

    Houchin, Scott; Porter, Brian; Graybill, Justin; Slingerland, Philip

    2017-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. This paper describes the software design and implementation of IPATS and provides preliminary test results.

  3. Bacteriophage removal efficiency as a validation and operational monitoring tool for virus reduction in wastewater reclamation: Review.

    PubMed

    Amarasiri, Mohan; Kitajima, Masaaki; Nguyen, Thanh H; Okabe, Satoshi; Sano, Daisuke

    2017-09-15

    The multiple-barrier concept is widely employed in international and domestic guidelines for wastewater reclamation and reuse for microbiological risk management, in which a wastewater reclamation system is designed to achieve guideline values of the performance target of microbe reduction. Enteric viruses are one of the pathogens for which target reduction values are stipulated in guidelines, but frequent monitoring to validate human virus removal efficacy is challenging in daily operation due to the cumbersome procedures for virus quantification in wastewater. Bacteriophages have been the first-choice surrogate for this task, because of the well-characterized nature of the strains and the presence of established protocols for quantification. Here, we performed a meta-analysis to calculate the average log10 reduction values (LRVs) of somatic coliphages, F-specific phages, MS2 coliphage and T4 phage by membrane bioreactor, activated sludge, constructed wetlands, pond systems, microfiltration and ultrafiltration. The calculated LRVs of bacteriophages were then compared with reported human enteric virus LRVs. MS2 coliphage LRVs in MBR processes were shown to be lower than those of norovirus GII and enterovirus, suggesting it as a possible validation and operational monitoring tool. The other bacteriophages provided higher LRVs compared to human viruses. The data sets on LRVs of human viruses and bacteriophages are scarce except for MBR and conventional activated sludge processes, which highlights the necessity of investigating LRVs of human viruses and bacteriophages in multiple treatment unit processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
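    The log10 reduction value (LRV) used throughout the review above is simply the base-10 logarithm of the ratio of influent to effluent concentration. A one-line sketch; the concentrations below are illustrative plaque-forming unit (PFU) counts, not data from the meta-analysis.

```python
import math

def lrv(c_in, c_out):
    """Log10 reduction value across a treatment unit process."""
    return math.log10(c_in / c_out)

# e.g. 1e6 PFU/mL entering, 1e3 PFU/mL leaving -> 3-log reduction
print(lrv(1e6, 1e3))
```

    A guideline requiring, say, a 4-log virus reduction across the whole train is then checked by summing the LRVs of the individual barriers.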

  4. Real-Time Fault Classification for Plasma Processes

    PubMed Central

    Yang, Ryan; Chen, Rongshun

    2011-01-01

    Plasma process tools, which usually cost several million US dollars, are often used in the semiconductor fabrication etching process. If the plasma process is halted due to some process fault, productivity is reduced and cost increases. In order to maximize the product/wafer yield and tool productivity, timely and effective fault detection is required in a plasma reactor. The classification of fault events can help users quickly identify fault processes, and thus can save downtime of the plasma tool. In this work, optical emission spectroscopy (OES) is employed as the metrology sensor for in-situ process monitoring. Split into twelve different match rates by spectrum band, the matching rate indicator from our previous work (Yang, R.; Chen, R.S. Sensors 2010, 10, 5703–5723) is used to detect the fault process. Based on the match data, real-time classification of plasma faults is achieved by a novel method developed in this study. Experiments were conducted to validate the novel fault classification. From the experimental results, we may conclude that the proposed method is feasible, inasmuch as the overall classification accuracy for fault event shifts is 27 out of 28, or about 96.4%. PMID:22164001
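    One way to picture classification from band-wise match rates is nearest-signature matching: assign a fault event to the known fault class whose stored match-rate signature is closest. This is a hedged sketch of the general idea only, not the paper's novel method; the band count, class names, signatures, and rates are all invented.

```python
def classify_fault(match_rates, signatures):
    """Return the fault class whose signature minimizes the summed
    absolute difference from the observed band-wise match rates."""
    def distance(sig):
        return sum(abs(a - b) for a, b in zip(match_rates, sig))
    return min(signatures, key=lambda name: distance(signatures[name]))

# Hypothetical stored signatures over three spectrum bands (the paper
# uses twelve) for two hypothetical fault classes.
signatures = {"gas_flow_shift": [0.9, 0.4, 0.8],
              "rf_power_shift": [0.5, 0.9, 0.6]}
print(classify_fault([0.88, 0.45, 0.75], signatures))
```

    The observed rates sit closest to the first signature, so that class is reported, pointing the operator at the likely fault source without waiting for offline analysis.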

  5. Evaluation of Earthquake Detection Performance in Terms of Quality and Speed in SEISCOMP3 Using New Modules Qceval, Npeval and Sceval

    NASA Astrophysics Data System (ADS)

    Roessler, D.; Weber, B.; Ellguth, E.; Spazier, J.

    2017-12-01

    The geometry of seismic monitoring networks, site conditions and data availability, as well as monitoring targets and strategies, typically impose trade-offs between data quality, earthquake detection sensitivity, false detections and alert times. Network detection capabilities typically change as the seismic noise level is altered by human activity or by varying weather and sea conditions. To give helpful information to operators and maintenance coordinators, gempa developed a range of tools to evaluate earthquake detection and network performance, including qceval, npeval and sceval. qceval is a module which analyzes waveform quality parameters in real-time and deactivates and reactivates data streams for automatic processing based on waveform quality thresholds. For example, thresholds can be defined for latency, delay, timing quality, spike and gap counts, and rms. As changes in the automatic processing have a direct influence on detection quality and speed, another tool called "npeval" was designed to calculate in real-time the expected time needed to detect and locate earthquakes by evaluating the effective network geometry. The effective network geometry is derived from the configuration of stations participating in the detection. The detection times are shown as an additional layer on the map and updated in real-time as soon as the effective network geometry changes. Yet another new tool, "sceval", is an automatic module which classifies located seismic events (Origins) in real-time. sceval evaluates the spatial distribution of the stations contributing to an Origin. It confirms or rejects the status of Origins, adds comments or leaves the Origin unclassified. The comments are passed to an additional sceval plug-in where the end user can customize event types. This unique identification of real and fake events in earthquake catalogues allows network detection thresholds to be lowered. In real-time monitoring situations, operators can limit the processing to events with unclassified Origins, reducing their workload. Classified Origins can be treated specifically by other procedures. These modules have been calibrated and fully tested on several complex seismic monitoring networks in the region of Indonesia and Northern Chile.
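    The qceval-style gating described above can be sketched as follows: streams whose waveform quality parameters violate configured thresholds are excluded from automatic processing until they recover. The station IDs, parameter names, and thresholds below are illustrative assumptions, not qceval's actual configuration schema.

```python
def active_streams(quality, thresholds):
    """Return IDs of streams whose every quality metric is within its
    configured threshold; the rest are deactivated for automatic processing.

    quality: {stream_id: {metric: value}}; thresholds: {metric: limit}.
    """
    ok = []
    for stream, metrics in quality.items():
        if all(metrics[k] <= limit for k, limit in thresholds.items()):
            ok.append(stream)
    return ok

thresholds = {"latency_s": 5.0, "gaps_per_hour": 10}
quality = {"STA1": {"latency_s": 1.2, "gaps_per_hour": 0},
           "STA2": {"latency_s": 42.0, "gaps_per_hour": 3}}
print(active_streams(quality, thresholds))   # ['STA1']
```

    Because gating changes which stations participate, the effective network geometry changes with it, which is exactly the quantity npeval re-evaluates to estimate detection times.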

  6. Near Real Time Analytics of Human Sensor Networks in the Realm of Big Data

    NASA Astrophysics Data System (ADS)

    Aulov, O.; Halem, M.

    2012-12-01

    With the prolific development of social media, emergency responders have an increasing interest in harvesting social media from outlets such as Flickr, Twitter, and Facebook, in order to assess the scale and specifics of extreme events including wild fires, earthquakes, terrorist attacks, oil spills, etc. A number of experimental platforms have successfully been implemented to demonstrate the utilization of social media data in extreme events, including Twitter Earthquake Detector, which relied on tweets for earthquake monitoring; AirTwitter, which used tweets for air quality reporting; and our previous work, using Flickr data as boundary value forcings to improve the forecast of oil beaching in the aftermath of the Deepwater Horizon oil spill. The majority of these platforms addressed a narrow, specific type of emergency and harvested data from a particular outlet. We demonstrate an interactive framework for monitoring, mining and analyzing a plethora of heterogeneous social media sources for a diverse range of extreme events. Our framework consists of three major parts: a real time social media aggregator, a data processing and analysis engine, and a web-based visualization and reporting tool. The aggregator gathers tweets, Facebook comments from fan pages, Google+ posts, forum discussions, blog posts (such as LiveJournal and Blogger.com), images from photo-sharing platforms (such as Flickr, Picasa), videos from video-sharing platforms (youtube, Vimeo), and so forth. The data processing and analysis engine pre-processes the aggregated information and annotates it with geolocation and sentiment information. In many cases, the metadata of the social media posts does not contain geolocation information—-however, a human reader can easily guess from the body of the text what location is discussed. We are automating this task by use of Named Entity Recognition (NER) algorithms and a gazetteer service. 
The visualization and reporting tool provides a web-based, user-friendly interface with time-series analysis and plotting tools, geospatial visualization tools with interactive maps, and cause-effect inference tools. We demonstrate how we address the big data challenges of monitoring, aggregating, and analyzing vast amounts of social media data in near real time. As a result, our framework not only allows emergency responders to augment their situational awareness with social media information, but also allows them to extract geophysical data and incorporate it into their analysis models.
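
    The NER-plus-gazetteer geolocation step described above can be illustrated with a toy sketch. Everything here is an illustrative assumption rather than the authors' implementation: a capitalized-token scan stands in for a trained NER model, and the gazetteer is a hard-coded dictionary instead of a gazetteer service.

```python
# Toy sketch of gazetteer-based geolocation of a social media post.
# The "NER" step is a simple capitalized-token scan and the gazetteer
# is a hard-coded dict; a production system would use a trained NER
# model plus a gazetteer service.

GAZETTEER = {
    "New Orleans": (29.95, -90.07),
    "Louisiana": (31.0, -92.0),
}

def extract_candidate_names(text):
    """Return maximal runs of capitalized words as place-name candidates."""
    candidates, run = [], []
    for w in text.split():
        token = w.strip(".,!?")
        if token[:1].isupper():
            run.append(token)
        else:
            if run:
                candidates.append(" ".join(run))
            run = []
    if run:
        candidates.append(" ".join(run))
    return candidates

def geolocate(text):
    """Return (name, (lat, lon)) for the first gazetteer hit, else None."""
    for name in extract_candidate_names(text):
        parts = name.split()
        # try the full run and trailing sub-spans, longest first
        for i in range(len(parts)):
            key = " ".join(parts[i:])
            if key in GAZETTEER:
                return key, GAZETTEER[key]
    return None

print(geolocate("Oil washing up near New Orleans this morning"))
```

    Posts with no resolvable place name simply return no location and would fall back on whatever metadata is available.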

  7. Deep Learning and Image Processing for Automated Crack Detection and Defect Measurement in Underground Structures

    NASA Astrophysics Data System (ADS)

    Panella, F.; Boehm, J.; Loo, Y.; Kaushik, A.; Gonzalez, D.

    2018-05-01

This work combines deep learning (DL) and image processing to produce an automated crack recognition and defect measurement tool for civil structures. The authors focus on tunnel structures and surveys, and have developed an end-to-end tool for asset management of underground structures. In order to maintain the serviceability of tunnels, regular inspection is needed to assess their structural status. The traditional survey method is visual inspection: simple, but slow and relatively expensive, and the quality of the output depends on the ability and experience of the engineer as well as on the total workload (stress and tiredness may influence the ability to observe and record information). As a result of these issues, over the last decade there has been a push to automate monitoring using new inspection methods. The present paper has the goal of combining DL with traditional image processing to create a tool able to detect, locate, and measure structural defects.
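
    As a rough illustration of the traditional image-processing half of such a pipeline (the DL stage that first flags crack-containing patches is omitted), dark crack pixels in a grayscale patch can be isolated by thresholding and each defect measured by connected-component area. The patch values, threshold, and 4-connectivity choice below are illustrative assumptions, not the paper's method.

```python
# Threshold a grayscale patch (cracks are dark), group crack pixels
# into connected components via flood fill, and report each defect's
# pixel area.

def detect_defects(img, threshold=80):
    """img: 2D list of grayscale values. Returns list of component sizes."""
    h, w = len(img), len(img[0])
    mask = [[img[r][c] < threshold for c in range(w)] for r in range(h)]
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                # flood fill (4-connectivity) to measure one defect
                stack, area = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(area)
    return sizes

patch = [
    [200, 200, 200, 200],
    [ 40,  50, 200, 200],
    [200,  60, 200,  30],
    [200, 200, 200, 200],
]
print(detect_defects(patch))  # two defects: a 3-pixel crack and a 1-pixel pit
```

    Converting pixel areas to physical defect sizes would additionally require the camera's ground sampling distance.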

  8. Development and pilot testing of an online monitoring tool of depression symptoms and side effects for young people being treated for depression.

    PubMed

    Hetrick, Sarah E; Dellosa, Maria Kristina; Simmons, Magenta B; Phillips, Lisa

    2015-02-01

    To develop and examine the feasibility of an online monitoring tool of depressive symptoms, suicidality and side effects. The online tool was developed based on guideline recommendations, and employed already validated and widely used measures. Quantitative data about its use, and qualitative information on its functionality and usefulness were collected from surveys, a focus group and individual interviews. Fifteen young people completed the tool between 1 and 12 times, and reported it was easy to use. Clinicians suggested it was too long and could be completed in the waiting room to lessen impact on session time. Overall, clients and clinicians who used the tool found it useful. Results show that an online monitoring tool is potentially useful as a systematic means for monitoring symptoms, but further research is needed including how to embed the tool within clinical practice. © 2014 Wiley Publishing Asia Pty Ltd.

  9. Remote sensing techniques in monitoring areas affected by forest fire

    NASA Astrophysics Data System (ADS)

    Karagianni, Aikaterini Ch.; Lazaridou, Maria A.

    2017-09-01

Forest fire is a part of nature, playing a key role in shaping ecosystems. However, fire's environmental impacts can be significant: affecting wildlife habitat and timber; threatening human settlements, man-made technical constructions, and various networks (roads, power networks); and polluting the air with emissions harmful to human health. Furthermore, fire's effect on the landscape may be long-lasting. Monitoring the development of a fire is an important aspect of natural hazard management in general. Among the methods used for monitoring, satellite data and remote sensing techniques can prove particularly important. Satellite remote sensing offers a useful tool for forest fire detection, monitoring, management, and damage assessment. For fire scar detection and monitoring in particular, satellite data derived from Landsat 8 can be a useful research tool. This paper includes critical considerations of the above, illustrated with an example from Greece (Thasos Island). This area was hit by fires several times in the past, most recently in September 2016. Landsat 8 satellite data (pre- and post-fire imagery) are used, and digital image processing techniques are applied (enhancement techniques, calculation of various indices) for fire scar detection. Visual interpretation of the area affected by the fires is also performed, contributing to the overall study.
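
    One widely used index for Landsat 8 fire scar mapping is the Normalized Burn Ratio, computed from the NIR (band 5) and SWIR2 (band 7) reflectances, and its pre/post-fire difference (dNBR). The paper does not list its specific indices or thresholds, so the simplified severity breaks below (loosely following commonly cited USGS classes) are an assumption.

```python
# Normalized Burn Ratio and its pre/post-fire difference for a single
# pixel. Burned vegetation drops in NIR and rises in SWIR2, so dNBR
# grows with burn severity.

def nbr(nir, swir2):
    return (nir - swir2) / (nir + swir2)

def dnbr(pre_nir, pre_swir2, post_nir, post_swir2):
    return nbr(pre_nir, pre_swir2) - nbr(post_nir, post_swir2)

def severity(d):
    # simplified breakpoints, assumed for illustration
    if d < 0.1:
        return "unburned"
    if d < 0.27:
        return "low"
    if d < 0.66:
        return "moderate"
    return "high"

# healthy vegetation (high NIR) before, burned (low NIR, high SWIR2) after
d = dnbr(0.45, 0.15, 0.20, 0.30)
print(round(d, 3), severity(d))
```

    Applied per pixel to the pre- and post-fire scenes, this yields a burn-severity map of the fire scar.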

  10. Making intelligent systems team players. A guide to developing intelligent monitoring systems

    NASA Technical Reports Server (NTRS)

    Land, Sherry A.; Malin, Jane T.; Thronesberry, Carroll; Schreckenghost, Debra L.

    1995-01-01

This reference guide for developers of intelligent monitoring systems is based on lessons learned by developers of the DEcision Support SYstem (DESSY), an expert system that monitors Space Shuttle telemetry data in real time. DESSY makes inferences about commands, state transitions, and simple failures. It performs failure detection rather than in-depth failure diagnostics. A listing of rules from DESSY and cue cards from DESSY subsystems are included to give the development community a better understanding of the selected model system. The G-2 programming tool used in developing DESSY provides an object-oriented, rule-based environment, but many of the principles in use here can be applied to any type of intelligent monitoring system. The step-by-step instructions and examples given for each stage of development are in G-2, but can be used with other development tools. This guide first defines the authors' concept of real-time monitoring systems, then tells prospective developers how to determine system requirements, how to build the system through a combined design/development process, and how to solve problems involved in working with real-time data. It explains the relationships among operational prototyping, software evolution, and the user interface. It also explains methods of testing, verification, and validation. It includes suggestions for preparing reference documentation and training users.

  11. Implementation of a Portable Personal EKG Signal Monitoring System

    NASA Astrophysics Data System (ADS)

    Tan, Tan-Hsu; Chang, Ching-Su; Chen, Yung-Fu; Lee, Cheng

This research develops a portable personal EKG signal monitoring system to help patients monitor their EKG signals instantly and avoid the occurrence of tragedies. The system is built from two main units: a signal processing unit and a monitoring and evaluation unit. The first unit consists of an EKG signal sensor, a signal amplifier, a digitization circuit, and related control circuits. The second unit is a software tool developed on an embedded Linux platform (called CSA). Experimental results indicate that the proposed system has practical potential for users in health monitoring. It is demonstrated to be more convenient and portable than conventional PC-based EKG signal monitoring systems. Furthermore, all the application units embedded in the system are built with open source code, so no license fee is required for the operating system and applications. Thus, the building cost is much lower than that of traditional systems.
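
    A minimal sketch of the kind of processing such a monitoring unit might perform on the digitized EKG stream: detect R-peaks by amplitude thresholding and derive mean heart rate. Real firmware would band-pass filter the signal first; the synthetic trace, threshold, and 250 Hz sample rate are illustrative assumptions.

```python
# Detect R-peaks as local maxima above a threshold, then compute the
# mean heart rate from the R-R intervals.

FS = 250  # samples per second (assumed)

def r_peaks(signal, threshold=0.6):
    """Indices of local maxima above threshold (candidate R-peaks)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks

def heart_rate_bpm(peaks, fs=FS):
    """Mean heart rate from consecutive R-R intervals."""
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(rr) / len(rr))

# synthetic trace: flat baseline with R-peaks every 200 samples (0.8 s)
sig = [0.0] * 600
for k in (100, 300, 500):
    sig[k] = 1.0
peaks = r_peaks(sig)
print(peaks, round(heart_rate_bpm(peaks)))  # 0.8 s spacing -> 75 bpm
```

    An embedded implementation would run this over a sliding window and raise an alert when the rate leaves a configured range.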

  12. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    PubMed

    Kalkman, C J

    1995-01-01

Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main features that distinguish LabVIEW from other data acquisition programs are its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables obtained by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  13. Assessment of laboratory test utilization for HIV/AIDS care in urban ART clinics of Lilongwe, Malawi.

    PubMed

    Palchaudhuri, Sonali; Tweya, Hannock; Hosseinipour, Mina

    2014-06-01

The 2011 Malawi HIV guidelines promote CD4 monitoring for pre-ART assessment and suggest considering HIVRNA monitoring for ART response assessment, while some clinics used CD4 for both. We assessed clinical ordering practices against the guidelines and determined whether samples were successfully and promptly processed. We conducted a retrospective review of all patients seen from August 2010 through July 2011 in two urban HIV-care clinics that utilized 6-monthly CD4 monitoring regardless of ART status. We calculated the percentage of patients for whom clinicians ordered CD4 or HIVRNA analysis. For all samples sent, we determined rates of successful lab processing and mean time to returned results. Of 20581 patients seen, 8029 (39%) had at least one blood draw for CD4 count. Among pre-ART patients, 2668/2844 (93.8%) had CD4 counts performed for eligibility. Of all CD4 samples sent, 8082/9207 (89%) were successfully processed. Of those, mean time to processing was 1.6 days (s.d. 1.5), but mean time to results being available to the clinician was 9.3 days (s.d. 3.7). Regarding HIVRNA, 172 of 17737 patients on ART had a blood draw, and only 118/213 (55%) samples were successfully processed. Mean processing time was 39.5 days (s.d. 21.7); mean time to results being available to the clinician was 43.1 days (s.d. 25.1). During the one-year period evaluated, there were multiple lapses in processing HIVRNA samples lasting up to 2 months. Clinicians underutilize CD4 and HIVRNA as monitoring tools in HIV care. Laboratory processing failures and turnaround times are unacceptably high for viral load analysis. Alternative strategies need to be considered in order to meet laboratory monitoring needs.

  14. ESH assessment of advanced lithography materials and processes

    NASA Astrophysics Data System (ADS)

    Worth, Walter F.; Mallela, Ram

    2004-05-01

The ESH Technology group at International SEMATECH is conducting environment, safety, and health (ESH) assessments in collaboration with lithography technologists evaluating the performance of an increasing number of new materials and technologies being considered for advanced lithography, such as 157 nm photoresist and extreme ultraviolet (EUV). By performing data searches for 75 critical data types, emissions characterizations, and industrial hygiene (IH) monitoring during the use of the resist candidates, it has been shown that the best performing resist formulations, so far, appear to be free of potential ESH concerns. The ESH assessment of the EUV lithography tool being developed for SEMATECH has identified several features of the tool that are of ESH concern: high energy consumption, poor energy conversion efficiency, tool complexity, potential ergonomic and safety interlock issues, use of high-powered lasers, generation of ionizing radiation (soft X-rays), the need for adequate shielding, and characterization of the debris formed by the extreme temperature of the plasma. By bringing these ESH challenges to the attention of the technologists and tool designers, it is hoped that the processes and tools can be made more ESH friendly.

  15. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in the development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding will pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress made in recent years on QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.

  16. Monitoring programs to assess reintroduction efforts: A critical component in recovery

    USGS Publications Warehouse

    Muths, E.; Dreitz, V.

    2008-01-01

Reintroduction is a powerful tool in our conservation toolbox. However, the necessary follow-up, i.e. long-term monitoring, is not commonplace and, where instituted, may lack rigor. We contend that valid monitoring is possible, even with sparse data. We present a means to monitor based on demographic data and a projection model, using the Wyoming toad (Bufo baxteri) as an example. Using an iterative process, existing data are built upon gradually so that demographic estimates and subsequent inferences increase in reliability. Reintroduction and defensible monitoring may become increasingly relevant as the outlook for amphibians, especially in tropical regions, continues to deteriorate and emergency collection, captive breeding, and reintroduction become necessary. Rigorous use of appropriate modeling and an adaptive approach can validate the use of reintroduction and substantially increase its value to recovery programs. © 2008 Museu de Ciències Naturals.
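
    The demographic projection idea can be sketched with a stage-structured (Leslie-style) matrix model projected forward to ask whether a reintroduced population grows or declines. The stages and vital rates below are invented for illustration and are NOT the Wyoming toad's estimated demography.

```python
# Project a stage vector forward: n(t+1) = A @ n(t), then compare the
# projected population size against the starting size.

def project(stages, matrix, years):
    """Project the stage vector forward `years` steps."""
    n = list(stages)
    for _ in range(years):
        n = [sum(matrix[i][j] * n[j] for j in range(len(n)))
             for i in range(len(matrix))]
    return n

# stages: [juveniles, adults]; fecundity 1.2, juvenile survival 0.3,
# adult survival 0.7 (all hypothetical)
A = [
    [0.0, 1.2],   # juveniles produced per adult
    [0.3, 0.7],   # juvenile->adult transition, adult survival
]
n0 = [100.0, 50.0]
n5 = project(n0, A, 5)
growth = sum(n5) / sum(n0)
print([round(x, 1) for x in n5], round(growth, 2))
```

    The iterative monitoring loop in the paper amounts to re-estimating the matrix entries as new field data accumulate and re-running such projections with tighter uncertainty.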

  17. CNC machine tool's wear diagnostic and prognostic by using dynamic Bayesian networks

    NASA Astrophysics Data System (ADS)

    Tobon-Mejia, D. A.; Medjaher, K.; Zerhouni, N.

    2012-04-01

The failure of critical components in industrial systems may have negative consequences for availability, productivity, security, and the environment. To avoid such situations, the health condition of the physical system, and particularly of its critical components, can be constantly assessed by using monitoring data to perform on-line system diagnostics and prognostics. The present paper is a contribution on the assessment of the health condition of a computer numerical control (CNC) machine tool and the estimation of its remaining useful life (RUL). The proposed method relies on two main phases: an off-line phase and an on-line phase. During the first phase, the raw data provided by the sensors are processed to extract reliable features. The latter are used as inputs to learning algorithms in order to generate models that represent the wear behavior of the cutting tool. Then, in the second phase, which is an assessment phase, the constructed models are exploited to identify the tool's current health state and to predict its RUL and the associated confidence bounds. The proposed method is applied to a benchmark of condition monitoring data gathered during several cuts of a CNC tool. Simulation results are presented and discussed at the end of the paper.
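
    The off-line feature-extraction step can be sketched as condensing a raw force or vibration window into a few scalars a learning algorithm can relate to wear. The particular features chosen here (RMS, peak, crest factor) are common in condition monitoring but are an assumption about this paper's exact feature set, and the sample windows are invented.

```python
# Extract simple wear-sensitive features from one window of a raw
# cutting-force signal.

import math

def features(window):
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    peak = max(abs(x) for x in window)
    return {"rms": rms, "peak": peak, "crest": peak / rms}

# a fresh tool cuts smoothly; a worn tool shows higher, spikier forces
fresh = [1.0, -1.0, 1.0, -1.0]
worn = [2.0, -2.0, 5.0, -2.0]
print(features(fresh)["rms"], round(features(worn)["crest"], 2))
```

    In the paper's on-line phase, sequences of such feature vectors feed the learned dynamic Bayesian network to infer the current wear state and RUL.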

  18. Monitoring non-thermal plasma processes for nanoparticle synthesis

    NASA Astrophysics Data System (ADS)

    Mangolini, Lorenzo

    2017-09-01

Process characterization tools have played a crucial role in the investigation of dusty plasmas. The presence of dust in certain non-thermal plasma processes was first detected by laser light scattering measurements. Techniques like laser-induced particle explosive evaporation and ion mass spectrometry have provided the experimental evidence necessary for the development of the theory of particle nucleation in silane-containing non-thermal plasmas. This review first provides a summary of these early efforts, and then discusses recent investigations using in situ characterization techniques to understand the interaction between nanoparticles and plasmas. The advancement of such monitoring techniques is necessary to fully develop the potential of non-thermal plasmas as unique materials synthesis and processing platforms. At the same time, the strong coupling between material and plasma properties suggests that it is also necessary to advance techniques for the measurement of plasma properties in the presence of dust. Recent progress in this area is also discussed.

  19. Lunar Impact Flash Locations

    NASA Technical Reports Server (NTRS)

    Moser, D. E.; Suggs, R. M.; Kupferschmidt, L.; Feldman, J.

    2015-01-01

A bright impact flash detected by the NASA Lunar Impact Monitoring Program in March 2013 brought into focus the importance of determining impact flash locations. A process for locating the impact flash, and presumably its associated crater, was developed using commercially available software tools. The process was successfully applied to the March 2013 impact flash and then put into production on an additional 300 impact flashes. The goal of this presentation is to describe the geolocation technique developed.

  20. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations over 3 different hydrological periods in the Mires basin in Crete, Greece are used in the proposed framework, with regression kriging, to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool determines a cost-effective observation well network that contributes significant information to water managers and authorities. Observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be eliminated, using estimation uncertainty and statistical error metrics, without affecting the assessment of groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.
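
    The well-elimination idea can be sketched by scoring each well on how accurately the rest of the network predicts it: a well that its neighbors predict almost exactly adds little information and is the first candidate to drop. For brevity the interpolator below is inverse-distance weighting rather than the regression kriging used in the paper, and the well data are invented.

```python
# Score each observation well by its leave-one-out prediction error
# under a simple inverse-distance-weighted (IDW) interpolator.

def idw(x, y, known, power=2):
    """IDW estimate at (x, y) from (xi, yi, vi) triples."""
    num = den = 0.0
    for xi, yi, vi in known:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0:
            return vi
        w = 1.0 / d2 ** (power / 2)
        num += w * vi
        den += w
    return num / den

def redundancy(wells):
    """Per-well leave-one-out error: a small error means the well is redundant."""
    errs = []
    for i, (x, y, v) in enumerate(wells):
        rest = wells[:i] + wells[i + 1:]
        errs.append(abs(idw(x, y, rest) - v))
    return errs

# (x, y, nitrate mg/L) -- toy data; well 3 nearly duplicates well 0
wells = [(0, 0, 10.0), (1, 0, 12.0), (0, 1, 11.0), (0.1, 0.0, 10.1), (1, 1, 14.0)]
errs = redundancy(wells)
drop = errs.index(min(errs))
print(drop)  # the near-duplicate well is the first candidate to eliminate
```

    Repeating the scoring after each removal gives a greedy reduction of the network; the paper's heuristic optimization explores this trade-off more globally and with kriging variances.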

  1. Active Job Monitoring in Pilots

    NASA Astrophysics Data System (ADS)

    Kuehn, Eileen; Fischer, Max; Giffels, Manuel; Jung, Christopher; Petzold, Andreas

    2015-12-01

Recent developments in high energy physics (HEP), including multi-core jobs and multi-core pilots, require data centres to gain a deep understanding of the system in order to monitor, design, and upgrade computing clusters. Networking is a critical component. In particular, the increased usage of data federations, for example in diskless computing centres or as a fallback solution, relies on WAN connectivity and availability. The specific demands of different experiments and communities, as well as the need to identify misbehaving batch jobs, require active monitoring. Existing monitoring tools are not capable of measuring fine-grained information at the batch job level. This complicates network-aware scheduling and optimisations. In addition, pilots add another layer of abstraction. They behave like batch systems themselves by managing and executing payloads of jobs internally. The number of real jobs being executed is unknown, as the original batch system has no access to internal information about the scheduling process inside the pilots. Therefore, the comparability of jobs and pilots for predicting run-time behaviour or network performance cannot be ensured. Hence, identifying the actual payload is important. At the GridKa Tier 1 centre a specific tool is in use that allows the monitoring of network traffic information at the batch job level. This contribution presents the current monitoring approach and discusses the importance of identifying pilots and their substructures inside the batch system, together with recent efforts to do so. It also shows how to determine monitoring data for specific jobs from identified pilots. Finally, the approach is evaluated.

  2. Microbial trophic interactions and mcrA gene expression in monitoring of anaerobic digesters.

    PubMed

    Alvarado, Alejandra; Montañez-Hernández, Lilia E; Palacio-Molina, Sandra L; Oropeza-Navarro, Ricardo; Luévanos-Escareño, Miriam P; Balagurusamy, Nagamani

    2014-01-01

Anaerobic digestion (AD) is a biological process where different trophic groups of microorganisms break down biodegradable organic materials in the absence of oxygen. A wide range of AD technologies is being used to convert livestock manure, municipal and industrial wastewaters, and solid organic wastes into biogas. AD gains importance not only because of its relevance in waste treatment but also because of the recovery of carbon in the form of methane, which is a renewable energy source used to generate electricity and heat. Despite the advances in the engineering and design of new bioreactors for AD, the microbiology component always poses challenges. The microbiology of AD processes is complicated, as the efficiency of the process depends on the interactions of the various trophic groups involved. Due to the complex interdependence of microbial activities for the functionality of anaerobic bioreactors, the genetic expression of mcrA, which encodes a key enzyme in methane formation, is proposed as a parameter to monitor process performance in real time. This review evaluates the current knowledge on microbial groups, their interactions, and their relationship to the performance of anaerobic biodigesters, with a focus on using mcrA gene expression as a tool to monitor the process.

  3. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to satellite architecture constraints and frequent cloud cover, daily availability of high spatial resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework for a Geographic Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created: one to perform all the pre-processing steps on various satellite data, and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or they can be combined to perform both processes in one go. The tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a clean interface that provides many functionalities, enabling further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.
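
    The "densify the series" goal can be illustrated with a toy sketch: given sparse cloud-free observations of a vegetation index, synthesize a value for every day. Plain linear interpolation stands in for the sensor-fusion step the tool actually performs; the dates and NDVI values are invented.

```python
# Fill every day between sparse (day, value) observations by linear
# interpolation between the bracketing cloud-free acquisitions.

def densify(obs, last_day):
    """obs: sorted (day, value) pairs covering day 0. Returns a daily series."""
    series = []
    for day in range(last_day + 1):
        # find the bracketing observations
        prev = max((o for o in obs if o[0] <= day), key=lambda o: o[0])
        nxt = min((o for o in obs if o[0] >= day), key=lambda o: o[0])
        if prev[0] == nxt[0]:
            series.append(prev[1])
        else:
            t = (day - prev[0]) / (nxt[0] - prev[0])
            series.append(prev[1] + t * (nxt[1] - prev[1]))
    return series

ndvi = densify([(0, 0.2), (8, 0.6), (16, 0.4)], 16)
print(round(ndvi[4], 2), round(ndvi[12], 2))  # midpoints: 0.4 and 0.5
```

    A fusion approach such as STARFM additionally borrows the fine spatial pattern from the nearest high-resolution scene rather than interpolating each pixel independently.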

  4. SenseMyHeart: A cloud service and API for wearable heart monitors.

    PubMed

    Pinto Silva, P M; Silva Cunha, J P

    2015-01-01

In the era of ubiquitous computing, the growing adoption of wearable systems and body sensor networks is paving the way for new research and software for cardiovascular intensity, energy expenditure, and stress and fatigue detection through cardiovascular monitoring. Several systems have received clinical certification and provide huge amounts of reliable heart-related data on a continuous basis. PhysioNet provides equally reliable open-source software tools for ECG processing and analysis that can be combined with these devices. However, this software remains difficult to use in a mobile environment and for researchers unfamiliar with Linux-based systems. In the present paper we present an approach that aims to tackle these limitations by developing a cloud service that provides an API for a PhysioNet-based pipeline for ECG processing and heart rate variability measurement. We describe the proposed solution along with its advantages and tradeoffs. We also present some client tools (Windows and Android) and several projects where the developed cloud service has been used successfully as a standard for heart rate and heart rate variability studies in different scenarios.
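
    Two standard time-domain HRV statistics of the kind such a pipeline reports are SDNN and RMSSD, computed from R-R intervals in milliseconds. The metric definitions are standard; the API surface and sample intervals below are not SenseMyHeart's and are purely illustrative.

```python
# Time-domain heart rate variability metrics from R-R intervals (ms).

import math

def sdnn(rr):
    """Standard deviation of all R-R intervals (ms)."""
    mean = sum(rr) / len(rr)
    return math.sqrt(sum((x - mean) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    """Root mean square of successive R-R differences (ms)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [800, 810, 790, 805, 795]  # ms, illustrative
print(round(sdnn(rr), 2), round(rmssd(rr), 2))
```

    In a cloud service, the heavy lifting (QRS detection on the raw ECG to obtain the R-R series) runs server-side, and lightweight clients only upload signals and fetch metrics like these.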

  5. Soft sensor for monitoring biomass subpopulations in mammalian cell culture processes.

    PubMed

    Kroll, Paul; Stelzer, Ines V; Herwig, Christoph

    2017-11-01

Biomass subpopulations in mammalian cell culture processes cause impurities and influence productivity, which requires this critical process parameter to be monitored in real time. For this reason, a novel soft sensor concept for estimating viable, dead, and lysed cell concentration was developed, based on the robust and cheap in situ measurements of permittivity and turbidity in combination with a simple model. It could be shown that the turbidity measurements contain information about all investigated biomass subpopulations. The novelty of the developed soft sensor is the real-time estimation of lysed cell concentration, which is directly correlated to process-related impurities such as DNA and host cell protein in the supernatant. Based on data generated by two fed-batch processes, the developed soft sensor is described and discussed. The presented soft sensor concept provides a tool for estimating viable, dead, and lysed cell concentration in real time with adequate accuracy, and enables further applications with respect to process optimization and control.
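
    A highly simplified sketch of the soft-sensor idea: permittivity responds only to intact viable cells, turbidity is taken to respond to viable plus dead cells, and lysed cells appear as the gap in a running cell balance. The linear model and both coefficients are invented for illustration and are not the paper's calibrated model.

```python
# Estimate biomass subpopulations from two in situ signals plus a
# cumulative cell balance. All coefficients are assumed values.

K_PERM = 0.5    # permittivity units per 1e6 viable cells/mL (assumed)
K_TURB = 0.25   # turbidity units per 1e6 (viable + dead) cells/mL (assumed)

def estimate(permittivity, turbidity, total_produced):
    """Return (viable, dead, lysed) concentrations in 1e6 cells/mL."""
    viable = permittivity / K_PERM
    dead = turbidity / K_TURB - viable
    lysed = total_produced - viable - dead  # the balance closes on lysis
    return viable, dead, lysed

# e.g. 10e6 cells/mL produced so far, permittivity 4.0, turbidity 2.25
print(estimate(4.0, 2.25, 10.0))
```

    The appeal of the approach is that the lysed-cell estimate, which no single probe measures directly, falls out of combining two cheap in situ signals with a mass balance.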

  6. Adaptation of the Tool to Estimate Patient Costs Questionnaire into Indonesian Context for Tuberculosis-affected Households.

    PubMed

    Fuady, Ahmad; Houweling, Tanja A; Mansyur, Muchtaruddin; Richardus, Jan H

    2018-01-01

Indonesia has the second-highest tuberculosis (TB) incidence worldwide. Hence, it urgently requires improvements and innovations beyond the strategies currently being implemented throughout the country. One fundamental step in monitoring progress is preparing a validated tool to measure total patient costs and catastrophic total costs. The World Health Organization (WHO) recommends using a version of the generic questionnaire adapted to the local cultural context in order to interpret findings correctly. This study aimed to adapt the Tool to Estimate Patient Costs questionnaire, which measures total costs and catastrophic total costs for tuberculosis-affected households, to the Indonesian context. The tool was adapted using best-practice guidelines. On the basis of a pre-test performed in a previous study (referred to as the Phase 1 Study), we refined the adaptation process by comparing it with the generic tool introduced by the WHO. We also held an expert committee review and performed pre-testing by interviewing 30 TB patients. After pre-testing, the tool was provided with complete explanation sheets for finalization. Seventy-two major changes were made during the adaptation process, including changing the answer choices to match the Indonesian context, refining the flow of questions, deleting questions, changing some words, and restoring original questions that had been changed in the Phase 1 Study. Participants indicated that most questions were clear and easy to understand. To address recall difficulties, we made some adaptations to obtain data that might be missing, such as tracing data in medical records, developing a proxy for costs, and guiding interviewers to ask for a specific value when participants were uncertain about the estimated market value of property they had sold. 
The adapted Tool to Estimate Patient Costs in Bahasa Indonesia is comprehensive and ready for use in future studies on TB-related catastrophic costs, and is suitable for monitoring progress toward the targets of the End TB Strategy.

  7. Process Analytical Technology for High Shear Wet Granulation: Wet Mass Consistency Reported by In-Line Drag Flow Force Sensor Is Consistent With Powder Rheology Measured by At-Line FT4 Powder Rheometer.

    PubMed

    Narang, Ajit S; Sheverev, Valery; Freeman, Tim; Both, Douglas; Stepaniuk, Vadim; Delancy, Michael; Millington-Smith, Doug; Macias, Kevin; Subramanian, Ganeshkumar

    2016-01-01

A drag flow force (DFF) sensor, which measures the force exerted by the wet mass in a granulator on a thin cylindrical probe, was shown to be a promising process analytical technology for real-time, in-line, high-resolution monitoring of wet mass consistency during high shear wet granulation. Our previous studies indicated that this process analytical technology tool could be correlated to a granulation end point established independently through drug product critical quality attributes. In this study, the measurements of flow force by a DFF sensor, taken during wet granulation of 3 placebo formulations with different binder content, are compared with concurrent at-line FT4 Powder Rheometer characterization of wet granules collected at different time points of the process. The wet mass consistency measured by the DFF sensor correlated well with the granulation's resistance to flow and interparticulate interactions as measured by the FT4 Powder Rheometer. This indicates that the force pulse magnitude measured by the DFF sensor reflects fundamental material properties (e.g., shear viscosity and granule size/density) as they change during the granulation process. These studies indicate that the DFF sensor can be a valuable tool for wet granulation formulation and process development and scale-up, as well as for routine monitoring and control during manufacturing. Copyright © 2016. Published by Elsevier Inc.
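
    The comparison underlying such a study reduces to correlating the in-line DFF readings with the at-line rheometer values taken at the same granulation time points. The data below are invented; the paper reports a good correlation but not these numbers.

```python
# Pearson correlation between paired in-line and at-line measurements.

import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

dff_force = [0.5, 1.1, 2.0, 3.2, 4.1]    # DFF force pulse magnitude (a.u.)
flow_energy = [110, 150, 240, 330, 420]  # rheometer flow energy (mJ)
r = pearson(dff_force, flow_energy)
print(round(r, 3))
```

    A correlation near 1 across formulations supports using the cheap in-line signal as a surrogate for the slower at-line rheometer during routine manufacturing.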

  8. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations, with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, a small fraction of which produce earthquake alerts. We have developed interactive web-browser system-monitoring tools that display near-real-time state-of-health and performance information, including station availability, trigger statistics, and communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available, as is detailed information about the ElarmS trigger association process for both alerted and rejected events. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.
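    The latency summaries such a dashboard reports reduce to simple statistics over (origin time, alert time) pairs. A minimal sketch with invented event times, not ShakeAlert data:

```python
# Illustrative sketch (invented event data): computing alert-latency
# statistics of the kind summarized per time period by monitoring pages.
import statistics

# (origin_time_s, alert_time_s) pairs for hypothetical events
events = [(0.0, 6.2), (100.0, 104.8), (200.0, 209.5), (300.0, 305.1)]
latencies = [alert - origin for origin, alert in events]

summary = {
    "n": len(latencies),
    "median_s": statistics.median(latencies),
    "max_s": max(latencies),
}
```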

  9. Implementation of a process analytical technology system in a freeze-drying process using Raman spectroscopy for in-line process monitoring.

    PubMed

    De Beer, T R M; Allesø, M; Goethals, F; Coppens, A; Heyden, Y Vander; De Diego, H Lopez; Rantanen, J; Verpoort, F; Vervaet, C; Remon, J P; Baeyens, W R G

    2007-11-01

    The aim of the present study was to propose a strategy for the implementation of a Process Analytical Technology system in freeze-drying processes. Mannitol solutions, some containing NaCl, were used as model systems to freeze-dry. Noninvasive, in-line Raman measurements were performed continuously during lyophilization to monitor in real time the mannitol solid state, the end points of the different process steps (freezing, primary drying, secondary drying), and physical phenomena occurring during the process. At-line near-infrared (NIR) and X-ray powder diffractometry (XRPD) measurements were done to confirm the Raman conclusions and to obtain additional information. The spectra collected during the processes were analyzed using principal component analysis and multivariate curve resolution. A two-level full factorial design was used to study the influence of process (freezing rate) and formulation variables (concentration of mannitol, concentration of NaCl, volume of freeze-dried sample) upon freeze-drying. Raman spectroscopy was able to monitor (i) the mannitol solid state (amorphous, alpha, beta, delta, and hemihydrate), (ii) several process step end points (end of mannitol crystallization during freezing, primary drying), and (iii) physical phenomena occurring during freeze-drying (onset of ice nucleation, onset of mannitol crystallization during the freezing step, onset of ice sublimation). NIR proved to be a more sensitive tool than Raman spectroscopy for monitoring sublimation, while XRPD helped to identify the mannitol hemihydrate in the samples. The experimental design results showed that several process and formulation variables significantly influence different aspects of lyophilization and that the two are interrelated. Raman spectroscopy (in-line), together with NIR spectroscopy and XRPD (at-line), not only allowed real-time monitoring of mannitol freeze-drying processes but also helped us (in combination with experimental design) to understand the process.
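    The principal component analysis step applied to the collected spectra can be sketched as follows on synthetic data; the "spectra", the drift model and the component count are invented for illustration and are not the study's Raman measurements:

```python
# Minimal PCA sketch on synthetic "spectra" (rows = time points,
# columns = spectral channels): the leading principal component should
# capture a dominant drift, standing in for a progressing solid-state change.
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centered rows of X onto their leading principal components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, 3, 50))  # invented spectral shape
# Spectra drift along one dominant direction plus a little noise
X = np.array([i * base + 0.01 * rng.standard_normal(50) for i in range(10)])
scores = pca_scores(X, n_components=1)  # one score per time point
```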

  10. Complementary Tools to Empower and Sustain Behavior Change: Motivational Interviewing and Mindfulness.

    PubMed

    Sohl, Stephanie Jean; Birdee, Gurjeet; Elam, Roy

    2016-11-01

    Improving health behaviors is fundamental to preventing and controlling chronic disease. Healthcare providers who have a patient-centered communication style and appropriate behavioral change tools can empower patients to engage in and sustain healthy behaviors. This review highlights motivational interviewing and mindfulness along with other evidence-based strategies for enhancing patient-centered communication and the behavior change process. Motivational interviewing and mindfulness are especially useful for empowering patients to set self-determined, or autonomous, goals for behavior change. This is important because autonomously motivated behavioral change is more sustainable. Additional strategies such as self-monitoring are discussed as useful for supporting the implementation and maintenance of goals. Thus, there is a need for healthcare providers to develop such tools to empower sustained behavior change. The additional support of a new role, a health coach who specializes in facilitating the process of health-related behavior change, may be required to substantially impact public health.

  11. Complementary Tools to Empower and Sustain Behavior Change: Motivational Interviewing and Mindfulness

    PubMed Central

    Sohl, Stephanie Jean; Birdee, Gurjeet; Elam, Roy

    2015-01-01

    Improving health behaviors is fundamental to preventing and controlling chronic disease. Healthcare providers who have a patient-centered communication style and appropriate behavioral change tools can empower patients to engage in and sustain healthy behaviors. This review highlights motivational interviewing and mindfulness along with other evidence-based strategies for enhancing patient-centered communication and the behavior change process. Motivational interviewing and mindfulness are especially useful for empowering patients to set self-determined, or autonomous, goals for behavior change. This is important because autonomously motivated behavioral change is more sustainable. Additional strategies such as self-monitoring are discussed as useful for supporting the implementation and maintenance of goals. Thus, there is a need for healthcare providers to develop such tools to empower sustained behavior change. The additional support of a new role, a health coach who specializes in facilitating the process of health-related behavior change, may be required to substantially impact public health. PMID:28239308

  12. Computer assisted blast design and assessment tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

    In general, the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  13. Four challenges in selecting and implementing methods to monitor and evaluate participatory processes: Example from the Rwenzori region, Uganda.

    PubMed

    Hassenforder, Emeline; Ducrot, Raphaëlle; Ferrand, Nils; Barreteau, Olivier; Anne Daniell, Katherine; Pittock, Jamie

    2016-09-15

    Participatory approaches are now increasingly recognized and used as an essential element of policies and programs, especially with regard to natural resource management (NRM). Most practitioners, decision-makers and researchers who have adopted participatory approaches also acknowledge the need to monitor and evaluate such approaches in order to audit their effectiveness, support decision-making or improve learning. Many manuals and frameworks exist on how to carry out monitoring and evaluation (M&E) of participatory processes. However, few provide guidelines on the selection and implementation of M&E methods, an aspect that is also often obscure in published studies, at the expense of their transparency, reliability and validity. In this paper, we argue that the selection and implementation of M&E methods are particularly strategic when monitoring and evaluating a participatory process. We demonstrate that evaluators of participatory processes have to tackle a quadruple challenge when selecting and implementing methods: using mixed methods, both qualitative and quantitative; assessing the participatory process, its outcomes, and its context; taking into account both the theory and participants' views; and being both rigorous and adaptive. The M&E of a participatory planning process in the Rwenzori Region, Uganda, is used as an example to show how these challenges unfold on the ground and how they can be tackled. Based on this example, we conclude by providing tools and strategies that evaluators can use to ensure that they make useful, feasible, coherent, transparent and adaptive methodological choices when monitoring and evaluating participatory processes for NRM. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Droughts and water scarcity: facing challenges

    NASA Astrophysics Data System (ADS)

    Pereira, Luis S.

    2014-05-01

    Water scarcity characterizes large portions of the world, particularly the Mediterranean area. It results from natural causes (climate aridity, which is permanent, and droughts, which are temporary) and from human causes (long-term desertification and short-term water shortages). Droughts aggravate water scarcity. Knowledge of all these processes is well developed, but management tools are still insufficient, as are the tools required to support appropriate planning and management. In particular, new approaches and tools are required for assessing related impacts on agriculture and other economic and social activities. Droughts occur in all climates, but their characteristics differ greatly among regions in terms of frequency, duration and intensity. Research has already produced a large number of tools that allow appropriate monitoring of drought occurrence and intensity, including the dynamics of drought occurrence and its evolution over time. Advances in drought prediction are already available, but we are still far from knowing when a drought will start, how it will evolve and when it will dissipate. New developments using teleconnections and GCMs are being considered. Climate change is a fact. Are drought occurrence and severity changing with global change? Opinions are divided on this subject, since the driving factors and processes are varied and the tools for the corresponding analyses are equally diverse. In particular, weather data series are often too short to provide reliable answers. In a domain where research is producing improved knowledge and innovative approaches, research nevertheless faces a variety of challenges. The main ones, addressed in this keynote, concern concepts and definitions, the use of monitoring indices, prediction of drought initiation and evolution, improved assessment of drought impacts, and the possible influence of climate change on drought occurrence and severity.

  15. XTCE (XML Telemetric and Command Exchange) Standard Making It Work at NASA. Can It Work For You?

    NASA Technical Reports Server (NTRS)

    Munoz-Fernandez, Michela; Smith, Danford S.; Rice, James K.; Jones, Ronald A.

    2017-01-01

    The XML Telemetric and Command Exchange (XTCE) standard is intended as a way to describe telemetry and command databases so they can be exchanged across centers and space agencies. XTCE usage has the potential to lead to consolidation of Mission Operations Center (MOC) monitor and control displays for mission cross-support, reducing equipment and configuration costs, as well as decreasing the turnaround time for telemetry and command modifications during all mission phases. The adoption of XTCE will reduce software maintenance costs by reducing the variation between our existing mission dictionaries. The main objective of this poster is to show how powerful XTCE is in terms of interoperability across centers and missions. We provide results for a use case in which two centers use their local tools to process and display the same mission telemetry in their MOCs independently of one another. In our use case we first quantified the ability of XTCE to capture the telemetry definitions of the mission using our suite of support tools (conversion, validation, and compliance measurement). The next step was to show processing and monitoring of the same telemetry in two mission centers. Once the database was converted to XTCE using our tool, the XTCE file became our primary database and was shared among the various tool chains through their XTCE importers, and ultimately configured to ingest the telemetry stream and display or capture the telemetered information in similar ways. Summary results include the ability to take a real mission database and real mission telemetry and display them on various tools from two centers, as well as the use of freely available commercial off-the-shelf (COTS) software.
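    A toy sketch of reading parameter definitions from a simplified XML telemetry dictionary, loosely modeled on XTCE's SpaceSystem/TelemetryMetaData layout. Real XTCE files carry XML namespaces and rich type definitions that are omitted here, so this is an illustration of the exchange idea, not a compliant parser:

```python
# Parse a simplified, XTCE-like telemetry dictionary and collect the
# parameter names and type references it declares. The document below is
# invented; real XTCE uses namespaces and far richer structure.
import xml.etree.ElementTree as ET

doc = """
<SpaceSystem name="DemoSat">
  <TelemetryMetaData>
    <ParameterSet>
      <Parameter name="BatteryVoltage" parameterTypeRef="FloatType"/>
      <Parameter name="ModeFlag" parameterTypeRef="EnumType"/>
    </ParameterSet>
  </TelemetryMetaData>
</SpaceSystem>
"""

root = ET.fromstring(doc)
params = {p.get("name"): p.get("parameterTypeRef")
          for p in root.iter("Parameter")}
```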

  16. Enhancement of Self-Monitoring in a Web-Based Weight Loss Program by Extra Individualized Feedback and Reminders: Randomized Trial

    PubMed Central

    Hutchesson, Melinda Jane; Tan, Chor Yin; Morgan, Philip; Callister, Robin

    2016-01-01

    Background Self-monitoring is an essential behavioral strategy for effective weight loss programs. Traditionally, self-monitoring has been achieved using paper-based records. However, technology is now more frequently used to deliver treatment programs to overweight and obese adults. Information technologies, such as the Internet and mobile phones, allow innovative intervention features to be incorporated into treatment that may facilitate greater adherence to self-monitoring processes, provide motivation for behavior change, and ultimately lead to greater weight loss success. Objective The objective of our study was to determine whether the consistency of self-monitoring differed between participants randomly assigned to a basic or an enhanced 12-week commercial Web-based weight loss program. Methods We randomly assigned a sample of 301 adults (mean age 42.3 years; body mass index 31.3 kg/m2; female 176/301, 58.5%) to the basic or enhanced group. The basic program included tools for self-monitoring (online food and exercise diary, and a weekly weigh-in log) with some feedback and reminders to weigh in (by text or email). The enhanced program included the basic components, as well as extra individualized feedback on self-monitoring entries and reminders (by text, email, or telephone) to engage with self-monitoring tools. We evaluated the level of self-monitoring by examining the consistency of self-monitoring of food, exercise, and weight during the 12 weeks. Consistency was defined as the number of weeks during which participants completed a criterion number of entries (ie, ≥3 days of online food or exercise diary records per week and ≥1 weigh-in per week). Results The enhanced group’s consistency of use of self-monitoring tools was significantly greater than that of the basic group throughout the 12 weeks (median consistency for food 8 vs 3 weeks, respectively, P<.001; for exercise 2.5 vs 1 week, respectively, P=.003). 
Conclusions Enhanced features, including additional individualized feedback and reminders, are effective in enhancing self-monitoring behaviors in a Web-based weight loss program. ClinicalTrial Australian New Zealand Clinical Trials Registry (ANZCTR): ACTRN12610000197033; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=335159 (Archived by WebCite at http://www.webcitation.org/6gCQdj21G) PMID:27072817
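    The consistency criterion described above (a week counts toward food consistency if it has ≥3 diary days, and toward weigh-in consistency if it has ≥1 entry) can be sketched with invented diary data:

```python
# Sketch of the weekly-consistency metric; the 12 weeks of diary counts
# below are invented, not participant data from the trial.
def consistent_weeks(entries_per_week, threshold):
    """Count weeks meeting a minimum-entries criterion."""
    return sum(1 for n in entries_per_week if n >= threshold)

food_days = [5, 4, 3, 2, 3, 0, 4, 3, 1, 3, 3, 2]  # food-diary days per week
weigh_ins = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1]  # weigh-ins per week

food_consistency = consistent_weeks(food_days, threshold=3)   # weeks with >= 3 days
weigh_consistency = consistent_weeks(weigh_ins, threshold=1)  # weeks with >= 1 weigh-in
```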

  17. GSMS and space views: Advanced spacecraft monitoring tools

    NASA Technical Reports Server (NTRS)

    Carlton, Douglas; Vaules, David, Jr.; Mandl, Daniel

    1993-01-01

    The Graphical Spacecraft Monitoring System (GSMS) processes and translates real-time telemetry data from the Gamma Ray Observatory (GRO) spacecraft into high resolution 2-D and 3-D color displays showing the spacecraft's position relative to the Sun, Earth, Moon, and stars, its predicted orbit path, its attitude, instrument fields of view, and other items of interest to the GRO Flight Operations Team (FOT). The GSMS development project is described and the approach being undertaken for implementing Space Views, the next version of GSMS, is presented. Space Views is an object-oriented graphical spacecraft monitoring system that will become a standard component of Goddard Space Flight Center's Transportable Payload Operations Control Center (TPOCC).

  18. Optimizing process and equipment efficiency using integrated methods

    NASA Astrophysics Data System (ADS)

    D'Elia, Michael J.; Alfonso, Ted F.

    1996-09-01

    The semiconductor manufacturing industry continually rides the edge of technology as it pushes toward higher design limits. Mature fabs must cut operating costs while increasing productivity to remain profitable, and they cannot justify large capital expenditures to improve productivity. Thus, they must push current tool production capabilities to cut manufacturing costs and remain viable. Working to continuously improve mature production methods requires innovation. Furthermore, testing and successful implementation of these ideas in modern production environments require both supporting technical data and commitment from those working with the process daily. At AMD, natural work groups (NWGs) composed of operators, technicians, engineers, and supervisors collaborate to foster innovative thinking and secure commitment. Recently, an AMD NWG improved equipment cycle time on the Genus tungsten silicide (WSi) deposition system. The team used total productive manufacturing (TPM) to identify areas for process improvement. Improved in-line equipment monitoring was achieved by constructing a real-time overall equipment effectiveness (OEE) calculator that tracked equipment down, idle, qualification, and production times. In-line monitoring results indicated that qualification time associated with slow Inspex turn-around and machine downtime associated with manual cleans contributed greatly to reduced availability. Qualification time was reduced by 75% by implementing a new Inspex monitor pre-staging technique. Downtime associated with manual cleans was reduced by implementing an in-situ plasma etch-back to extend the time between manual cleans. A designed experiment was used to optimize the process. The interval between 18-hour manual cleans has been extended from every 250 cycles to every 1500 cycles. Moreover, defect density improved threefold. Overall, the team achieved a 35% increase in tool availability. 
This paper details the above strategies and accomplishments.
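    The availability component of such an OEE calculator can be sketched as follows; the hours are invented, and real OEE definitions vary in how idle and qualification time are treated:

```python
# Hedged sketch of the availability piece of an OEE calculation, tracking
# the time categories mentioned above (down, idle, qualification,
# production). All hours are invented for illustration.
def availability(production_h, qual_h, idle_h, down_h):
    """Fraction of total tracked time the tool was available to run."""
    total = production_h + qual_h + idle_h + down_h
    # Qualification and downtime both subtract from available time here;
    # published OEE definitions differ on how idle time is counted.
    return (production_h + idle_h) / total

before = availability(production_h=120, qual_h=30, idle_h=18, down_h=32)
after = availability(production_h=160, qual_h=8, idle_h=18, down_h=14)
```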

  19. Focused beam reflectance measurement as a tool for in situ monitoring of the lactose crystallization process.

    PubMed

    Pandalaneni, K; Amamcharla, J K

    2016-07-01

    Lactose accounts for about 75 and 85% of the solids in whey and deproteinized whey, respectively. Production of lactose is usually carried out by a process called crystallization. Several factors, including cooling rate, presence of impurities, and mixing speed, influence crystal size characteristics. To optimize the lactose crystallization process parameters and maximize lactose yield, it is important to monitor the crystallization process. However, efficient in situ tools that can be implemented at concentrations relevant to the dairy industry are lacking. The objective of the present work was to use a focused beam reflectance measurement (FBRM) system for in situ monitoring of lactose crystallization at supersaturated concentrations of 50, 55, and 60% (wt/wt) at 20 and 30°C. The FBRM data were compared with Brix readings collected using a refractometer during isothermal crystallization. Chord length distributions obtained from FBRM in the ranges of <50 µm (fine crystals) and 50 to 300 µm (coarse crystals) were recorded and evaluated in relation to the extent of crystallization and the rate constants deduced from the refractometer measurements. Extent of crystallization and rate constants increased with increasing supersaturation concentration and temperature. The measured fine crystal counts from FBRM increased at higher supersaturated concentration and temperature during isothermal crystallization. On the other hand, coarse counts were observed to increase with decreasing supersaturated concentration and temperature. The square-weighted chord length distribution obtained from FBRM showed that as concentration increased, a decrease in chord lengths occurred at 20°C; similar observations were made from microscopic images. The robustness of FBRM for understanding isothermal lactose crystallization at various concentrations and temperatures was successfully assessed in the study. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. 
All rights reserved.
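    The fine/coarse binning and square weighting of chord lengths described above can be sketched with invented values:

```python
# Illustrative binning of FBRM chord lengths (invented values, in µm) into
# the fine (< 50 µm) and coarse (50-300 µm) ranges used above, plus a
# square-weighted distribution that emphasizes larger chords.
def bin_counts(chords, fine_max=50, coarse_max=300):
    """Count chords in the fine and coarse ranges."""
    fine = sum(1 for c in chords if c < fine_max)
    coarse = sum(1 for c in chords if fine_max <= c <= coarse_max)
    return fine, coarse

def square_weighted(chords):
    """Weight each chord by its squared length, normalized to sum to 1."""
    total = sum(c * c for c in chords)
    return [c * c / total for c in chords]

chords = [12, 30, 45, 60, 120, 250]  # invented chord lengths, µm
fine, coarse = bin_counts(chords)
weights = square_weighted(chords)    # largest chord dominates the weights
```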

  20. Tool Wear Monitoring Using Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach that considers the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning operations. The variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the signal estimated from a time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. It is found that the early tool wear state (i.e. flank wear under 40 µm) can be monitored, and that the optimal tool exchange time and the tool wear state in actual turning can be judged from the change in this residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
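    The residual-error idea can be sketched as follows: fit an autoregressive model to a baseline vibration signal, then watch the one-step prediction residual grow when the signal's dynamics change. The signals below are synthetic stand-ins, not the paper's measured vibrations:

```python
# Fit an AR(2) model to a "sharp tool" signal; the residual stays near zero
# on that signal and grows when the dynamics change (a stand-in for
# wear-induced nonlinearity). Signals are synthetic.
import numpy as np

def fit_ar(x, order=2):
    """Least-squares AR(order) coefficients for signal x."""
    rows = [x[i - order:i][::-1] for i in range(order, len(x))]
    A, b = np.array(rows), x[order:]
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coef

def residual_rms(x, coef):
    """RMS one-step prediction error of the AR model on signal x."""
    order = len(coef)
    pred = np.array([coef @ x[i - order:i][::-1] for i in range(order, len(x))])
    return float(np.sqrt(np.mean((x[order:] - pred) ** 2)))

t = np.linspace(0, 20, 400)
sharp = np.sin(2 * np.pi * t)                                 # baseline vibration
worn = np.sin(2 * np.pi * t) + 0.3 * np.sin(5.1 * np.pi * t)  # changed dynamics

coef = fit_ar(sharp)                # model of the healthy cutting process
baseline = residual_rms(sharp, coef)
drift = residual_rms(worn, coef)    # grows once behavior departs from the model
```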

  1. Optical Method For Monitoring Tool Control For Green Burnishing With Using Of Algorithms With Adaptive Settings

    NASA Astrophysics Data System (ADS)

    Lukyanov, A. A.; Grigoriev, S. N.; Bobrovskij, I. N.; Melnikov, P. A.; Bobrovskij, N. M.

    2017-05-01

    As new technologies become more complex and their reliability requirements grow, the labor involved in control operations within industrial quality control systems increases significantly. Quality management control is important because it promotes the correct use of production conditions and ensures that the relevant requirements are met. Digital image processing makes it possible to reach a new technological level of production. Automated interpretation of information, the most complicated step, is the basis for decision-making in the management of production processes. Surface analysis is even more complicated for tools used in processing with metalworking fluids (MWF). The authors propose a new algorithm for optical inspection of wear on the cylindrical burnishing tool, which is used in surface plastic deformation without MWF. The main advantage of the proposed algorithm is the possibility of automatic recognition of burnisher tool images, with subsequent extraction of the tool boundary, location of the working surface, and automatic identification of defects and the wear area. The software implementing the algorithm was developed by the authors in the Matlab programming environment, but it can be implemented in other programming languages.
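    A minimal stand-in for the wear-area measurement step (not the authors' Matlab algorithm): threshold a synthetic grayscale image and report the bright region's pixel area and bounding box:

```python
# Toy wear-region detection on a synthetic grayscale "tool surface" image:
# threshold, then measure the area and bounding box of the bright pixels.
# The image and threshold are invented for illustration.
import numpy as np

def wear_region(img, threshold=0.5):
    """Pixel area and (row_min, row_max, col_min, col_max) of bright pixels."""
    mask = img > threshold
    area = int(mask.sum())
    rows, cols = np.nonzero(mask)
    bbox = (rows.min(), rows.max(), cols.min(), cols.max()) if area else None
    return area, bbox

img = np.zeros((10, 10))
img[3:6, 4:8] = 0.9   # synthetic worn patch
area, bbox = wear_region(img)
```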

  2. An efficient framework for Java data processing systems in HPC environments

    NASA Astrophysics Data System (ADS)

    Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül

    2011-11-01

    Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).

  3. Classification and recognition of texture collagen obtaining by multiphoton microscope with neural network analysis

    NASA Astrophysics Data System (ADS)

    Wu, Shulian; Peng, Yuanyuan; Hu, Liangjun; Zhang, Xiaoman; Li, Hui

    2016-01-01

    Second harmonic generation microscopy (SHGM) was used to monitor the process of chronological skin aging in vivo. Collagen structures in mouse models of different ages were imaged using SHGM. Texture features (contrast, correlation and entropy) were then extracted and analysed using the grey-level co-occurrence matrix. Finally, the Matlab neural network toolbox was applied to train on the collagen textures at different stages of the aging process, and simulation on the mouse collagen textures was carried out. The results indicated that the classification accuracy reached 85%, demonstrating that the proposed approach effectively detects the target object in collagen texture images during chronological aging and that the neural-network-based classification and feature extraction method is feasible for skin analysis.
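    The grey-level co-occurrence features named above (contrast and entropy; correlation is omitted for brevity) can be computed directly from co-occurrence counts. A minimal sketch on an invented 4-level image:

```python
# Build a horizontal-offset grey-level co-occurrence matrix (GLCM) for a
# small invented image, then compute the contrast and entropy features.
import math
import numpy as np

def glcm_features(img, levels=4):
    """Contrast and entropy of the normalized horizontal GLCM."""
    glcm = np.zeros((levels, levels))
    for row in img:
        for a, b in zip(row[:-1], row[1:]):   # horizontal neighbor pairs
            glcm[a, b] += 1
    p = glcm / glcm.sum()
    contrast = sum(p[i, j] * (i - j) ** 2
                   for i in range(levels) for j in range(levels))
    entropy = -sum(v * math.log(v) for v in p.ravel() if v > 0)
    return contrast, entropy

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
contrast, entropy = glcm_features(img)
```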

  4. The Health Impact Assessment (HIA) Resource and Tool ...

    EPA Pesticide Factsheets

    Health Impact Assessment (HIA) is a relatively new and rapidly emerging field in the U.S. An inventory of available HIA resources and tools was conducted, with a primary focus on resources developed in the U.S. The resources and tools available to HIA practitioners in the conduct of their work were identified through multiple methods and compiled into a comprehensive list. The compilation includes tools and resources related to the HIA process itself and those that can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation. These resources include literature and evidence bases, data and statistics, guidelines, benchmarks, decision and economic analysis tools, scientific models, methods, frameworks, indices, mapping, and various data collection tools. Understanding the data, tools, models, methods, and other resources available to perform HIAs will help to advance the HIA community of practice in the U.S., improve the quality and rigor of assessments upon which stakeholder and policy decisions are based, and potentially improve the overall effectiveness of HIA to promote healthy and sustainable communities. The Health Impact Assessment (HIA) Resource and Tool Compilation is a comprehensive list of resources and tools that can be utilized by HIA practitioners with all levels of HIA experience to guide them throughout the HIA process. The HIA Resource

  5. New Approach for Environmental Monitoring and Plant Observation Using a Light-Field Camera

    NASA Astrophysics Data System (ADS)

    Schima, Robert; Mollenhauer, Hannes; Grenzdörffer, Görres; Merbach, Ines; Lausch, Angela; Dietrich, Peter; Bumberger, Jan

    2015-04-01

    The aim of gaining a better understanding of ecosystems and of the processes in nature accentuates the need to observe exactly these processes with higher temporal and spatial resolution. In the field of environmental monitoring, an inexpensive and field-applicable imaging technique for deriving three-dimensional information about plants and vegetation would represent a decisive contribution to the understanding of the interactions and dynamics of ecosystems. This is particularly true for the monitoring of plant growth, given the frequently mentioned lack of morphological information about plants, e.g. plant height, vegetation canopy, leaf position or leaf arrangement. Therefore, an innovative and inexpensive light-field (plenoptic) camera, the Lytro LF, and a stereo vision system based on two industrial cameras were tested and evaluated as possible measurement tools for the given monitoring purpose. The use of a light-field camera offers the promising opportunity of providing three-dimensional information from a single shot, without any additional requirements during field measurements, which represents a substantial methodological improvement in environmental research and monitoring. Since the Lytro LF was designed as a consumer camera, it supports neither depth or distance estimation nor external triggering by default. Therefore, various technical modifications and a calibration routine had to be developed during the preliminary study. As a result, the light-field camera proved suitable as a depth and distance measurement tool with a measuring range of approximately one meter. This confirms the assumption that a light-field camera holds the potential of being a promising measurement tool for environmental monitoring purposes, especially with regard to the low methodological effort required in the field.
    Within the framework of the Global Change Experimental Facility Project, funded by the Helmholtz Centre for Environmental Research, and its large-scale field experiments investigating the influence of climate change on different forms of land use, both techniques were installed and evaluated in a long-term experiment on a pilot-scale maize field in late 2014. On this basis, it was possible to show the growth of the plants over time, with good agreement with measurements carried out by hand on a weekly basis. In addition, the experiment showed that the light-field vision approach is applicable to the monitoring of crop growth under field conditions, although it is limited to close-range applications. Since this work was intended as a proof of concept, further research is recommended, especially with respect to the automation and evaluation of data processing. Altogether, this study is addressed to researchers as elementary groundwork for improving the use of the introduced light-field imaging technique for monitoring plant growth dynamics and three-dimensional modeling of plants under field conditions.

  6. Coastal Thematic Exploitation Platform (C-TEP): An innovative and collaborative platform to facilitate Big Data coastal research

    NASA Astrophysics Data System (ADS)

    Tuohy, Eimear; Clerc, Sebastien; Politi, Eirini; Mangin, Antoine; Datcu, Mihai; Vignudelli, Stefano; Illuzzi, Diomede; Craciunescu, Vasile; Aspetsberger, Michael

    2017-04-01

    The Coastal Thematic Exploitation Platform (C-TEP) is an on-going European Space Agency (ESA) funded project to develop a web service dedicated to the observation of the coastal environment and to support coastal management and monitoring. For over 20 years ESA satellites have provided a wealth of environmental data. The availability of an ever-increasing volume of environmental data from satellite remote sensing provides a unique opportunity for exploratory science and the development of coastal applications. However, the diversity and complexity of the Earth Observation (EO) data available, and the need for efficient data access, information extraction, data management and high-specification processing tools, pose major challenges to achieving its full potential in terms of Big Data exploitation. C-TEP will provide a new means to handle the technical challenges of observing coastal areas and will contribute to improved understanding and decision-making with respect to coastal resources and environments. C-TEP will unlock coastal knowledge and innovation as a collaborative, virtual work environment providing access to a comprehensive database of coastal EO data, in-situ data, model data and the tools and processors necessary to fully exploit these vast and heterogeneous datasets. The cloud processing capabilities provided allow users to perform heavy processing tasks through a user-friendly Graphical User Interface (GUI). A connection to the PEPS (Plateforme pour l'Exploitation des Produits Sentinel) archive will provide data from Sentinel missions 1, 2 and 3. Automatic comparison tools will be provided to exploit the in-situ datasets in synergy with EO data. In addition, users may develop, test and share their own advanced algorithms for the extraction of coastal information. Algorithm validation will be facilitated by the capability to compute statistics over long time-series. 
Finally, C-TEP subscription services will allow users to perform automatic monitoring of key indicators (water quality, water level, vegetation stress) from Near Real Time data. To demonstrate the benefits of C-TEP, three pilot cases have been implemented, each addressing specific and highly topical coastal research needs. These applications include change detection in land and seabed cover, water quality monitoring and reporting, and a coastal altimetry processor. The pilot cases demonstrate the wide scope of C-TEP and how it may contribute to European projects and international coastal networks. In conclusion, C-TEP aims to provide new services and tools which will revolutionise access to EO datasets, support multi-disciplinary research collaboration, and provide long-term data series and innovative services for the monitoring of coastal regions.

  7. Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.

    NASA Astrophysics Data System (ADS)

    Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.

    The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervisory control of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.
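The scheduler/logger pattern described above can be sketched in miniature. This is a hypothetical illustration of message-based coordination, not the OOPS code: a scheduler broadcasts a status request to instrument handlers, each of which replies into a shared log, standing in for the logger process.

```python
import queue
import threading

class Instrument:
    """One networked instrument controller with a message inbox."""
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()

    def serve_once(self, log):
        msg = self.inbox.get()              # blocking receive of one message
        if msg["cmd"] == "status":
            log.append({"from": self.name, "status": "ok"})

def run_observation(instruments):
    log = []                                # stands in for the logger process
    threads = [threading.Thread(target=i.serve_once, args=(log,))
               for i in instruments]
    for t in threads:
        t.start()
    for inst in instruments:                # scheduler: broadcast a request
        inst.inbox.put({"cmd": "status"})
    for t in threads:
        t.join()
    return log

instruments = [Instrument(n) for n in ("telescope", "ccd", "autoguider")]
status_log = run_observation(instruments)
```

The same request/reply shape extends naturally to commands ("expose", "slew") with the scheduler supervising the overall observation sequence.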

  8. System of Monitoring Potential Deceased Organ Donations in Over 200 Hospitals in Poland Using a Web Tool: Implementation and Structure.

    PubMed

    Danek, T; Protasiuk, R; Mańkowski, M; Brutkiewicz, A; Trześniewski, R; Podlińska, I; Milecka, A; Jonas, M; Danielewicz, R; Czerwiński, J

    2016-06-01

    In 2010 the formation of the Polish Hospitals Network of Organ Donation Coordinators, originated by Poltransplant, began. One of the goals of this project is to report all deaths in hospital ICUs in which a coordinator is posted. The aim of this strategy is to monitor donation potential, following the recruitment process of potential donors and indicating stages of that process that may be improved to increase effective recruitment. Until the end of 2014 all data were forwarded to Poltransplant as Excel files, but since January 1, 2015, reporting and data collection have been performed using the web tool www.koordynator.net. The aim of the paper is to present the essential functioning principles, structure, and usage of the www.koordynator.net system and its technical construction, and to display good practices (know-how) tested by one country for other countries that, like Poland, contend with organ shortages. The application www.koordynator.net allows for the remote addition of individual records with information about deceased patients in hospital ICUs, the forwarding of data about potential and actual organ donors, the generation of complete monthly reports about deceased patients in each hospital, and the introduction of historical data. The introduction of a potential donation monitoring system in 209 hospitals with transplant coordinators increases the number of identified potential and actual donors due to self-assessment analysis. Eventually, the www.koordynator.net reporting system allowed for external evaluation by coordinators from other hospitals, regional coordinators, and Poltransplant. The system is a modern tool that improves and strengthens the quality system in the organ donation field (quality assurance program). Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Performance of an image analysis processing system for hen tracking in an environmental preference chamber.

    PubMed

    Kashiha, Mohammad Amin; Green, Angela R; Sales, Tatiana Glogerley; Bahr, Claudia; Berckmans, Daniel; Gates, Richard S

    2014-10-01

    Image processing systems have been widely used in monitoring livestock for many applications, including identification, tracking, behavior analysis, occupancy rates, and activity calculations. The primary goal of this work was to quantify image processing performance when monitoring laying hens by comparing the length of stay in each compartment as detected by the image processing system with the actual occurrences registered by human observations. In this work, an image processing system was implemented and evaluated for use in an environmental animal preference chamber to detect hen navigation among the 4 compartments of the chamber. One camera was installed above each compartment to produce top-view images of the whole compartment. An ellipse-fitting model was applied to captured images to detect whether the hen was present in a compartment. During a choice-test study, mean ± SD success detection rates of 95.9 ± 2.6% were achieved when considering total duration of compartment occupancy. These results suggest that the image processing system is currently suitable for determining the response measures for assessing environmental choices. Moreover, the image processing system offered a comprehensive analysis of occupancy while substantially reducing data processing time compared with the time-intensive alternative of manual video analysis. The above technique was used to monitor ammonia aversion in the chamber. As a preliminary pilot study, different levels of ammonia were applied to different compartments while hens were allowed to navigate between compartments. Using the automated monitoring tool to assess occupancy, a negative trend of compartment occupancy with ammonia level was revealed, though further examination is needed. ©2014 Poultry Science Association Inc.
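The occupancy logic above can be sketched as follows. This is an illustrative stand-in, not the published pipeline: the study fits an ellipse to the hen's silhouette, whereas this minimal numpy version uses a simple foreground-area test per top-view frame, with hypothetical names and thresholds.

```python
import numpy as np

MIN_HEN_AREA = 50          # minimum foreground pixels to count as a hen (hypothetical)

def is_occupied(frame, threshold=0.5):
    """Return True if the thresholded top-view frame contains a hen-sized blob."""
    mask = frame > threshold
    return int(mask.sum()) >= MIN_HEN_AREA

def occupancy_durations(frames_per_compartment):
    """Per-compartment count of frames in which the compartment is occupied."""
    return [sum(is_occupied(f) for f in frames)
            for frames in frames_per_compartment]

# Synthetic example: compartment 0 contains a bright 10x10 "hen" blob in
# every frame, compartment 1 stays empty.
hen_frame = np.zeros((64, 64))
hen_frame[20:30, 20:30] = 1.0
empty_frame = np.zeros((64, 64))
durations = occupancy_durations([[hen_frame] * 5, [empty_frame] * 5])
```

Summing occupied frames per compartment and multiplying by the frame interval yields the length-of-stay measure compared against human observation in the study.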

  10. Advances in industrial biopharmaceutical batch process monitoring: Machine-learning methods for small data problems.

    PubMed

    Tulsyan, Aditya; Garvin, Christopher; Ündey, Cenk

    2018-04-06

    Biopharmaceutical manufacturing comprises multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real-time. State-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses the risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of the authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios. © 2018 Wiley Periodicals, Inc.
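The "Low-N" idea of synthesizing in silico batches can be illustrated crudely. The paper's actual method combines hardware exploitation with machine learning; this hedged stand-in merely resamples the few historical batch trajectories and perturbs them with noise scaled to the observed batch-to-batch variability, so an arbitrarily large synthetic set shares the history's statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_in_silico(batches, n_new):
    """batches: (n_batches, n_timepoints) array of one monitored variable.

    Returns n_new synthetic trajectories: bootstrap picks of historical
    batches plus Gaussian noise matched to the per-timepoint spread.
    """
    batches = np.asarray(batches, dtype=float)
    spread = batches.std(axis=0)                     # batch-to-batch variability
    picks = rng.integers(0, len(batches), size=n_new)
    noise = rng.normal(0.0, 1.0, size=(n_new, batches.shape[1])) * spread
    return batches[picks] + noise

# Three historical batches of a single process variable at three timepoints.
history = np.array([[1.0, 2.0, 3.0],
                    [1.1, 2.1, 3.2],
                    [0.9, 1.9, 2.9]])
synthetic = generate_in_silico(history, n_new=200)
```

A multivariate BPM model trained on such a synthetic set could then serve until enough real batches accumulate, which is the role the in silico batches play in the article.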

  11. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It was therefore necessary to bring portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether the ionisation chamber can be replaced by portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, the reference detector for absolute dose measurement, and with portal dosimetry for the verification of dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head-and-neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. 
The study demonstrated that the time devoted to pretreatment controls can be reduced by substituting EPID measurements for the ionisation chamber measurements, and that statistical process control monitoring of the data provides an additional safety guarantee. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
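The control-chart machinery described above can be sketched in a few lines (illustrative numbers, not the study's data): a baseline of in-control QA measurements sets the centre line and 3-sigma limits of a Shewhart individuals chart, and subsequent measurements are flagged when they fall outside.

```python
import numpy as np

def control_limits(baseline):
    """Centre line and 3-sigma control limits from an in-control baseline."""
    baseline = np.asarray(baseline, dtype=float)
    centre = baseline.mean()
    sigma = baseline.std(ddof=1)
    return centre - 3 * sigma, centre, centre + 3 * sigma

def out_of_control(values, lcl, ucl):
    """Flag each new measurement that falls outside the control limits."""
    return [bool(v < lcl or v > ucl) for v in np.asarray(values, dtype=float)]

# Baseline dose-agreement measurements (%), then a batch with one drift.
baseline = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.7, 100.1]
lcl, centre, ucl = control_limits(baseline)
flags = out_of_control([100.0, 99.9, 102.5], lcl, ucl)
```

Run charts like this detect the "strong and fast" drifts; the slow, weak drifts mentioned in the abstract are typically caught by supplementary run rules (e.g. several consecutive points on one side of the centre line).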

  12. Influence of in line monitored fluid bed granulation process parameters on the stability of Ethinylestradiol.

    PubMed

    Roßteuscher-Carl, Katrin; Fricke, Sabine; Hacker, Michael C; Schulz-Siegmund, Michaela

    2015-12-30

    Ethinylestradiol (EE), a highly active and low-dosed compound, is prone to oxidative degradation. The stability of the drug substance is therefore a critical parameter that has to be considered during drug formulation. Besides the stability of the drug substance, granule particle size and moisture are critical quality attributes (CQA) of the fluid bed granulation process which influence the tableting ability of the resulting granules. Both CQA should therefore be monitored during the production process by process analytical technology (PAT) according to ICH Q8. This work focusses on the effects of drying conditions on the stability of EE in a fluid-bed granulation process. We quantified the EE degradation products 6-alpha-hydroxy-EE, 6-beta-hydroxy-EE, 9(11)-dehydro-EE and 6-oxo-EE during long-term storage and under accelerated conditions. PAT tools that monitor granule particle size (spatial filtering technology) and granule moisture (microwave resonance technology) were applied and compared with off-line methods. We found a relevant influence of residual granule moisture and of the thermal stress applied during granulation on the storage stability of EE, whereas no degradation was found immediately after processing. Hence we conclude that drying parameters have a relevant influence on long-term EE stability. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Midlevel Maternity Providers' Preferences of a Childbirth Monitoring Tool in Low-Income Health Units in Uganda.

    PubMed

    Balikuddembe, Michael S; Wakholi, Peter K; Tumwesigye, Nazarius M; Tylleskär, Thorkild

    2018-01-01

    A third of women in childbirth are inadequately monitored, partly due to the tools used. Some stakeholders assert that the current labour monitoring tools are not efficient and need improvement to become more relevant to childbirth attendants. The study objective was to explore the expectations of maternity service providers for a mobile childbirth monitoring tool in maternity facilities in a low-income country like Uganda. Semi-structured interviews of purposively selected midwives and doctors in rural and urban childbirth facilities in Uganda were conducted before thematic data analysis. The childbirth providers expected a tool that enables fast and secure childbirth record storage and sharing. They desired a tool that would automatically and conveniently register patient clinical findings, and actively provide interactive clinical decision support on a busy ward. The tool ought to support agreed-upon standards for good pregnancy outcomes but also be adaptable to the patient and to the providers' difficult working conditions. The tool's functionality should include clinical data management and real-time decision support to the midwives, while the non-functional attributes include versatility and security.

  14. Regional input to joint European space weather service

    NASA Astrophysics Data System (ADS)

    Stanislawska, I.; Belehaki, A.; Jansen, F.; Heynderickx, D.; Lilensten, J.; Candidi, M.

    The basis for elaborating a European space weather service within the COST 724 Action, "Developing the scientific basis for monitoring, modelling and predicting Space Weather", is enriched by many national and international activities which provide instruments and tools for global as well as regional monitoring and modelling. COST 724 stimulates, coordinates and supports Europe's goals of development and global cooperation by providing standards for timely and high-quality information and knowledge in space weather. Existing local capabilities are taken into account to develop synergies and avoid duplication. The enhancement of environment-monitoring networks and associated instrument technology yields mutual advantages for the European service and for regional services specialized for local users' needs. It structurally increases the integration of limited-area services and generates a platform employing the same approach to each task, differing mostly in input and output data. In doing so it also provides a complementary description of the environmental state within the issued information. A general scheme of the regional-services concept within the COST 724 activity is the processing chain from measurements through algorithms to operational knowledge. It provides a platform for interaction among the local end users, who define what kind of information they need; the system providers, who elaborate the tools necessary to obtain the required information; and the local service providers, who do the actual processing of data and tailor it to specific users' needs. Such an initiative creates a unique possibility for small

  15. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  16. Challenges in atmospheric monitoring of areal emission sources - an Open-path Fourier transform infrared (OP-FTIR) spectroscopic experience report

    NASA Astrophysics Data System (ADS)

    Schuetze, C.; Sauer, U.; Dietrich, P.

    2015-12-01

    Reliable detection and assessment of near-surface CO2 emissions from natural or anthropogenic sources require the application of various monitoring tools at different spatial scales. In particular, optical remote sensing tools for atmospheric monitoring have the potential to measure CO2 emissions integrally over larger scales (>10,000 m2). Within the framework of the MONACO project ("Monitoring approach for geological CO2 storage sites using a hierarchical observation concept"), an integrative hierarchical monitoring concept was developed and validated at different field sites with the aim of establishing a modular observation strategy, including investigations in the shallow subsurface, at ground surface level and in the lower atmospheric boundary layer. The main aims of the atmospheric monitoring using optical remote sensing were the observation of gas dispersion into the near-surface atmosphere, the determination of maximum concentration values, and the identification of the main challenges associated with monitoring extended emission sources with the proposed methodological set-up under typical environmental conditions. The presentation will give an overview of several case studies using the integrative approach of Open-Path Fourier Transform Infrared spectroscopy (OP-FTIR) in combination with in situ measurements. As a main result, the method was validated as a possible approach for continuous monitoring of the atmospheric composition, in terms of the integral determination of GHG concentrations, and for identifying target areas that need to be investigated in more detail. Data interpretation, in particular, should closely consider the micrometeorological conditions. Technical aspects concerning robust equipment, experimental set-up and fast data processing algorithms have to be taken into account for the enhanced automation of atmospheric monitoring.

  17. Water cooler towers and other man-made aquatic systems as environmental collection systems for agents of concern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brigmon, Robin; Kingsley, Mark T.

    An apparatus and process are provided for using existing process water sources, such as cooling towers, fountains, and waterfalls, as a monitoring system for the detection of environmental agents which may be present in the environment. The process water is associated with structures whose materials have an inherent filtering or absorbing capability, and it can therefore be used as a rapid screening tool for qualitative and quantitative assessment of environmental agents.

  18. Low cost composite manufacturing utilizing intelligent pultrusion and resin transfer molding (IPRTM)

    NASA Astrophysics Data System (ADS)

    Bradley, James E.; Wysocki, Tadeusz S., Jr.

    1993-02-01

    This article describes an innovative method for the economical manufacturing of large, intricately-shaped tubular composite parts. Proprietary intelligent process control techniques are combined with standard pultrusion and RTM methodologies to provide high part throughput, performance, and quality while substantially reducing scrap, rework costs, and labor requirements. On-line process monitoring and control is achieved through a smart tooling interface consisting of modular zone tiles installed on part-specific die assemblies. Real-time archiving of process run parameters provides enhanced SPC and SQC capabilities.

  19. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques.

    PubMed

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Palumbo, Davide; De Finis, Rosa; Galietti, Umberto

    2017-10-11

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages with respect to traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used for studying and optimizing the FSW process, applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated, and two thermal indexes correlated to the frictional power input, the maximum temperature and the heating rate of the material, were investigated for different configurations of the process parameters (the tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs and destructive tensile tests) were used to support the analysis of the quality of the welded joints in a quantitative way. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength.

  20. Spinoff 2009

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Topics covered include: Image-Capture Devices Extend Medicine's Reach; Medical Devices Assess, Treat Balance Disorders; NASA Bioreactors Advance Disease Treatments; Robotics Algorithms Provide Nutritional Guidelines; "Anti-Gravity" Treadmills Speed Rehabilitation; Crew Management Processes Revitalize Patient Care; Hubble Systems Optimize Hospital Schedules; Web-based Programs Assess Cognitive Fitness; Electrolyte Concentrates Treat Dehydration; Tools Lighten Designs, Maintain Structural Integrity; Insulating Foams Save Money, Increase Safety; Polyimide Resins Resist Extreme Temperatures; Sensors Locate Radio Interference; Surface Operations Systems Improve Airport Efficiency; Nontoxic Resins Advance Aerospace Manufacturing; Sensors Provide Early Warning of Biological Threats; Robots Save Soldiers' Lives Overseas (MarcBot); Apollo-Era Life Raft Saves Hundreds of Sailors; Circuits Enhance Scientific Instruments and Safety Devices; Tough Textiles Protect Payloads and Public Safety Officers; Forecasting Tools Point to Fishing Hotspots; Air Purifiers Eliminate Pathogens, Preserve Food; Fabrics Protect Sensitive Skin from UV Rays; Phase Change Fabrics Control Temperature; Tiny Devices Project Sharp, Colorful Images; Star-Mapping Tools Enable Tracking of Endangered Animals; Nanofiber Filters Eliminate Contaminants; Modeling Innovations Advance Wind Energy Industry; Thermal Insulation Strips Conserve Energy; Satellite Respondent Buoys Identify Ocean Debris; Mobile Instruments Measure Atmospheric Pollutants; Cloud Imagers Offer New Details on Earth's Health; Antennas Lower Cost of Satellite Access; Feature Detection Systems Enhance Satellite Imagery; Chlorophyll Meters Aid Plant Nutrient Management; Telemetry Boards Interpret Rocket, Airplane Engine Data; Programs Automate Complex Operations Monitoring; Software Tools Streamline Project Management; Modeling Languages Refine Vehicle Design; Radio Relays Improve Wireless Products; Advanced Sensors Boost Optical 
Communication, Imaging; Tensile Fabrics Enhance Architecture Around the World; Robust Light Filters Support Powerful Imaging Devices; Thermoelectric Devices Cool, Power Electronics; Innovative Tools Advance Revolutionary Weld Technique; Methods Reduce Cost, Enhance Quality of Nanotubes; Gauging Systems Monitor Cryogenic Liquids; Voltage Sensors Monitor Harmful Static; and Compact Instruments Measure Heat Potential.

  1. Monitoring Quality of Biotherapeutic Products Using Multivariate Data Analysis.

    PubMed

    Rathore, Anurag S; Pathak, Mili; Jain, Renu; Jadaun, Gaurav Pratap Singh

    2016-07-01

    Monitoring the quality of pharmaceutical products is a global challenge, heightened by the public-safety implications of letting sub-quality drugs reach the market. Regulatory agencies do their due diligence at the time of approval as per their prescribed regulations. However, product quality needs to be monitored post-approval as well, to ensure patient safety throughout the product life cycle. This is particularly complicated for biotechnology-based therapeutics, where seemingly minor changes in process and/or raw material attributes have been shown to have a significant effect on the clinical safety and efficacy of the product. This article provides a perspective on the topic of monitoring the quality of biotech therapeutics. Against the backdrop of the challenges faced by the regulatory agencies, the potential use of multivariate data analysis as a tool for effective monitoring is proposed. Case studies using data from several insulin biosimilars are used to illustrate the key concepts.
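A common form of the multivariate monitoring proposed here is PCA with a Hotelling-style T-squared statistic. The sketch below is illustrative only (synthetic data, not the insulin biosimilar case studies): principal components are fitted on reference samples, and a new sample with an unusually large T2 departs from the reference quality profile.

```python
import numpy as np

def fit_pca(X, n_comp):
    """Fit PCA by SVD: returns mean, loadings, and component variances."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_comp].T                       # loadings (columns)
    var = (s[:n_comp] ** 2) / (len(X) - 1)  # variance captured per component
    return mu, P, var

def t_squared(x, mu, P, var):
    """Hotelling T2 of one sample in the retained component subspace."""
    scores = (x - mu) @ P
    return float(np.sum(scores ** 2 / var))

# Synthetic reference set: three perfectly correlated "quality attributes".
t = np.linspace(-2.0, 2.0, 9)
X = np.column_stack([t, 2.0 * t, -t])
mu, P, var = fit_pca(X, n_comp=1)
t2_typical = t_squared(X[0], mu, P, var)       # a reference sample
t2_outlier = t_squared(5.0 * X[0], mu, P, var) # an exaggerated deviation
```

In practice the T2 values are compared against a control limit derived from an F-distribution, so that batches exceeding the limit are flagged for investigation.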

  2. Quality-by-design III: application of near-infrared spectroscopy to monitor roller compaction in-process and product quality attributes of immediate release tablets.

    PubMed

    Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W

    2015-02-01

    The objective of this study is to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified in our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least squares (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing the binder source did not statistically influence the quality attributes of the granules and tablets. However, lubricant type significantly impacted the granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models resulted in higher SEP for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors which could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics as a powerful tool to monitor the critical quality attributes (CQA) identified during formulation development.
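The SEP metric used above to judge the PLS models is straightforward to compute. The sketch below uses illustrative numbers (not the study's data) and the common bias-corrected definition: the standard deviation of the prediction residuals after removing their mean bias.

```python
import numpy as np

def sep(reference, predicted):
    """Bias-corrected standard error of prediction (SEP)."""
    residuals = np.asarray(predicted, dtype=float) - np.asarray(reference, dtype=float)
    bias = residuals.mean()
    return float(np.sqrt(np.sum((residuals - bias) ** 2) / (len(residuals) - 1)))

# Illustrative reference values (e.g. crushing force) vs. model predictions.
reference = [10.0, 11.0, 12.0, 13.0]
predicted = [10.2, 10.9, 12.1, 13.2]
sep_value = sep(reference, predicted)
```

Because the bias is removed, SEP isolates random prediction error; a systematic offset between sites (as seen for the transferred batches) shows up as bias rather than inflating SEP directly, which is why both are usually reported together.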

  3. SU-F-P-04: Implementation of Dose Monitoring Software: Successes and Pitfalls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Och, J

    2016-06-15

    Purpose: To successfully install a dose monitoring software (DMS) application to assist in CT protocol and dose management. Methods: Upon selecting the DMS, we began our implementation of the application. A working group composed of Medical Physics, Radiology Administration, Information Technology, and CT technologists was formed. On-site training in the application was supplied by the vendor. The decision was made to apply the process to all CT protocols on all platforms at all facilities. Protocols were painstakingly mapped to the correct masters, and the system went 'live'. Results: We are routinely using the DMS as a tool in our Clinical Performance CT QA program. It is useful in determining the effectiveness of revisions to existing protocols and in establishing performance baselines for new units. However, the implementation was not without difficulty. We identified several pitfalls and obstacles which frustrated progress, including training deficiencies, nomenclature problems, communication, and DICOM variability. Conclusion: Dose monitoring software can be a potent tool for QA. However, implementation of the program can be problematic and requires planning, organization and commitment.

  4. Volcanic alert system (VAS) developed during the 2011-2014 El Hierro (Canary Islands) volcanic process

    NASA Astrophysics Data System (ADS)

    García, Alicia; Berrocoso, Manuel; Marrero, José M.; Fernández-Ros, Alberto; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramón

    2014-06-01

    The 2011 volcanic unrest at El Hierro Island illustrated the need for a Volcanic Alert System (VAS) specifically designed for the management of volcanic crises developing after long repose periods. The VAS comprises the monitoring network, the software tools for analysis of the monitoring parameters, the Volcanic Activity Level (VAL) management, and the assessment of hazard. The VAS presented here focuses on phenomena related to moderate eruptions, and on potentially destructive volcano-tectonic earthquakes and landslides. We introduce a set of new data analysis tools, aimed at detecting data trend changes as well as spurious signals related to instrumental failure. When data-trend changes and/or malfunctions are detected, a watchdog is triggered, issuing a watch-out warning (WOW) to the Monitoring Scientific Team (MST). The changes in data patterns are then translated by the MST into a VAL that is easy to use and understand by scientists, technicians, and decision-makers. Although the VAS was designed specifically for the unrest episodes at El Hierro, the methodologies may prove useful at other volcanic systems.
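The watchdog idea above can be sketched minimally. All names and thresholds here are hypothetical, not the VAS implementation: each new monitoring sample is compared against a recent baseline window; a flat window suggests instrumental failure, a large jump suggests a data trend change, and either condition issues a watch-out warning (WOW).

```python
import statistics

def check_sample(window, value, k=4.0):
    """Return a WOW message, or None if the new sample looks normal."""
    mean = statistics.fmean(window)
    sd = statistics.stdev(window)
    if sd == 0.0:
        # A perfectly flat signal often means a stuck or failed sensor.
        return "WOW: flat signal, possible instrumental failure"
    if abs(value - mean) > k * sd:
        return "WOW: data trend change"
    return None

window = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8, 1.1, 0.9]
wow = check_sample(window, 9.0)      # sudden jump in the monitored parameter
ok = check_sample(window, 1.05)      # within normal variability
```

In the VAS these automatic warnings only alert the MST; translating a warning into a change of VAL remains a human decision, as the abstract describes.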

  5. Process auditing and performance improvement in a mixed wastewater-aqueous waste treatment plant.

    PubMed

    Collivignarelli, Maria Cristina; Bertanza, Giorgio; Abbà, Alessandro; Damiani, Silvestro

    2018-02-01

    The wastewater treatment process is based on complex chemical, physical and biological mechanisms that are closely interconnected. The efficiency of the system (which depends on compliance with national regulations on wastewater quality) can be achieved through the use of tools such as monitoring, that is, the detection of parameters that allow continuous interpretation of the current situation, and experimental tests, which allow the measurement of real performance (of a sector, a single treatment or a piece of equipment) and comparison with subsequent results. Experimental tests are particularly relevant in the case of municipal wastewater treatment plants fed with a strong industrial component, and especially in the case of plants authorized to treat aqueous waste. In this paper a case study is presented in which the application of management tools such as careful monitoring and experimental tests led to the technical and economic optimization of the plant: the main results obtained were the reduction of sludge production (from 4,000 t/year w.w. (wet weight) to about 2,200 t/year w.w.) and of operating costs (e.g. from 600,000 €/year down to about 350,000 €/year for reagents), the increase of resource recovery and the improvement of the overall process performance.

  6. Feasibility of Raman spectroscopy as PAT tool in active coating.

    PubMed

    Müller, Joshua; Knop, Klaus; Thies, Jochen; Uerpmann, Carsten; Kleinebudde, Peter

    2010-02-01

    Active coating is a specific application of film coating in which the active ingredient is contained in the coating layer. This is a challenging operation with regard to achieving the desired amount of coating and coating uniformity. To guarantee the quality of such dosage forms it is desirable to develop a tool that is able to monitor the coating operation and detect the end of the process. Coating experiments were performed in which the model drug diprophylline was coated in a pan coater onto placebo tablets and onto tablets containing the active ingredient itself. During the active coating, Raman spectra were recorded in-line. The spectral measurements were correlated with the average weight gain and the amount of coated active ingredient at each time point. The developed chemometric model was tested by monitoring further coated batches. Furthermore, the effects of pan rotation speed and working distance on the acquired Raman signal, and hence on the chemometric model, were examined. Besides coating on placebo cores, it was possible to determine the amount of active ingredient in the film when coated onto cores containing the same active ingredient. In addition, the method remains applicable when varying the process parameters and measurement conditions within a restricted range. Raman spectroscopy is thus an appropriate process analytical technology tool.
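The end-point detection idea can be illustrated with a toy univariate calibration: an in-line signal intensity is regressed against the coated amount, and coating stops once the predicted amount reaches a target. The intensity data, target value, and function names are invented for illustration; the study used a multivariate chemometric model, not this single-band fit.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b (univariate calibration)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Calibration set (illustrative): in-line signal intensity (a.u.) vs.
# coated drug per tablet (mg), e.g. from gravimetric reference samples.
intensity = [0.10, 0.31, 0.52, 0.70, 0.91]
coated_mg = [1.0, 3.0, 5.0, 7.0, 9.0]
a, b = fit_line(intensity, coated_mg)

def predicted_amount(signal):
    """Map an in-line signal reading to a predicted coated amount (mg)."""
    return a * signal + b

def endpoint_reached(signal, target_mg=8.0):
    """Stop the coating run once the predicted coated amount hits the target."""
    return predicted_amount(signal) >= target_mg
```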

  7. Real-time process monitoring in a semi-continuous fluid-bed dryer - microwave resonance technology versus near-infrared spectroscopy.

    PubMed

    Peters, Johanna; Teske, Andreas; Taute, Wolfgang; Döscher, Claas; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg

    2018-02-15

    The trend towards continuous manufacturing in the pharmaceutical industry is associated with an increasing demand for advanced control strategies. It is a mandatory requirement to obtain reliable real-time information on critical quality attributes (CQA) during every process step, as the decision on diversion of material needs to be performed quickly and automatically. Where possible, production equipment should provide redundant systems for in-process control (IPC) measurements to ensure continuous process monitoring even if one of the systems is not available. In this paper, two methods for real-time monitoring of granule moisture in a semi-continuous fluid-bed drying unit are compared. While near-infrared (NIR) spectroscopy has already proven to be a suitable process analytical technology (PAT) tool for moisture measurements in fluid-bed applications, microwave resonance technology (MRT) until recently had difficulties monitoring moisture levels above 8%. The results indicate that the newly developed MRT sensor operating at four resonances is capable of competing with NIR spectroscopy. While NIR spectra were preprocessed by mean centering and a first derivative before partial least squares (PLS) regression was applied to build predictive models (RMSEP = 0.20%), the microwave moisture values of two resonances sufficed to build a statistically sound multiple linear regression (MLR) model (RMSEP = 0.07%) for moisture prediction. It could thereby be verified that moisture monitoring by MRT sensor systems is a valuable alternative to NIR spectroscopy, or can serve as a redundant system providing great ease of application. Copyright © 2017 Elsevier B.V. All rights reserved.
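The MLR side of the comparison can be sketched as a small normal-equations fit of moisture against readings from two resonances. The calibration data, variable names, and coefficients below are illustrative assumptions, not the study's fitted model; a real RMSEP would be computed on held-out validation samples.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Calibration data (illustrative): shifts of two microwave resonances vs.
# reference loss-on-drying moisture (%) for a set of granule samples.
resonance = [(0.12, 0.05), (0.25, 0.11), (0.41, 0.17), (0.58, 0.25), (0.74, 0.33)]
moisture = [2.0, 4.1, 6.2, 8.4, 10.5]

# Multiple linear regression with intercept: m ≈ b0 + b1*r1 + b2*r2,
# fitted via the normal equations X^T X beta = X^T y.
X = [(1.0, r1, r2) for r1, r2 in resonance]
XtX = [[sum(xi[a] * xi[b] for xi in X) for b in range(3)] for a in range(3)]
Xty = [sum(xi[a] * yi for xi, yi in zip(X, moisture)) for a in range(3)]
beta = solve(XtX, Xty)

def predict_moisture(r1, r2):
    """Predict granule moisture (%) from two resonance readings."""
    return beta[0] + beta[1] * r1 + beta[2] * r2
```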

  8. Three-dimensional non-destructive optical evaluation of laser-processing performance using optical coherence tomography.

    PubMed

    Kim, Youngseop; Choi, Eun Seo; Kwak, Wooseop; Shin, Yongjin; Jung, Woonggyu; Ahn, Yeh-Chan; Chen, Zhongping

    2008-06-01

    We demonstrate the use of optical coherence tomography (OCT) as a non-destructive diagnostic tool for evaluating laser-processing performance by imaging the features of a pit and a rim. A pit formed on a material under different laser-processing conditions is imaged using both a conventional scanning electron microscope (SEM) and OCT. Then, using the corresponding images, the geometrical characteristics of the pit are analyzed and compared. From the results, we verify the feasibility and potential of applying OCT to the monitoring of laser-processing performance.

  9. Three-dimensional non-destructive optical evaluation of laser-processing performance using optical coherence tomography

    PubMed Central

    Kim, Youngseop; Choi, Eun Seo; Kwak, Wooseop; Shin, Yongjin; Jung, Woonggyu; Ahn, Yeh-Chan; Chen, Zhongping

    2014-01-01

    We demonstrate the use of optical coherence tomography (OCT) as a non-destructive diagnostic tool for evaluating laser-processing performance by imaging the features of a pit and a rim. A pit formed on a material under different laser-processing conditions is imaged using both a conventional scanning electron microscope (SEM) and OCT. Then, using the corresponding images, the geometrical characteristics of the pit are analyzed and compared. From the results, we verify the feasibility and potential of applying OCT to the monitoring of laser-processing performance. PMID:24932051

  10. Individual human cell responses to low doses of chemicals studied by synchrotron infrared spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Holman, Hoi-Ying N.; Goth-Goldstein, Regine; Blakely, Elanor A.; Bjornstad, Kathy; Martin, Michael C.; McKinney, Wayne R.

    2000-05-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR-FTIR microscopy probes intact living cells, providing a composite view of the entire molecular response and the ability to monitor the response over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of chemicals. In this study we used high-spatial-resolution SR-FTIR vibrational spectromicroscopy as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of dioxin. This technique has the potential to distinguish changes arising from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self-assembly of cellular apparatus, cell differentiation and fetal development.

  11. Model Based Optimal Sensor Network Design for Condition Monitoring in an IGCC Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Rajeeva; Kumar, Aditya; Dai, Dan

    2012-12-31

    This report summarizes the achievements and final results of this program. The objective of this program is to develop a general model-based sensor network design methodology and tools to address key issues in the design of an optimal sensor network configuration: the type, location and number of sensors used in a network for online condition monitoring. In particular, the focus in this work is to develop software tools for optimal sensor placement (OSP) and use these tools to design optimal sensor network configurations for online condition monitoring of gasifier refractory wear and radiant syngas cooler (RSC) fouling. The methodology developed will be applicable to sensing system design for online condition monitoring in a broad range of applications. The overall approach consists of (i) defining condition monitoring requirements in terms of OSP and mapping these requirements into mathematical terms for the OSP algorithm, (ii) analyzing trade-offs of alternate OSP algorithms, down-selecting the most relevant ones and developing them for IGCC applications, (iii) enhancing the gasifier and RSC models as required by the OSP algorithms, and (iv) applying the developed OSP algorithms to design the optimal sensor network required for condition monitoring of IGCC gasifier refractory wear and RSC fouling. Two key requirements of OSP for condition monitoring are the desired precision for the monitoring variables (e.g. refractory wear) and the reliability of the proposed sensor network in the presence of expected sensor failures. The OSP problem is naturally posed within a Kalman filtering approach as an integer programming problem where the key requirements of precision and reliability are imposed as constraints, and the optimization is performed over the overall network cost. Based on an extensive literature survey, two formulations were identified as being relevant to OSP for condition monitoring: one based on an LMI formulation and the other a standard INLP formulation. Various algorithms to solve these two formulations were developed and validated. For a given OSP problem the computational efficiency largely depends on the “size” of the problem. Initially, a simplified 1-D gasifier model assuming axial and azimuthal symmetry was used to test the various OSP algorithms. Finally, these algorithms were used to design the optimal sensor network for condition monitoring of IGCC gasifier refractory wear and RSC fouling. The sensor types and locations obtained as the solution to the OSP problem were validated using a model-based sensing approach. The OSP algorithm has been developed in modular form and has been packaged as a software tool for OSP design, in which a designer can explore various OSP design algorithms in a user-friendly way. The OSP software tool is implemented in-house in Matlab/Simulink©. The tool also uses a few optimization routines that are freely available on the World Wide Web. In addition, a modular Extended Kalman Filter (EKF) block has been developed in Matlab/Simulink© which can be utilized for model-based sensing of important process variables that are not directly measured, by combining the online sensors with model-based estimation once the hardware sensors and their locations have been finalized. The OSP algorithm details and the results of applying these algorithms to obtain optimal sensor locations for condition monitoring of gasifier refractory wear and RSC fouling profiles are summarized in this final report.
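The constrained-selection idea behind OSP can be illustrated with a toy greedy heuristic: add the sensor with the best information gain per unit cost until a precision (variance) target is met. This is neither the LMI nor the INLP formulation used in the report; the sensor names, costs, and the scalar fusion model are illustrative assumptions.

```python
def greedy_sensor_selection(candidates, var_target):
    """Pick sensors until the (toy) posterior variance of a single monitored
    variable drops below var_target, preferring the best variance reduction
    per unit cost. candidates: list of (name, cost, noise_var). Assumes
    independent sensors measuring the same scalar with a prior variance of
    1.0, so the fused variance is 1 / (1/prior + sum(1/noise_var_i))."""
    info = 1.0  # inverse of the prior variance
    chosen, total_cost = [], 0.0
    remaining = list(candidates)
    while 1.0 / info > var_target and remaining:
        # Benefit per cost: information gain divided by sensor cost.
        best = max(remaining, key=lambda s: (1.0 / s[2]) / s[1])
        remaining.remove(best)
        chosen.append(best[0])
        total_cost += best[1]
        info += 1.0 / best[2]
    return chosen, total_cost, 1.0 / info
```

The real problem adds reliability constraints under sensor failures and optimizes over spatially distributed measurements, which is what drives it into integer programming.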

  12. Tropical forests and fragmentation: A case of South Garo Hills, Meghalaya, North East India

    Treesearch

    Ashish Kumar; Bruce Marcot; Rohitkumar Patel

    2017-01-01

    This study presents an ecological assessment of tropical forests at stand and landscape levels to provide knowledge, tools, and indicators to evaluate specific diversity patterns and related ecological processes happening in these tropical forest conditions; and for monitoring landscape changes for managing forest and wildlife resources of Jhum (shifting cultivation)...

  13. Elements from Theatre Art as Learning Tools in Medical Education

    ERIC Educational Resources Information Center

    Alraek, Torild Jacobsen; Baerheim, Anders

    2005-01-01

    For the project, an actress created a patient character, staging a consultation among a group of 36 medical students. The consultation process was monitored by a teacher and was stopped by "timeout" at any critical incidence. Students reflected on possible strategies and, one at a time tried out their own or someone else's proposal. The project…

  14. The MicronEye Motion Monitor: A New Tool for Class and Laboratory Demonstrations.

    ERIC Educational Resources Information Center

    Nissan, M.; And Others

    1988-01-01

    Describes a special camera that can be directly linked to a computer that has been adapted for studying movement. Discusses capture, processing, and analysis of two-dimensional data with either IBM PC or Apple II computers. Gives examples of a variety of mechanical tests including pendulum motion, air track, and air table. (CW)

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitanidis, Peter

    As large-scale, commercial storage projects become operational, the problem of utilizing information from diverse sources becomes more critically important. In this project, we developed, tested, and applied an advanced joint data inversion system for CO2 storage modeling with large data sets for use in site characterization and real-time monitoring. Emphasis was on the development of advanced and efficient computational algorithms for joint inversion of hydro-geophysical data, coupled with state-of-the-art forward process simulations. The developed system consists of (1) inversion tools using characterization data, such as 3D seismic survey (amplitude images), borehole log and core data, as well as hydraulic, tracer and thermal tests before CO2 injection, (2) joint inversion tools for updating the geologic model with the distribution of rock properties, thus reducing uncertainty, using hydro-geophysical monitoring data, and (3) highly efficient algorithms for directly solving the dense or sparse linear algebra systems derived from the joint inversion. The system combines methods from stochastic analysis, fast linear algebra, and high performance computing. The developed joint inversion tools have been tested through synthetic CO2 storage examples.
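The linear algebra systems mentioned in item (3) typically arise from regularized least-squares inversions of the form (G^T G + alpha*I) m = G^T d. A minimal two-parameter sketch follows; the function, data, and regularization value are illustrative, not the project's solvers, which target dense or sparse systems orders of magnitude larger.

```python
def tikhonov_2x2(G, d, alpha=1e-6):
    """Solve the regularized normal equations (G^T G + alpha*I) m = G^T d
    for a two-parameter model m, given forward operator rows G and data d.
    alpha is a small Tikhonov regularization weight (illustrative)."""
    a = sum(g[0] * g[0] for g in G) + alpha
    b = sum(g[0] * g[1] for g in G)
    c = sum(g[1] * g[1] for g in G) + alpha
    r0 = sum(g[0] * di for g, di in zip(G, d))
    r1 = sum(g[1] * di for g, di in zip(G, d))
    det = a * c - b * b
    return ((c * r0 - b * r1) / det, (a * r1 - b * r0) / det)
```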

  16. Using SFOC to fly the Magellan Venus mapping mission

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.; Leonard, Robert E., Jr.; Short, Owen G.

    1993-01-01

    Traditionally, spacecraft flight operations at the Jet Propulsion Laboratory (JPL) have been performed by teams of spacecraft experts utilizing ground software designed specifically for the current mission. The Jet Propulsion Laboratory set out to reduce the cost of spacecraft mission operations by designing ground data processing software that could be used by multiple spacecraft missions, either sequentially or concurrently. The Space Flight Operations Center (SFOC) System was developed to provide the ground data system capabilities needed to monitor several spacecraft simultaneously and provide enough flexibility to meet the specific needs of individual projects. The Magellan Spacecraft Team utilizes the SFOC hardware and software designed for engineering telemetry analysis, both real-time and non-real-time. The flexibility of the SFOC System has allowed the spacecraft team to integrate their own tools with SFOC tools to perform the tasks required to operate a spacecraft mission. This paper describes how the Magellan Spacecraft Team is utilizing the SFOC System in conjunction with their own software tools to perform the required tasks of spacecraft event monitoring as well as engineering data analysis and trending.

  17. Concepts for laser beam parameter monitoring during industrial mass production

    NASA Astrophysics Data System (ADS)

    Harrop, Nicholas J.; Maerten, Otto; Wolf, Stefan; Kramer, Reinhard

    2017-02-01

    In today's industrial mass production, lasers have become an established tool for a variety of processes. As with any other tool, mechanical or otherwise, the laser and its ancillary components are prone to wear and ageing. Monitoring these ageing processes at the full operating power of an industrial laser is challenging for a range of reasons: the damage threshold of the measurement device itself and cycle time constraints in industrial processing are just two of these challenges. Power measurement, focus spot size or full beam caustic measurements are being implemented in industrial laser systems. The scope of the measurement and the amount of data collected are limited by the above-mentioned cycle time, which in some cases can be only a few seconds. For successful integration of these measurement systems into automated production lines, the devices must be equipped with standardized communication interfaces, enabling a feedback loop from the measurement device to the laser processing systems. If necessary, these measurements can be performed before each cycle. Power is determined with either static or dynamic calorimetry, while camera and scanning systems are used for beam profile analysis. Power levels can be measured from 25 W up to 20 kW, with focus spot sizes between 10 μm and several millimeters. We will show, backed by relevant statistical data, that defects or contamination along the laser beam path can be detected with the applied measurement systems, enabling a quality control chain that prevents process defects.
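A cycle-start quality gate of the kind described, comparing measured power and focus diameter against nominal values before parts are processed, might look like the following sketch. The tolerance figures and parameter names are illustrative assumptions, not values from the paper.

```python
def beam_ok(power_w, focus_um, nominal_power_w, nominal_focus_um,
            power_tol=0.05, focus_tol=0.10):
    """Cycle-start check: compare measured laser power (W) and focus spot
    diameter (um) against nominal values. Out-of-tolerance readings flag
    contamination or component ageing in the beam path before processing
    starts. Relative tolerances (5% power, 10% focus) are illustrative."""
    return (abs(power_w - nominal_power_w) <= power_tol * nominal_power_w
            and abs(focus_um - nominal_focus_um) <= focus_tol * nominal_focus_um)
```

In an automated line, a False result would trigger the feedback loop to the laser processing system over the standardized communication interface.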

  18. Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing

    NASA Astrophysics Data System (ADS)

    Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander

    2005-09-01

    The use of infrared spectroscopy for production semiconductor process monitoring has recently evolved from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high-aspect-ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address these challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we explore the use of model-based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high-density dynamic random access memory (DRAM) chips, including the capability of the tool to characterize multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.

  19. A multimedia perioperative record keeper for clinical research.

    PubMed

    Perrino, A C; Luther, M A; Phillips, D B; Levin, F L

    1996-05-01

    To develop a multimedia perioperative recordkeeper that provides: 1. synchronous, real-time acquisition of multimedia data, 2. on-line access to the patient's chart data, and 3. advanced data analysis capabilities through integrated multimedia database and analysis applications. To minimize cost and development time, the system design utilized industry-standard hardware components and graphical software development tools. The system was configured to use a Pentium PC complemented with a variety of hardware interfaces to external data sources. These sources included physiologic monitors with data in digital, analog, video, and audio as well as paper-based formats. The development process was guided by trials in over 80 clinical cases and by the critiques of numerous users. As a result of this process, a suite of custom software applications was created to meet the design goals. The Perioperative Data Acquisition application manages data collection from a variety of physiological monitors. The Charter application provides for rapid creation of an electronic medical record from the patient's paper-based chart and investigator's notes. The Multimedia Medical Database application provides a relational database for the organization and management of multimedia data. The Triscreen application provides an integrated data analysis environment with simultaneous, full-motion data display. With recent technological advances in PC power, data acquisition hardware, and software development tools, the clinical researcher now has the ability to collect and examine a more complete perioperative record. It is hoped that the description of the MPR and its development process will assist and encourage others to advance these tools for perioperative research.

  20. Quality tracing in meat supply chains

    PubMed Central

    Mack, Miriam; Dittmer, Patrick; Veigt, Marius; Kus, Mehmet; Nehmiz, Ulfert; Kreyenschmidt, Judith

    2014-01-01

    The aim of this study was the development of a quality tracing model for vacuum-packed lamb that is applicable in different meat supply chains. Based on the development of relevant sensory parameters, the predictive model was developed by combining a linear primary model and the Arrhenius model as the secondary model. Then a process analysis was conducted to define general requirements for the implementation of the temperature-based model into a meat supply chain. The required hardware and software for continuous temperature monitoring were developed in order to use the model under practical conditions. Furthermore, a decision support tool was developed in order to use the model, in combination with the temperature monitoring equipment, as an effective tool for the improvement of quality and storage management within the meat logistics network. Over the long term, this overall procedure will support the reduction of food waste and will improve the resource efficiency of food production. PMID:24797136

  1. Quality tracing in meat supply chains.

    PubMed

    Mack, Miriam; Dittmer, Patrick; Veigt, Marius; Kus, Mehmet; Nehmiz, Ulfert; Kreyenschmidt, Judith

    2014-06-13

    The aim of this study was the development of a quality tracing model for vacuum-packed lamb that is applicable in different meat supply chains. Based on the development of relevant sensory parameters, the predictive model was developed by combining a linear primary model and the Arrhenius model as the secondary model. Then a process analysis was conducted to define general requirements for the implementation of the temperature-based model into a meat supply chain. The required hardware and software for continuous temperature monitoring were developed in order to use the model under practical conditions. Furthermore, a decision support tool was developed in order to use the model, in combination with the temperature monitoring equipment, as an effective tool for the improvement of quality and storage management within the meat logistics network. Over the long term, this overall procedure will support the reduction of food waste and will improve the resource efficiency of food production.
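The combination of a linear primary model (quality loss accumulates at a temperature-dependent rate) with an Arrhenius secondary model (that rate as a function of temperature) can be sketched as follows. The reference rate, activation energy, and quality budget are illustrative values, not the parameters fitted in the study.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def rate(temp_c, k_ref=0.5, t_ref_c=4.0, ea=80000.0):
    """Arrhenius secondary model: spoilage rate (sensory units/day) at
    temp_c, relative to the reference rate k_ref at t_ref_c. Parameter
    values are illustrative."""
    t, t_ref = temp_c + 273.15, t_ref_c + 273.15
    return k_ref * math.exp(-ea / R * (1.0 / t - 1.0 / t_ref))

def consumed_quality(temp_log, step_days):
    """Linear primary model: quality loss accumulates at rate(T) over each
    logged temperature interval (e.g. from a data logger in the chain)."""
    return sum(rate(t) * step_days for t in temp_log)

def remaining_shelf_life(temp_log, step_days, budget=10.0):
    """Days left at the reference temperature before the total quality
    budget (initial quality minus limit of acceptability) is used up."""
    used = consumed_quality(temp_log, step_days)
    return max(0.0, (budget - used) / rate(4.0))
```

A decision support tool of the kind described would run such a prediction over each logger trace to prioritize which batches to ship first.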

  2. Trends in fluorescence imaging and related techniques to unravel biological information.

    PubMed

    Haustein, Elke; Schwille, Petra

    2007-09-01

    Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work as well as for cutting-edge research, as the visual monitoring of life processes still provides some of the most compelling evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser-induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics, involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snapshot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics.

  3. Trends in fluorescence imaging and related techniques to unravel biological information

    PubMed Central

    Haustein, Elke; Schwille, Petra

    2007-01-01

    Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work as well as for cutting-edge research, as the visual monitoring of life processes still provides some of the most compelling evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser-induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics, involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snapshot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics. PMID:19404444

  4. A Portable Surface Contamination Monitor Based on the Principle of Optically Stimulated Electron Emission (OSEE)

    NASA Technical Reports Server (NTRS)

    Perey, D. F.

    1996-01-01

    Many industrial and aerospace processes involving the joining of materials require sufficient surface cleanliness to ensure proper bonding. Processes as diverse as painting, welding, or the soldering of electronic circuits will be compromised if prior inspection and removal of surface contaminants are inadequate. As process requirements become more stringent and the number of different materials and identified contaminants increases, various instruments and techniques have been developed for improved inspection. One such technique, based on the principle of Optically Stimulated Electron Emission (OSEE), has been explored for a number of years as a tool for surface contamination monitoring. Some of the benefits of OSEE are that it is non-contacting, requires little operator training, and has very high contamination sensitivity. This paper describes the development of a portable OSEE-based surface contamination monitor. The instrument is suitable for both hand-held and robotic inspections with either manual or automated control of instrument operation. In addition, instrument output data are visually displayed to the operator and may be sent to an external computer for archiving or analysis.

  5. Performance assessment of fire-sat monitoring system based on satellite time series for fire danger estimation : the experience of the pre-operative application in the Basilicata Region (Italy)

    NASA Astrophysics Data System (ADS)

    Lanorte, Antonio; Desantis, Fortunato; Aromando, Angelo; Lasaponara, Rosa

    2013-04-01

    This paper presents the results we obtained in the context of the FIRE-SAT project during the 2012 operative application of the satellite-based tools for fire monitoring. The FIRE-SAT project has been funded by the Civil Protection of the Basilicata Region in order to set up a low-cost methodology for fire danger monitoring and fire effect estimation based on satellite Earth Observation techniques. To this aim, NASA Moderate Resolution Imaging Spectroradiometer (MODIS), ASTER, and Landsat TM data were used. Novel data processing techniques have been developed by researchers of the ARGON Laboratory of the CNR-IMAA for the operative monitoring of fire. In this paper we focus only on the danger estimation model, which was fruitfully used from 2008 to 2012 as a reliable operative tool to support and optimize fire-fighting strategies, from the alert to the management of resources including fire attacks. The daily updating of fire danger is carried out using satellite MODIS images, selected for their spectral capability and their availability free of charge from the NASA web site. This makes these data sets very suitable for an effective, systematic (daily) and sustainable low-cost monitoring of large areas. The pre-operative use of the integrated model pointed out that the system properly monitors spatial and temporal variations of fire susceptibility and provides useful information on both fire severity and post-fire regeneration capability.

  6. Real-time complex event processing for cloud resources

    NASA Astrophysics Data System (ADS)

    Adam, M.; Cordeiro, C.; Field, L.; Giordano, D.; Magnoni, L.

    2017-10-01

    The ongoing integration of clouds into the WLCG raises the need for detailed health and performance monitoring of the virtual resources in order to prevent problems of degraded service and interruptions due to undetected failures. When working at scale, the existing monitoring diversity can lead to a metric overflow whereby the operators need to manually collect and correlate data from several monitoring tools and frameworks, resulting in tens of different metrics to be constantly interpreted and analyzed per virtual machine. In this paper we present an ESPER-based standalone application which is able to process complex monitoring events coming from various sources and automatically interpret the data in order to issue alarms on the status of the resources, without interfering with the actual resources and data sources. We describe how this application has been used with both commercial and non-commercial cloud activities, allowing the operators to be alerted quickly and react to misbehaving VMs and LHC experiments' workflows. We present the pattern analysis mechanisms being used, as well as the surrounding Elastic and REST API interfaces where the alarms are collected and served to users.
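A threshold-correlation rule of the kind such an engine evaluates can be sketched in a few lines. The real system expresses these patterns as ESPER EPL statements over event streams; the event fields, metric name, and thresholds below are illustrative stand-ins.

```python
from collections import defaultdict, deque

class AlarmEngine:
    """Toy complex-event-processing rule: raise an alarm for a VM when its
    load metric exceeds a threshold in N consecutive events. A minimal
    stand-in for an EPL pattern, with illustrative names and limits."""

    def __init__(self, threshold=0.9, consecutive=3):
        self.threshold = threshold
        self.consecutive = consecutive
        # Per-VM sliding window of the last N load readings.
        self.history = defaultdict(lambda: deque(maxlen=consecutive))

    def process(self, event):
        """event: dict with 'vm', 'metric', 'value'. Returns an alarm dict
        when the pattern matches, else None."""
        if event["metric"] != "load":
            return None
        window = self.history[event["vm"]]
        window.append(event["value"])
        if len(window) == self.consecutive and all(
                v > self.threshold for v in window):
            return {"vm": event["vm"], "alarm": "sustained-high-load"}
        return None
```

Keeping per-VM state inside the engine is what lets alarms be issued without touching the monitored resources or their data sources.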

  7. Intelligent composting assisted by a wireless sensing network.

    PubMed

    López, Marga; Martinez-Farre, Xavier; Casas, Oscar; Quilez, Marcos; Polo, Jose; Lopez, Oscar; Hornero, Gemma; Pinilla, Mirta R; Rovira, Carlos; Ramos, Pedro M; Borges, Beatriz; Marques, Hugo; Girão, Pedro Silva

    2014-04-01

    Monitoring the moisture and temperature of the composting process is a key factor in obtaining a quality product, beyond the quality of the raw materials. Current methodologies for monitoring these two parameters are time consuming for workers and sometimes not sufficiently reliable to help decision-making, and thus are ignored in some cases. This article describes an advance in the monitoring of the composting process through a Wireless Sensor Network (WSN), the Compo-ball system, that allows measurement of temperature and moisture in real time at multiple points in the composting material. To implement such measurement capabilities on-line, a WSN composed of multiple sensor nodes was designed and implemented to provide the staff with an efficient composting monitoring and management tool. After framing the problem, the objectives and characteristics of the WSN are briefly discussed and a short description of the hardware and software of the network's components is presented. Presentation and discussion of practical issues and results obtained with the WSN during a demonstration stage that took place at several composting sites concludes the paper. Copyright © 2014 Elsevier Ltd. All rights reserved.
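    As a rough illustration of how multi-point readings might feed decision-making, here is a minimal Python sketch; the thresholds and actions are invented, not the Compo-ball system's actual logic:

```python
# Hypothetical readings from three sensor nodes embedded in the pile.
READINGS = {
    "node-1": {"temp_c": 62.0, "moisture_pct": 55.0},
    "node-2": {"temp_c": 71.5, "moisture_pct": 38.0},
    "node-3": {"temp_c": 58.0, "moisture_pct": 52.0},
}

def advise(readings, temp_max=68.0, moist_range=(40.0, 60.0)):
    """Map each node's readings to suggested operator actions
    (illustrative thresholds, not validated composting limits)."""
    actions = {}
    for node, r in readings.items():
        todo = []
        if r["temp_c"] > temp_max:
            todo.append("turn pile")          # release excess heat
        if not moist_range[0] <= r["moisture_pct"] <= moist_range[1]:
            todo.append("adjust moisture")    # water or aerate
        actions[node] = todo or ["ok"]
    return actions

print(advise(READINGS))
```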

  8. WLCG Transfers Dashboard: a Unified Monitoring Tool for Heterogeneous Data Transfers

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Beche, A.; Belov, S.; Kadochnikov, I.; Saiz, P.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid provides resources for the four main virtual organizations. Along with data processing, data distribution is the key computing activity on the WLCG infrastructure. The scale of this activity is very large: the ATLAS virtual organization (VO) alone generates and distributes more than 40 PB of data in 100 million files per year. Another challenge is the heterogeneity of data transfer technologies. Currently there are two main alternatives for data transfers on the WLCG: the File Transfer Service and the XRootD protocol. Each LHC VO has its own monitoring system which is limited to the scope of that particular VO. There is a need for a global system which would provide a complete cross-VO and cross-technology picture of all WLCG data transfers. We present a unified monitoring tool - the WLCG Transfers Dashboard - where all the VOs and technologies coexist and are monitored together. The scale of the activity and the heterogeneity of the system raise a number of technical challenges. Each technology comes with its own monitoring specificities, and some of the VOs use several of these technologies. This paper describes the implementation of the system with particular focus on the design principles applied to ensure the necessary scalability and performance, and to easily integrate any new technology while providing additional functionality which might be specific to that technology.

  9. Time series analysis of tool wear in sheet metal stamping using acoustic emission

    NASA Astrophysics Data System (ADS)

    Vignesh Shanbhag, V.; Pereira, P. Michael; Rolfe, F. Bernard; Arunachalam, N.

    2017-09-01

    Galling is an adhesive wear mode that often limits the lifespan of stamping tools. Since stamping tools represent significant economic cost, even a slight improvement in maintenance cost is of high importance for the stamping industry. In other manufacturing industries, online tool condition monitoring has been used to prevent tool wear-related failure. However, monitoring the acoustic emission signal from a stamping process is a non-trivial task, since the acoustic emission signal is non-stationary and transient. There have been numerous studies examining acoustic emissions in sheet metal stamping, but very few have focused in detail on how the signals change as wear on the tool surface progresses prior to failure. In this study, time domain analysis was applied to the acoustic emission signals to extract features related to tool wear. To understand the wear progression, accelerated stamping tests were performed using a semi-industrial stamping setup which can perform clamping, piercing, and stamping in a single cycle. The time domain features were computed for the acoustic emission signal of each part. The sidewalls of the stamped parts were scanned using an optical profilometer to obtain profiles of the worn part, and these were qualitatively correlated with the acoustic emission signal. Based on the wear behaviour, the wear data can be divided into three stages: in the first stage, no wear is observed; in the second stage, adhesive wear is likely to occur; and in the third stage, severe abrasive plus adhesive wear is likely to occur. Scanning electron microscopy showed the formation of lumps on the stamping tool, which represents galling behavior. The correlation between the time domain features of the acoustic emission signal and the wear progression identified in this study lays the basis for tool diagnostics in the stamping industry.
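    The kind of time-domain features typically extracted from AE signals can be sketched in a few lines of Python; this is a generic feature set (RMS, peak, crest factor, kurtosis), not necessarily the paper's exact one, over a synthetic burst-like signal:

```python
import math

def time_domain_features(signal):
    """Common time-domain features used for AE-based tool wear monitoring."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    kurt = (sum((x - mean) ** 4 for x in signal) / n) / (var ** 2) if var else 0.0
    return {"rms": rms, "peak": peak, "crest": peak / rms, "kurtosis": kurt}

# A burst-like AE event riding on low background noise (synthetic data)
sig = [0.01, -0.02, 0.015, 0.9, -0.7, 0.5, -0.3, 0.1, -0.05, 0.02]
print(time_domain_features(sig))
```

    Tracking such features per stamped part is what allows the three wear stages described above to be separated.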

  10. Scalable Node Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drotar, Alexander P.; Quinn, Erin E.; Sutherland, Landon D.

    2012-07-30

    The project description is: (1) build a high performance computer; and (2) create a tool to monitor node applications in the Component Based Tool Framework (CBTF) using code from the Lightweight Data Metric Service (LDMS). The importance of this project is that: (1) there is a need for a scalable, parallel tool to monitor nodes on clusters; and (2) new LDMS plugins need to be easily added to the tool. CBTF stands for Component Based Tool Framework. It is scalable and adjusts to different topologies automatically. It uses the MRNet (Multicast/Reduction Network) mechanism for information transport. CBTF is flexible and general enough to be used for any tool that needs to do a task on many nodes. Its components are reusable and easily added to a new tool. There are three levels of CBTF: (1) the frontend node, which interacts with users; (2) filter nodes, which filter or concatenate information from backend nodes; and (3) backend nodes, where the actual work of the tool is done. LDMS stands for Lightweight Data Metric Service. It is a tool used for monitoring nodes. Ltool is the name of the tool we derived from LDMS. It is dynamically linked and includes the following components: Vmstat, Meminfo, Procinterrupts and more. It works as follows: the Ltool command is run on the frontend node; Ltool collects information from the backend nodes; the backend nodes send information to the filter nodes; and the filter nodes concatenate the information and send it to a database on the frontend node. Ltool is a useful tool for monitoring nodes on a cluster because the overhead involved in running the tool is not particularly high and it automatically scales to any size cluster.
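    The three-level flow described above (backends collect, filters concatenate, frontend stores) can be mimicked in a few lines of Python; this is a hypothetical simulation of the topology, not CBTF/MRNet code:

```python
def backend_collect(node_id):
    # Stand-in for reading /proc-style metrics (vmstat, meminfo, ...)
    return {"node": node_id, "mem_free_kb": 1024 * (node_id + 1)}

def filter_concatenate(records):
    # Real filter nodes may also reduce/aggregate on the way up
    return list(records)

def frontend(num_backends, fanout=4):
    """Frontend: assign each group of backends to one filter, merge results."""
    merged = []
    for start in range(0, num_backends, fanout):
        group = [backend_collect(i)
                 for i in range(start, min(start + fanout, num_backends))]
        merged.extend(filter_concatenate(group))
    return merged

db = frontend(num_backends=10)
print(len(db), db[0])
```

    The fanout parameter is what gives a tree like this its scalability: the frontend never talks to more than a handful of filters regardless of cluster size.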

  11. Uncertainty quantification in structural health monitoring: Applications on cultural heritage buildings

    NASA Astrophysics Data System (ADS)

    Lorenzoni, Filippo; Casarin, Filippo; Caldon, Mauro; Islami, Kleidi; Modena, Claudio

    2016-01-01

    In the last decades, the need for effective seismic protection and vulnerability reduction of cultural heritage buildings and sites has determined a growing interest in structural health monitoring (SHM) as a knowledge-based assessment tool to quantify and reduce uncertainties regarding their structural performance. Monitoring can be successfully implemented in some cases as an alternative to interventions, or to control the medium- and long-term effectiveness of already applied strengthening solutions. The research group at the University of Padua, in collaboration with public administrations, has recently installed several SHM systems on heritage structures. The paper reports the application of monitoring strategies implemented to avoid (or at least minimize) the execution of strengthening interventions/repairs and to control the response until a clear worsening or damaging process is detected. Two emblematic case studies are presented and discussed: the Roman Amphitheatre (Arena) of Verona and the Conegliano Cathedral. Both are excellent examples of on-going monitoring activities, performed through static and dynamic approaches in combination with automated procedures to extract meaningful structural features from the collected data. In parallel to the application of innovative monitoring techniques, statistical models and data processing algorithms have been developed and applied in order to reduce uncertainties and exploit monitoring results for an effective assessment and protection of historical constructions. Processing software for SHM was implemented to perform the continuous real-time treatment of static data and the identification of modal parameters based on the structural response to ambient vibrations. Statistical models were also developed to filter out the environmental effects and thermal cycles from the extracted features.
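    A minimal sketch of the environmental-filtering step mentioned above: regress an identified natural frequency on temperature and track the residuals. The data and the model (a single linear predictor) are invented for illustration; real SHM pipelines use richer statistical models:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

temps = [5.0, 10.0, 15.0, 20.0, 25.0]        # deg C (synthetic)
freqs = [2.10, 2.08, 2.06, 2.04, 2.02]       # Hz, identified mode (synthetic)
a, b = fit_line(temps, freqs)
# Residuals: what remains after the thermal trend is removed; a drift
# here (rather than in the raw frequency) would suggest structural change.
residuals = [f - (a + b * t) for t, f in zip(temps, freqs)]
print(round(b, 4), [round(r, 6) for r in residuals])
```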

  12. Development of Friction Stir Welding Technologies for In-Space Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Longhurst, William R.; Cox, Chase D.; Gibson, Brian T.

    Friction stir welding (FSW) has emerged as an attractive process for fabricating aerospace vehicles. Current FSW state-of-the-art uses large machines that are not portable. However, there is a growing need for fabrication and repair operations associated with in-space manufacturing. This need stems from a desire for prolonged missions and travel beyond low-earth orbit. To address this need, research and development is presented regarding two enabling technologies. The first is a self-adjusting and aligning (SAA) FSW tool that drastically reduces the axial force that has historically been quite large. The SAA-FSW tool is a bobbin-style tool that floats freely, without any external actuators, along its vertical axis to adjust and align with the workpiece's position and orientation. Successful butt welding of 1/8 in. (3.175 mm) thick aluminum 1100 was achieved in conjunction with a drastic reduction and near elimination of the axial process force. Along with the SAA-FSW, an innovative in-process monitoring technique is presented in which a magnetoelastic force rate-of-change sensor is employed. The sensor consists of a magnetized FSW tool that is used to induce a voltage in a coil surrounding the tool when changes to the process forces occur. The sensor was able to detect 1/16 in. (1.5875 mm) diameter voids. It is concluded that these technologies could be applied toward the development of a portable FSW machine for use in space.

  13. Development of Friction Stir Welding Technologies for In-Space Manufacturing

    DOE PAGES

    Longhurst, William R.; Cox, Chase D.; Gibson, Brian T.; ...

    2016-08-26

    Friction stir welding (FSW) has emerged as an attractive process for fabricating aerospace vehicles. Current FSW state-of-the-art uses large machines that are not portable. However, there is a growing need for fabrication and repair operations associated with in-space manufacturing. This need stems from a desire for prolonged missions and travel beyond low-earth orbit. To address this need, research and development is presented regarding two enabling technologies. The first is a self-adjusting and aligning (SAA) FSW tool that drastically reduces the axial force that has historically been quite large. The SAA-FSW tool is a bobbin-style tool that floats freely, without any external actuators, along its vertical axis to adjust and align with the workpiece's position and orientation. Successful butt welding of 1/8 in. (3.175 mm) thick aluminum 1100 was achieved in conjunction with a drastic reduction and near elimination of the axial process force. Along with the SAA-FSW, an innovative in-process monitoring technique is presented in which a magnetoelastic force rate-of-change sensor is employed. The sensor consists of a magnetized FSW tool that is used to induce a voltage in a coil surrounding the tool when changes to the process forces occur. The sensor was able to detect 1/16 in. (1.5875 mm) diameter voids. It is concluded that these technologies could be applied toward the development of a portable FSW machine for use in space.

  14. Monitoring Autophagy in the Model Green Microalga Chlamydomonas reinhardtii.

    PubMed

    Pérez-Pérez, María Esther; Couso, Inmaculada; Heredia-Martínez, Luis G; Crespo, José L

    2017-10-22

    Autophagy is an intracellular catabolic system that delivers cytoplasmic constituents and organelles to the vacuole. This degradative process is mediated by a group of proteins encoded by autophagy-related (ATG) genes that are widely conserved from yeasts to plants and mammals. Homologs of ATG genes have also been identified in algal genomes, including that of the unicellular model green alga Chlamydomonas reinhardtii. The development of specific tools to monitor autophagy in Chlamydomonas has expanded our current knowledge about the regulation and function of this process in algae. Recent findings indicated that autophagy is regulated by redox signals and the TOR network in Chlamydomonas and revealed that this process may play an important role in the control of lipid metabolism and ribosomal protein turnover in this alga. Here, we describe the different techniques and approaches that have been reported to study autophagy and autophagic flux in Chlamydomonas.

  15. Comprehension Tools for Teachers: Reading for Understanding from Prekindergarten through Fourth Grade

    PubMed Central

    Connor, Carol McDonald; Phillips, Beth M.; Kaschak, Michael; Apel, Kenn; Kim, Young-Suk; Al Otaiba, Stephanie; Crowe, Elizabeth C.; Thomas-Tate, Shurita; Johnson, Lakeisha Cooper; Lonigan, Christopher J.

    2015-01-01

    This paper describes the theoretical framework, as well as the development and testing of the intervention, Comprehension Tools for Teachers (CTT), which is composed of eight component interventions targeting malleable language and reading comprehension skills that emerging research indicates contribute to proficient reading for understanding for prekindergarteners through fourth graders. Component interventions target processes considered largely automatic as well as more reflective processes, with interacting and reciprocal effects. Specifically, we present component interventions targeting cognitive, linguistic, and text-specific processes, including morphological awareness, syntax, mental-state verbs, comprehension monitoring, narrative and expository text structure, enacted comprehension, academic knowledge, and reading to learn from informational text. Our aim was to develop a tool set composed of intensive meaningful individualized small group interventions. We improved feasibility in regular classrooms through the use of design-based iterative research methods including careful lesson planning, targeted scripting, pre- and postintervention proximal assessments, and technology. In addition to the overall framework, we discuss seven of the component interventions and general results of design and efficacy studies. PMID:26500420

  16. Monitoring damage growth in titanium matrix composites using acoustic emission

    NASA Technical Reports Server (NTRS)

    Bakuckas, J. G., Jr.; Prosser, W. H.; Johnson, W. S.

    1993-01-01

    The application of the acoustic emission (AE) technique to locate and monitor damage growth in titanium matrix composites (TMC) was investigated. Damage growth was studied using several optical techniques, including a long-focal-length, high-magnification microscope system with image acquisition capabilities. Fracture surface examinations were conducted using a scanning electron microscope (SEM). The AE technique was used to locate damage based on the arrival times of AE events between two sensors. Using model specimens exhibiting a dominant failure mechanism, correlations were established between the observed damage growth mechanisms and the AE results in terms of event amplitude. These correlations were used to monitor the damage growth process in laminates exhibiting multiple modes of damage. Results revealed that the AE technique is a viable and effective tool to monitor damage growth in TMC.
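    The two-sensor location principle mentioned above reduces, in one dimension, to a small formula: for sensors at positions 0 and gap, arrival-time difference dt = t_B - t_A, and wave speed v, the source sits at x = (gap - v*dt)/2 from sensor A. A sketch with invented numbers (the specimen geometry and wave speed are not taken from the paper):

```python
def locate(delta_t, gap, speed):
    """Linear AE source location between two sensors at 0 and `gap` (m).
    delta_t = t_B - t_A (s), positive when the event is nearer sensor A."""
    return (gap - speed * delta_t) / 2.0

# An event exactly midway arrives at both sensors simultaneously
print(locate(0.0, 0.4, 5000.0))
# An event 0.1 m from sensor A: t_A = 0.1/5000 s, t_B = 0.3/5000 s
dt = 0.3 / 5000 - 0.1 / 5000
print(locate(dt, 0.4, 5000.0))
```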

  17. Applicability of the Design Tool for Inventory and Monitoring (DTIM) and the Explore Sample Data Tool for the Assessment of Caribbean Forest Dynamics

    Treesearch

    Humfredo Marcano-Vega; Andrew Lister; Kevin Megown; Charles Scott

    2016-01-01

    There is a growing need within the insular Caribbean for technical assistance in planning forest-monitoring projects and data analysis. This paper gives an overview of software tools developed by the USDA Forest Service’s National Inventory and Monitoring Applications Center and the Remote Sensing Applications Center. We discuss their applicability in the efficient...

  18. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass-spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC by well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution, so this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. PMID:23702368
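    Scheduling is what lets thousands of transitions fit on one instrument: each transition is monitored only inside a window around its peptide's expected retention time, which caps how many are concurrent at any instant. A hypothetical sketch (peptide names, retention times, and the window width are invented):

```python
# Each transition carries its peptide's expected LC retention time (min)
transitions = [
    {"peptide": "P1", "rt": 12.4}, {"peptide": "P2", "rt": 12.6},
    {"peptide": "P3", "rt": 25.0}, {"peptide": "P4", "rt": 25.1},
]

def concurrent_at(t, trans, half_window=1.0):
    """Transitions whose scheduling window covers chromatographic time t."""
    return [x for x in trans if abs(x["rt"] - t) <= half_window]

# Only the transitions eluting near each time point are active
print([x["peptide"] for x in concurrent_at(12.5, transitions)])
print([x["peptide"] for x in concurrent_at(25.0, transitions)])
```

    Assay-design software essentially solves the inverse problem: choosing windows so the concurrency never exceeds the instrument's dwell-time budget.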

  19. Frequency Based Volcanic Activity Detection through Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Worden, A. K.; Dehn, J.; Webley, P. W.

    2015-12-01

    Satellite remote sensing offers a useful and relatively inexpensive method for monitoring large areas where field work is logistically unrealistic and potentially dangerous. Current sensors are able to detect the majority of explosive volcanic activity: events that tend to affect and represent larger-scale changes in the volcanic systems, eventually relating to ash-producing periods of extended eruptive activity, and effusive activity. As new spaceborne sensors are developed, the ability to detect activity improves, so that a system to gauge the frequency of volcanic activity can be used as a useful monitoring tool. Four volcanoes were chosen for development and testing of a method to monitor explosive activity: Stromboli (Italy); Shishaldin and Cleveland (Alaska, USA); and Karymsky (Kamchatka, Russia). Each volcano studied had similar but unique signatures of precursory and eruptive activity. This study has shown that this monitoring tool could be applied to a wide range of volcanoes and still produce useful and robust data. Our method deals specifically with the detection of small-scale explosive activity, and could be useful in an operational setting, especially at remote volcanoes that have the potential to impact populations, infrastructure, and the aviation community. A number of important factors will affect the validity of this method: (1) the availability of a continuous and continually populated dataset; (2) appropriate and reasonable sensor resolutions; (3) a recorded history of the volcano's previous activity; and, if available, (4) some ground-based monitoring system. We aim to develop the method further to capture and evaluate the frequency of other volcanic processes such as lava flows, phreatomagmatic eruptions, and dome growth and collapse. The work shown here illustrates the capability of this method and monitoring tool for use at remote, un-instrumented volcanoes.
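    One way to read "gauging the frequency of activity" is to bin thermal-anomaly detections in time and compare windows; the Python toy below uses invented detection days and thresholds, not the authors' actual algorithm:

```python
# Days-of-year on which a satellite pass detected a thermal anomaly
detections = [12, 15, 40, 41, 41, 42, 42, 43]

def rate_per_window(days, start, length=10):
    """Number of anomaly detections falling in [start, start+length)."""
    return sum(1 for d in days if start <= d < start + length)

background = rate_per_window(detections, 10)   # days 10-19
unrest = rate_per_window(detections, 40)       # days 40-49
# A simple escalation flag: detection rate well above background
print(background, unrest, unrest > 2 * background)
```

    An operational version would need the continuous dataset and the volcano's activity history listed above to set a meaningful background rate.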

  20. Helmet-Cam: tool for assessing miners’ respirable dust exposure

    PubMed Central

    Cecala, A.B.; Reed, W.R.; Joy, G.J.; Westmoreland, S.C.; O’Brien, A.D.

    2015-01-01

    Video technology coupled with datalogging exposure monitors has been used to evaluate worker exposure to different types of contaminants. However, previous applications of this technology used a stationary video camera to record the worker’s activity while the worker wore some type of contaminant monitor. These techniques are not applicable to mobile workers in the mining industry because of their need to move around the operation while performing their duties. The Helmet-Cam is a recently developed exposure assessment tool that integrates a person-wearable video recorder with a datalogging dust monitor. These are worn by the miner in a backpack, safety belt or safety vest to identify areas or job tasks of elevated exposure. After a miner performs his or her job while wearing the unit, the video and dust exposure data files are downloaded to a computer and merged through a NIOSH-developed computer software program called Enhanced Video Analysis of Dust Exposure (EVADE). By providing synchronized playback of the merged video footage and dust exposure data, the EVADE software allows for the assessment and identification of key work areas and processes, as well as work tasks, that significantly impact a worker’s personal respirable dust exposure. The Helmet-Cam technology has been tested at a number of metal/nonmetal mining operations and has proven to be a valuable assessment tool. Mining companies wishing to use this technique can purchase a commercially available video camera and an instantaneous dust monitor to obtain the necessary data; the NIOSH-developed EVADE software will be available for download at no cost on the NIOSH website. PMID:26380529
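    The core of the synchronized-playback idea is mapping a video timestamp to the nearest logged dust reading. A sketch follows; the sampling times, values, and nearest-sample policy are assumptions for illustration, not EVADE's actual implementation:

```python
import bisect

dust_t = [0.0, 2.0, 4.0, 6.0, 8.0]            # log times, s since start
dust_v = [0.12, 0.15, 0.90, 0.85, 0.20]       # respirable dust, mg/m^3

def reading_at(t):
    """Dust reading nearest in time to video timestamp t (seconds)."""
    i = bisect.bisect_left(dust_t, t)
    if i == 0:
        return dust_v[0]
    if i == len(dust_t):
        return dust_v[-1]
    # choose the closer of the two neighbouring samples
    return dust_v[i] if dust_t[i] - t < t - dust_t[i - 1] else dust_v[i - 1]

# Scrubbing the video to t = 3.9 s shows the exposure spike at t = 4 s
print(reading_at(3.9))
```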

  1. Rock Slide Monitoring by Using TDR Inclinometers

    NASA Astrophysics Data System (ADS)

    Drusa, Marián; Bulko, Roman

    2016-12-01

    Geotechnical monitoring of slope deformations is widespread at present. In many geological localities and civil engineering construction areas, monitoring is a unique tool for controlling negative factors and processes; it also informs us about the actual state of the rock environment or interacting structures, which is necessary for risk assessment. In our case, geotechnical monitoring is controlling rockslide activity around a future part of a motorway. The construction of the new highway route D1 from Bratislava to Košice crosses territory affected by a massive rockslide close to the Kraľovany village. There was a need to monitor the activity of a large unstable rockslide with deep shear planes. To do so, the Department of Geotechnics of the University of Žilina installed inclinometers at the unstable area which work on the Time Domain Reflectometry (TDR) principle. Based on the measurements provided, the effectiveness and suitability of TDR inclinometers for monitoring deep underground movement activity are demonstrated.
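    The TDR principle reduces to locating an anomaly along a cable: shear deformation crimps the grouted cable, and the resulting localized spike in the reflection trace marks the depth of the shear plane. A sketch with synthetic values (the threshold and trace are invented):

```python
depths = [i * 0.5 for i in range(20)]   # metres along the grouted cable
trace = [0.01] * 20                     # baseline reflection coefficient
trace[13] = 0.22                        # synthetic shear-deformation signature

def shear_depth(depths, trace, threshold=0.1):
    """Depth of the first reflection spike exceeding the threshold,
    or None if the cable shows no deformation signature."""
    peaks = [d for d, r in zip(depths, trace) if abs(r) > threshold]
    return peaks[0] if peaks else None

print(shear_depth(depths, trace))
```

    Repeating such a measurement over time turns the spike amplitude and position into a record of shear-plane activity.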

  2. A Job Monitoring and Accounting Tool for the LSF Batch System

    NASA Astrophysics Data System (ADS)

    Sarkar, Subir; Taneja, Sonia

    2011-12-01

    This paper presents a web-based job monitoring and group-and-user accounting tool for the LSF Batch System. The user-oriented job monitoring displays a simple and compact quasi-real-time overview of the batch farm for both local and Grid jobs. For Grid jobs, the Distinguished Name (DN) of the Grid user is shown. The overview monitor provides the most up-to-date status of a batch farm at any time. The accounting tool works with the LSF accounting log files. The accounting information is shown for a few pre-defined time periods by default; however, one can also compute the same information for any arbitrary time window. The tool has already proved to be an extremely useful means of validating more extensive accounting tools available in the Grid world. Several sites are already using the present tool and more sites running the LSF batch system have shown interest. We discuss the various aspects that make the tool essential for site administrators and end-users alike and outline the current status of development as well as future plans.
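    The arbitrary-time-window accounting described above boils down to filtering parsed records by finish time and summing per user. A simplified sketch (the record layout here is invented, not the real lsb.acct format):

```python
# Hypothetical records parsed from an LSF accounting log
records = [
    {"user": "alice", "finish_t": 100, "cpu_s": 50},
    {"user": "bob",   "finish_t": 150, "cpu_s": 30},
    {"user": "alice", "finish_t": 900, "cpu_s": 70},
]

def usage(records, t0, t1):
    """Total CPU seconds per user for jobs finishing in [t0, t1)."""
    totals = {}
    for r in records:
        if t0 <= r["finish_t"] < t1:
            totals[r["user"]] = totals.get(r["user"], 0) + r["cpu_s"]
    return totals

print(usage(records, 0, 200))    # any window the operator asks for
```

    Group accounting follows the same pattern with a user-to-group mapping applied before the summation.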

  3. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives.

    PubMed

    Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini

    2018-08-01

    Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery and to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation, to aid and motivate development.

  4. [The training of medical and scientific manpower in the system of postgraduate medical education].

    PubMed

    Kabanova, S A; Lozhkevich, I Iu

    2010-01-01

    The research was conducted within the Petrovsky National Surgery Center and revealed certain regularities and trends attesting to the necessity of further strategic and tactical development of the training of graduate specialists through innovative optimization of the effectiveness of postgraduate training of medical personnel. The inclusion of social-psychological monitoring of the educational process is obligatory. The implementation of sociological monitoring in any institution providing postgraduate training should become a powerful tool for enhancing the quality and efficiency of the training of medical professionals. This approach presupposes the modernization of training programs to account for innovations and research data.

  5. Is it worth changing pattern recognition methods for structural health monitoring?

    NASA Astrophysics Data System (ADS)

    Bull, L. A.; Worden, K.; Cross, E. J.; Dervilis, N.

    2017-05-01

    The key element of this work is to demonstrate alternative strategies for using pattern recognition algorithms in structural health monitoring. This paper looks to determine whether it makes any difference to choose among a range of established classification techniques: from decision trees and support vector machines to Gaussian processes. Classification algorithms are tested on adjustable synthetic data to establish performance metrics, then all techniques are applied to real SHM data. To aid the selection of training data, an informative chain of artificial intelligence tools is used to explore an active learning interaction between meaningful clusters of data.

  6. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include glycosylation profile of the final product along with process attributes, such as viable cell density and level of antibody expression. These were related to process variables, raw materials components of the chemically defined hybridoma media, concentration of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
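    At its simplest, relating a quality attribute to a process variable starts with a correlation. The sketch below uses invented numbers (a glycoform fraction vs. a glucose supplement) and plain Pearson correlation rather than the authors' full multivariate models:

```python
import math

glucose   = [2.0, 3.0, 4.0, 5.0, 6.0]       # g/L supplemented (invented)
glycoform = [0.30, 0.34, 0.40, 0.43, 0.49]  # fraction of target glycan (invented)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson(glucose, glycoform), 3))
```

    Multivariate methods such as PLS extend this idea to many correlated process variables and attributes at once, which is what makes them suited to the bioreactor data described above.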

  7. Modeling of afforestation possibilities on one part of Hungary

    NASA Astrophysics Data System (ADS)

    Bozsik, Éva; Riczu, Péter; Tamás, János; Burriel, Charles; Helilmeier, Hermann

    2015-04-01

    Agroforestry systems are part of the history of European Union rural landscapes, but the regional increase in the size of agricultural parcels had a significant effect on European land use in the 20th century, radically reducing the coverage of natural forest. This causes conflicts between the interests of the agricultural and forestry sectors, and agroforestry land uses could be a solution to this conflict. One real, ecological, problem with the remnant forests and new forest plantations is the partial absence of a network function due to missing connecting ecological green corridors; the other problem is verifiability for the agroforestry payment system, i.e. monitoring the arable lands and plantations. Remote sensing methods are currently used to supervise European Union payments. Nowadays, alongside satellite imagery, airborne hyperspectral and LiDAR (Light Detection And Ranging) remote sensing technologies are coming into more widespread use for nature, environmental, forest and agriculture protection, conservation and monitoring, and they are an effective tool for monitoring biomass production. In this Hungarian case study we built a Spatial Decision Support System (SDSS) to create an agroforestry site selection model. The aim of the model building was to ensure the continuity of ecological green corridors and maintain land use appropriate to regional endowments. The investigation tools were the increasingly widely used hyperspectral and airborne LiDAR remote sensing technologies, which can provide appropriate data acquisition and data processing tools to build a decision support system.
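    The core of an SDSS site-selection model is usually a weighted overlay of normalized criterion layers. The layers, weights, and grid below are invented for illustration; a real model would derive them from the hyperspectral/LiDAR products mentioned above:

```python
# Two hypothetical criterion layers, normalized to 0..1 per grid cell
slope    = [[0.2, 0.8], [0.4, 0.9]]   # terrain suitability
corridor = [[1.0, 0.0], [0.5, 1.0]]   # proximity to green corridors
weights  = {"slope": 0.4, "corridor": 0.6}

def overlay(layers, weights):
    """Weighted sum of criterion layers -> per-cell suitability score."""
    first = next(iter(layers.values()))
    rows, cols = len(first), len(first[0])
    return [[sum(weights[k] * layers[k][i][j] for k in layers)
             for j in range(cols)] for i in range(rows)]

score = overlay({"slope": slope, "corridor": corridor}, weights)
print([[round(v, 2) for v in row] for row in score])
```

    Site selection then reduces to thresholding or ranking the score grid, optionally with hard constraints (e.g. excluding existing plantations) applied first.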

  8. Introduction of blended learning in a master program: Developing an integrative mixed method evaluation framework.

    PubMed

    Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K

    2017-01-01

    The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually focuses on students, faculty, or technological or institutional aspects. Currently, no validated comprehensive monitoring tool exists that can support the introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, the concepts to be evaluated were first identified and grouped. In a second step, related BL evaluation tools referring to the student, faculty and institutional levels were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two successive runs of the program. The results of the evaluation allowed documenting strengths and weaknesses of the BL format in a comprehensive way, involving all actors. It has led to improvements at program, faculty and course level. The evaluation process and the reporting of the results proved to be demanding in time and personnel resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Monitoring and evaluation of strategic change programme implementation-Lessons from a case analysis.

    PubMed

    Neumann, Jan; Robson, Andrew; Sloan, Diane

    2018-02-01

    This study considered the monitoring and evaluation of the implementation of a large-scale strategic change programme spanning domestic and global operations. It considers the prerequisites necessary to overcome challenges and barriers that prevent systematic and effective monitoring and evaluation from taking place alongside its operationalisation. The work involves a case study based on a major industrial company from the energy sector. The change programme makes particular reference to changes in business models, business processes and organisation structures, as well as Enterprise Resource Planning infrastructure. The case study focussed on the summative evaluation of the programme post-implementation. This assessment involved 25 semi-structured interviews with employees across a range of managerial strata, capturing more than 65 roles within the change programme at both local and global levels. Data relating to their perceptions of evaluation effectiveness and shortcomings were analysed by means of template analysis. The study identifies responsibilities for executing an evaluation alongside various methods and tools that are appropriate, thereby focussing on the "Who" (roles, responsibility for particular activities) and "How" (methods and tools) rather than "What" to monitor and evaluate. The findings are presented generically so that they offer new insights and transferability for practitioners involved in managing strategic change and its associated evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Microbial trophic interactions and mcrA gene expression in monitoring of anaerobic digesters

    PubMed Central

    Alvarado, Alejandra; Montañez-Hernández, Lilia E.; Palacio-Molina, Sandra L.; Oropeza-Navarro, Ricardo; Luévanos-Escareño, Miriam P.; Balagurusamy, Nagamani

    2014-01-01

    Anaerobic digestion (AD) is a biological process in which different trophic groups of microorganisms break down biodegradable organic materials in the absence of oxygen. A wide range of AD technologies is being used to convert livestock manure, municipal and industrial wastewaters, and solid organic wastes into biogas. AD gains importance not only because of its relevance in waste treatment but also because of the recovery of carbon in the form of methane, a renewable energy source used to generate electricity and heat. Despite advances in the engineering and design of new bioreactors for AD, the microbiology component always poses challenges. The microbiology of AD processes is complicated, as the efficiency of the process depends on the interactions of the various trophic groups involved. Due to the complex interdependence of microbial activities for the functionality of anaerobic bioreactors, the genetic expression of mcrA, which encodes a key enzyme in methane formation, is proposed as a parameter to monitor process performance in real time. This review evaluates the current knowledge on microbial groups, their interactions, and their relationship to the performance of anaerobic biodigesters, with a focus on using mcrA gene expression as a tool to monitor the process. PMID:25429286

  11. Microbial safety in space

    NASA Astrophysics Data System (ADS)

    Krooneman, Janneke; Harmsen, Hermie; Landini, Paolo; Zinn, Manfred; Munaut, Françoise; van der Meer, Walter; Beimfohr, Claudia; Reichert, Bas; Preuß, Andrea

    2005-10-01

    Microbial hygiene is important in our daily lives; preventing and combating microbial infections is increasingly important in society. In hospitals, strict monitoring and control are exercised for people and infrastructure alike. In modern buildings, air-conditioning systems are screened for harmful bacteria such as Legionella. More recently, concerns about SARS (a virus) and anthrax (a bacterium) have added pressure on the scientific community to come up with adequate monitoring and control techniques to assure microbial hygiene. Additionally, the use of biotechnological recycling and cleaning processes for sustainability brings the need for reliable monitoring tools and preventive or risk-reducing strategies. In the manned space environment, similar problems need to be solved, and efforts have already been made to study the behaviour of micro-organisms and microbial hygiene onboard space stations.

  12. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. The knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use are discussed. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  14. Environmental DNA for wildlife biology and biodiversity monitoring.

    PubMed

    Bohmann, Kristine; Evans, Alice; Gilbert, M Thomas P; Carvalho, Gary R; Creer, Simon; Knapp, Michael; Yu, Douglas W; de Bruyn, Mark

    2014-06-01

    Extraction and identification of DNA from an environmental sample has proven noteworthy recently in detecting and monitoring not only common species, but also those that are endangered, invasive, or elusive. Particular attributes of so-called environmental DNA (eDNA) analysis render it a potent tool for elucidating mechanistic insights into ecological and evolutionary processes. Foremost among these are an improved ability to explore ecosystem-level processes, the generation of quantitative indices for analyses of species, community diversity, and dynamics, and novel opportunities through the use of time-serial samples and unprecedented sensitivity for detecting rare or difficult-to-sample taxa. Although technical challenges remain, here we examine the current frontiers of eDNA, outline key aspects requiring improvement, and suggest future developments and innovations for research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Effective Documentation Tools

    NASA Technical Reports Server (NTRS)

    Sleboda, Claire

    1997-01-01

    Quality assurance programs provide a very effective means to monitor and evaluate medical care. Quality assurance involves: (1) identifying a problem; (2) determining the source and nature of the problem; (3) developing policies and methods to effect improvement; (4) implementing those policies; (5) monitoring the methods applied; and (6) evaluating their effectiveness. Because this definition of quality assurance so closely resembles the Nursing Process, the health unit staff was able to use their knowledge of the nursing process to develop many forms which improve the quality of patient care. These forms include the NASA DFRC Service Report, the occupational injury form (Incident Report), the patient survey (Pre-hospital Evaluation/Care Report), the Laboratory Log Sheet, the 911 Run Sheet, and the Patient Assessment Stamp. Examples and the steps followed to generate these reports are described.

  16. Demonstrating the Value of Near Real-time Satellite-based Earth Observations in a Research and Education Framework

    NASA Astrophysics Data System (ADS)

    Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.

    2017-12-01

    The launch of the GOES-16 series provides an opportunity to advance near real-time applications in natural hazard detection, monitoring and warning. This study demonstrates the capability and value of receiving real-time satellite-based Earth observations over fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes four components: 1) near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real-time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricanes, fires, flooding, fog, and lightning. A web-based application system is developed to disseminate near real-time satellite images and data products. The images are generated in a GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances capacities for undergraduate and graduate education in Earth system and climate sciences, and related applications, to understand the basic principles and technology of real-time applications with remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which is helpful for various user communities.

  17. A Feasibility Study on Monitoring Residual Sugar and Alcohol Strength in Kiwi Wine Fermentation Using a Fiber-Optic FT-NIR Spectrometry and PLS Regression.

    PubMed

    Wang, Bingqian; Peng, Bangzhu

    2017-02-01

    This work aims to investigate the potential of fiber-optic Fourier transform near-infrared (FT-NIR) spectrometry associated with chemometric analysis, applied to monitor time-related changes in residual sugar and alcohol strength during kiwi wine fermentation. NIR calibration models for residual sugar and alcohol strength during kiwi wine fermentation were established from the FT-NIR spectra of 98 samples scanned with a fiber-optic FT-NIR spectrometer, using the partial least squares regression method. The results showed that R² and the root mean square error of cross-validation reached 0.982 and 3.81 g/L for residual sugar, and 0.984 and 0.34% for alcohol strength, respectively. Furthermore, crucial process information on kiwi must and wine fermentations provided by fiber-optic FT-NIR spectrometry was found to agree with that obtained from traditional chemical methods, and therefore fiber-optic FT-NIR spectrometry can be applied as an effective and suitable alternative for the analysis and monitoring of those processes. The overall results suggested that fiber-optic FT-NIR spectrometry is a promising tool for monitoring and controlling the kiwi wine fermentation process. © 2017 Institute of Food Technologists®.
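    The calibration step described above can be sketched with a generic PLS1 (NIPALS) implementation; the synthetic "spectra" below stand in for the study's 98 FT-NIR samples, and the dimensions, noise level and variable names are invented for illustration.

```python
# Minimal numpy sketch of PLS1 regression for calibrating a scalar quantity
# (e.g. residual sugar) against spectra. The data are synthetic; the study's
# actual FT-NIR wavelength grid and samples are not reproduced here.
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit PLS1 (NIPALS) on centered data; return regression coefficients."""
    Xr, yr = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)       # weight vector (direction of max covariance)
        t = Xr @ w                   # scores
        tt = t @ t
        p = Xr.T @ t / tt            # loadings
        qi = (yr @ t) / tt
        Xr -= np.outer(t, p)         # deflate X and y for the next component
        yr -= qi * t
        W.append(w); P.append(p); q.append(qi)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 20))                 # 60 "spectra", 20 "wavelengths"
b_true = np.zeros(20); b_true[[3, 7]] = [1.0, -0.5]
y = X @ b_true + 0.01 * rng.normal(size=60)   # "residual sugar" values

xm, ym = X.mean(axis=0), y.mean()
B = pls1_fit(X - xm, y - ym, n_components=3)
y_hat = (X - xm) @ B + ym
rmse = float(np.sqrt(np.mean((y - y_hat) ** 2)))
print(f"training RMSE: {rmse:.4f}")
```

    A real calibration would report the cross-validated error (RMSECV, as in the abstract) rather than the training RMSE shown here.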

  18. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-30

    information that is returned from the tools to the human user, and the forms in which these outputs are presented. ... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM ... This document and the Automated Software Tool Monitoring Program (Appendix 1) are... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types

  19. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding endpoint management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning.
    The filtering mechanism is an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application for obtaining debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work makes a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture improves key aspects of event filtering.
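    The event filtering idea described above can be sketched as a subscription table of predicates: only events matching some consumer's predicate are forwarded, and the rest never leave the monitor, reducing traffic to the management tools. The event fields, threshold, and component name below are hypothetical, not taken from the IRI system.

```python
# Hedged sketch of subscription-based event filtering for a distributed
# monitor. Consumers register predicates; only matching events are delivered.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    source: str      # component that emitted the event (illustrative)
    kind: str        # e.g. "cpu", "error", "latency"
    value: float

@dataclass
class FilteringMonitor:
    subscriptions: list = field(default_factory=list)
    forwarded: int = 0
    dropped: int = 0

    def subscribe(self, predicate: Callable[[Event], bool],
                  sink: Callable[[Event], None]) -> None:
        self.subscriptions.append((predicate, sink))

    def publish(self, event: Event) -> None:
        matched = False
        for predicate, sink in self.subscriptions:
            if predicate(event):
                sink(event)          # deliver only to interested consumers
                matched = True
        if matched:
            self.forwarded += 1
        else:
            self.dropped += 1        # filtered out: never leaves the monitor

monitor = FilteringMonitor()
alerts = []
# A debugging tool only wants high-latency events from one component.
monitor.subscribe(lambda e: e.kind == "latency" and e.value > 100.0,
                  alerts.append)

for v in [5.0, 250.0, 90.0, 400.0]:
    monitor.publish(Event("IRI-server", "latency", v))
monitor.publish(Event("IRI-server", "cpu", 0.5))

print(len(alerts), monitor.forwarded, monitor.dropped)  # prints "2 2 3"
```

    The forwarded/dropped counters make the traffic-reduction effect of filtering directly measurable, which is the intrusiveness argument made in the abstract.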

  20. Electronic self-monitoring of mood using IT platforms in adult patients with bipolar disorder: A systematic review of the validity and evidence.

    PubMed

    Faurholt-Jepsen, Maria; Munkholm, Klaus; Frost, Mads; Bardram, Jakob E; Kessing, Lars Vedel

    2016-01-15

    Various paper-based mood charting instruments are used in the monitoring of symptoms in bipolar disorder. During recent years an increasing number of electronic self-monitoring tools have been developed. The objectives of this systematic review were 1) to evaluate the validity of electronic self-monitoring tools as a method of evaluating mood compared to clinical rating scales for depression and mania and 2) to investigate the effect of electronic self-monitoring tools on clinically relevant outcomes in bipolar disorder. A systematic review of the scientific literature, reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, was conducted. MEDLINE, Embase, PsycINFO and The Cochrane Library were searched and supplemented by hand search of reference lists. Databases were searched for 1) studies on electronic self-monitoring tools in patients with bipolar disorder reporting on the validity of electronically self-reported mood ratings compared to clinical rating scales for depression and mania and 2) randomized controlled trials (RCTs) evaluating electronic mood self-monitoring tools in patients with bipolar disorder. A total of 13 published articles were included. Seven articles were RCTs and six were longitudinal studies. Electronic self-monitoring of mood was considered valid compared to clinical rating scales for depression in six out of six studies, and in two out of seven studies compared to clinical rating scales for mania. The included RCTs primarily investigated the effect of heterogeneous electronically delivered interventions; none of the RCTs investigated the sole effect of electronic mood self-monitoring tools. Methodological issues with risk of bias at different levels limited the evidence in the majority of studies. Electronic self-monitoring of mood in depression appears to be a valid measure of mood, in contrast to self-monitoring of mood in mania.
    There are as yet few studies on the effect of electronic self-monitoring of mood in bipolar disorder. The evidence for electronic self-monitoring is limited by methodological issues and by a lack of RCTs. Although the idea of electronic self-monitoring of mood seems appealing, studies using rigorous methodology to investigate the beneficial as well as possible harmful effects of electronic self-monitoring are needed.

  1. Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine: Part 4: Tissue Tools for Quality Assurance in Immunohistochemistry.

    PubMed

    Cheung, Carol C; D'Arrigo, Corrado; Dietel, Manfred; Francis, Glenn D; Fulton, Regan; Gilks, C Blake; Hall, Jacqueline A; Hornick, Jason L; Ibrahim, Merdol; Marchetti, Antonio; Miller, Keith; van Krieken, J Han; Nielsen, Soren; Swanson, Paul E; Taylor, Clive R; Vyberg, Mogens; Zhou, Xiaoge; Torlakovic, Emina E

    2017-04-01

    The numbers of diagnostic, prognostic, and predictive immunohistochemistry (IHC) tests are increasing; the implementation and validation of new IHC tests, revalidation of existing tests, as well as the on-going need for daily quality assurance monitoring present significant challenges to clinical laboratories. There is a need for proper quality tools, specifically tissue tools that will enable laboratories to successfully carry out these processes. This paper clarifies, through the lens of laboratory tissue tools, how validation, verification, and revalidation of IHC tests can be performed in order to develop and maintain high quality "fit-for-purpose" IHC testing in the era of precision medicine. This is the final part of the 4-part series "Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine."

  2. An IT Manager's View on E-Mail and Internet Policies and Procedures

    ERIC Educational Resources Information Center

    Desai, Mayur S.; Hart, Jeff; Richards, Thomas C.

    2009-01-01

    E-mail is a mandatory communication tool for any business to survive in the 21st century. It is imperative that information technology (IT) managers monitor e-mail systems and make sure that they are used properly. In order to organize a systematic process for the proper use of e-mail, an administrator must have an input into the development of appropriate…

  3. Natural Attenuation of Perchlorate in Groundwater: Processes, Tools and Monitoring Techniques

    DTIC Science & Technology

    2008-04-01

    attenuation of perchlorate. Tier 3: Microbiological Indicators. For situations where additional lines of evidence are required, Tier 3 offers... USEPA, 1997). Like enhanced bioremediation, MNA requires an in-depth understanding of the microbiology, chemistry, and hydrogeology of the... nitrate, perchlorate (if present), and iron have been depleted in the microbiological treatment zone. Whereas sulfate concentration greater than 20

  4. Inter-Enterprise Integration - Moving Beyond Data Level Integration

    DTIC Science & Technology

    2010-06-01

    Center, Mississippi. Abstract: Navy METOC is fundamentally a knowledge-based enterprise. The products are themselves knowledge products and the... Effective transformation to a NCOW-aligned enterprise requires a clear way to express, understand, implement, monitor, manage, and assess the value of net... information that is available and the processes, tools, and agents that turn this collection of information into battlespace knowledge. Individuals will

  5. Influence of the baking process for chemically amplified resist on CD performance

    NASA Astrophysics Data System (ADS)

    Sasaki, Shiho; Ohfuji, Takeshi; Kurihara, Masa-aki; Inomata, Hiroyuki; Jackson, Curt A.; Murata, Yoshio; Totsukawa, Daisuke; Tsugama, Naoko; Kitano, Naoki; Hayashi, Naoya; Hwang, David H.

    2002-12-01

    CD uniformity and MTT (Mean to Target) control are very important in mask production for the 90 nm node and beyond. Although it is well known that baking temperatures influence CD control in the CAR (chemically amplified resist) process for mask patterning, we found that two other process factors, related to acid diffusion and the CA reaction, greatly affect CD performance. We used a commercially available negative CAR material and a 50 kV exposure tool. We focused on the baking process for both PB (Pre Bake) and PEB (Post Exposure Bake). Film densification strength was evaluated from film thickness loss during PB. The plate temperature distribution was monitored with a thermocouple plate and an IR camera. CA reactions were also monitored with in-situ FTIR during PEB. CD uniformity was used to define the process influence. In conclusion, we found that airflow control and ramping temperature control in the baking process are very important factors for CD control, in addition to conventional temperature control. These improvements contributed to a 30% reduction in CD variation.

  6. Tool vibration detection with eddy current sensors in machining process and computation of stability lobes using fuzzy classifiers

    NASA Astrophysics Data System (ADS)

    Devillez, Arnaud; Dudzinski, Daniel

    2007-01-01

    Today, knowledge of a process is very important for engineers seeking the optimal combination of control parameters that warrants productivity, quality and functioning without defects and failures. In our laboratory, we carry out research in the field of high speed machining with modelling, simulation and experimental approaches. The aim of our investigation is to develop software allowing optimisation of the cutting conditions, to limit the number of predictive tests, and process monitoring, to prevent any trouble during machining operations. This software is based on models and experimental data sets which constitute the knowledge of the process. In this paper, we deal with the problem of vibrations occurring during a machining operation. These vibrations may cause failures and defects in the process, such as workpiece surface alteration and rapid tool wear. To measure the tool micro-movements on line, we equipped a lathe with specific instrumentation using eddy current sensors. The signals obtained were correlated with surface finish, and a signal processing algorithm was used to determine whether a test was stable or unstable. A fuzzy classification method was then proposed to classify the tests in a space defined by the width of cut and the cutting speed. Finally, it was shown that the fuzzy classification takes the measurement uncertainty into account to compute the stability limit, or stability lobes, of the process.
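    As an illustrative sketch of fuzzy classification (not the authors' actual classifier), trapezoidal membership functions over a measured vibration amplitude can assign each test a degree of membership in "stable" and "unstable", so that tests near the stability limit carry partial membership in both classes. The amplitude feature and thresholds below are invented.

```python
# Toy fuzzy stable/unstable classification of a machining test based on a
# vibration-amplitude feature. Thresholds (in micrometers) are hypothetical.
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 outside [a, d], 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def classify(amplitude_um):
    """Return fuzzy memberships for a measured tool-vibration amplitude."""
    stable = trapezoid(amplitude_um, -1.0, 0.0, 5.0, 15.0)
    unstable = trapezoid(amplitude_um, 5.0, 15.0, 1e9, 2e9)
    return {"stable": stable, "unstable": unstable}

print(classify(2.0))    # clearly stable: full membership in "stable"
print(classify(10.0))   # near the limit: partial membership in both classes
```

    Repeating such a classification over a grid of (width of cut, cutting speed) points is one simple way to trace out stability lobes from experimental tests while preserving the uncertainty of borderline measurements.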

  7. [Outsourcing in long-term care: a risk management approach].

    PubMed

    Guimarães, Cristina Machado; Carvalho, José Crespo de

    2012-05-01

    This article seeks to investigate outsourcing decisions in supply chain management of healthcare organizations, namely the motives and constraints behind the decision, the selection criteria for activities to be outsourced to third parties, the type of possible agreements, and the impact of this decision on the organization per se. A case study of the start-up phase of a Long-term Care unit with an innovative approach and high levels of customization was conducted to understand the outsourcing process in a start-up context (not in the standard context of organizational change) and a risk evaluation matrix was created for outsourcing activities in order to define and implement a performance monitoring process. This study seeks to understand how to evaluate and assess the risks of an outsourcing strategy and proposes a monitoring model using risk management tools. It was shown that the risk management approach can be a solution for monitoring outsourcing in the organizational start-up phase. Conclusions concerning dissatisfaction with the results of outsourcing strategies adopted are also presented.

  8. Monitoring of Batch Industrial Crystallization with Growth, Nucleation, and Agglomeration. Part 1: Modeling with Method of Characteristics.

    PubMed

    Porru, Marcella; Özkan, Leyla

    2017-05-24

    This paper develops a new simulation model for crystal size distribution dynamics in industrial batch crystallization. The work is motivated by the necessity of accurate prediction models for online monitoring purposes. The proposed numerical scheme is able to handle growth, nucleation, and agglomeration kinetics by means of the population balance equation and the method of characteristics. The former offers a detailed description of the solid phase evolution, while the latter provides an accurate and efficient numerical solution. In particular, the accuracy of the prediction of the agglomeration kinetics, which cannot be ignored in industrial crystallization, has been assessed by comparing it with solutions in the literature. The efficiency of the solution has been tested on a simulation of a seeded flash cooling batch process. Since the proposed numerical scheme can accurately simulate the system behavior more than a hundred times faster than the batch duration, it is suitable for online applications such as process monitoring tools based on state estimators.
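    Under the simplifying assumption of growth only (no nucleation or agglomeration), the method of characteristics reduces the population balance dn/dt + G(t) dn/dL = 0 to ordinary differential equations dL/dt = G(t), along which the number density n is constant. The sketch below uses an invented growth law and seed distribution, not the paper's model.

```python
# Minimal sketch of the method of characteristics for the growth-only
# population balance dn/dt + G(t)*dn/dL = 0 (nucleation and agglomeration
# omitted). Along each characteristic dL/dt = G(t), the density n is
# constant, so a size-independent growth rate simply shifts the seed CSD.
import numpy as np

def solve_characteristics(L0, n0, growth_rate, t_end, n_steps=1000):
    """Advance crystal sizes along characteristics with explicit Euler."""
    L = np.array(L0, dtype=float)
    dt = t_end / n_steps
    t = 0.0
    for _ in range(n_steps):
        L += growth_rate(t) * dt    # dL/dt = G(t) for every characteristic
        t += dt
    return L, np.array(n0)          # n is unchanged along characteristics

# Seed distribution: Gaussian in size (micrometers), hypothetical numbers.
L0 = np.linspace(10.0, 100.0, 50)
n0 = np.exp(-((L0 - 50.0) ** 2) / (2 * 8.0 ** 2))

# Growth rate decaying as supersaturation is consumed (invented, um/s).
G = lambda t: 2.0 * np.exp(-t / 600.0)

L_end, n_end = solve_characteristics(L0, n0, G, t_end=600.0)
shift = L_end[0] - L0[0]
print(f"every crystal grew by {shift:.1f} um")  # uniform shift of the CSD
```

    Nucleation and agglomeration, which the paper's scheme does handle, add source and sink terms that change n along the characteristics; the point of the method is that the transport part is solved exactly by following the characteristics.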

  9. Monitoring of Batch Industrial Crystallization with Growth, Nucleation, and Agglomeration. Part 1: Modeling with Method of Characteristics

    PubMed Central

    2017-01-01

    This paper develops a new simulation model for crystal size distribution dynamics in industrial batch crystallization. The work is motivated by the necessity of accurate prediction models for online monitoring purposes. The proposed numerical scheme is able to handle growth, nucleation, and agglomeration kinetics by means of the population balance equation and the method of characteristics. The former offers a detailed description of the solid phase evolution, while the latter provides an accurate and efficient numerical solution. In particular, the accuracy of the prediction of the agglomeration kinetics, which cannot be ignored in industrial crystallization, has been assessed by comparing it with solutions in the literature. The efficiency of the solution has been tested on a simulation of a seeded flash cooling batch process. Since the proposed numerical scheme can accurately simulate the system behavior more than a hundred times faster than the batch duration, it is suitable for online applications such as process monitoring tools based on state estimators. PMID:28603342

  10. Symbolic Constraint Maintenance Grid

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    Version 3.1 of Symbolic Constraint Maintenance Grid (SCMG) is a software system that provides a general conceptual framework for utilizing pre-existing programming techniques to perform symbolic transformations of data. SCMG also provides a language (and an associated communication method and protocol) for representing constraints on the original non-symbolic data. SCMG provides a facility for exchanging information between numeric and symbolic components without knowing the details of the components themselves. In essence, it integrates symbolic software tools (for diagnosis, prognosis, and planning) with non-artificial-intelligence software. SCMG executes a process of symbolic summarization and monitoring of continuous time series data that are being abstractly represented as symbolic templates of information exchange. This summarization process enables such symbolic-reasoning computing systems as artificial-intelligence planning systems to evaluate the significance and effects of channels of data more efficiently than would otherwise be possible. As a result of the increased efficiency in representation, reasoning software can monitor more channels and is thus able to perform monitoring and control functions more effectively.

  11. Regulated bioluminescence as a tool for bioremediation process monitoring and control of bacterial cultures

    NASA Technical Reports Server (NTRS)

    Burlage, Robert S.; Heitzer, Armin; Digrazia, Philip M.

    1991-01-01

    An effective on-line monitoring technique for toxic waste bioremediation using bioluminescent microorganisms has shown great potential for the description and optimization of biological processes. The lux genes of the bacterium Vibrio fischeri are used by this species to produce visible light. The lux genes can be genetically fused to the control region of a catabolic gene, with the result that bioluminescence is produced whenever the catabolic gene is induced. Thus the detection of light from a sample indicates that genetic expression from a specific gene is occurring. This technique was used to monitor biodegradation of specific contaminants from waste sites. For these studies, fusions between the lux genes and the operons for naphthalene and toluene/xylene degradation were constructed. Strains carrying one of these fusions respond sensitively and specifically to target substrates. Bioluminescence from these cultures can be rapidly measured in a nondestructive and noninvasive manner. The potential for this technique in this and other biological systems is discussed.

  12. A further tool to monitor the coffee roasting process: aroma composition and chemical indices.

    PubMed

    Ruosi, Manuela R; Cordero, Chiara; Cagliero, Cecilia; Rubiolo, Patrizia; Bicchi, Carlo; Sgorbini, Barbara; Liberto, Erica

    2012-11-14

    Coffee quality is strictly related to its flavor and aroma developed during the roasting process, which, in turn, depend on variety and origin, harvest and postharvest practices, and the time, temperature, and degree of roasting. This study investigates the possibility of combining chemical (aroma components) and physical (color) parameters through chemometric approaches to monitor the roasting process, the degree of roasting, and aroma formation by analyzing a suitable number of coffee samples from different varieties and blends. In particular, a correlation between the aroma composition of roasted coffee obtained by HS-SPME-GC-MS and the degree of roasting, defined by the color, was investigated. The results showed that aroma components are linearly correlated with coffee color, with a correlation factor of 0.9387. The study then searched for chemical indices: 11 indices were found to be linearly correlated with the color resulting from the roasting process, the most effective of them being the 5-methylfurfural/2-acetylfuran ratio (index).
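
    The kind of linearity check behind such an index can be sketched with a Pearson correlation in pure Python. The numbers below are invented for illustration, not the paper's data.

    ```python
    # Sketch: how linearly does a chemical index (e.g. the
    # 5-methylfurfural/2-acetylfuran peak-area ratio) track roast colour?
    import math

    def pearson_r(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    colour = [45, 55, 65, 75, 85, 95]              # colorimetric roast degree
    index  = [0.42, 0.55, 0.61, 0.74, 0.83, 0.97]  # hypothetical index values
    r = pearson_r(colour, index)
    assert r > 0.99  # a near-linear index is usable as a roast-degree proxy
    ```

    An index with r close to 1 against colour can then substitute a fast GC-MS measurement for the colorimetric reference.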

  13. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    A Condition Monitoring System (CMS) delivers substantial economic benefits and enables prognostic maintenance for wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive-train CMS, enabling early detection of impending failure or damage. In variable-speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient at diagnosing machine faults under time-varying conditions. Current research in CMS for drive trains focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve the sensitivity and selectivity of fault detection and prediction, thereby reducing misdetection and false alarm rates. Stationary signal processing algorithms for vibration analysis have already been reviewed extensively in the literature. In this paper, an attempt is made to review recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable-speed wind turbines.
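
    Why stationary assumptions fail can be seen with a minimal windowed feature. The sketch below computes a short-time RMS over a signal whose envelope grows with time; real CMS pipelines use far richer time-frequency tools (wavelets, EMD, and the like), so this only illustrates the windowing principle.

    ```python
    # Short-time RMS of a vibration signal in sliding windows: a single
    # whole-record RMS would hide the rising energy of this signal.
    import math

    def short_time_rms(signal, window, hop):
        feats = []
        for start in range(0, len(signal) - window + 1, hop):
            seg = signal[start:start + window]
            feats.append(math.sqrt(sum(x * x for x in seg) / window))
        return feats

    # amplitude-modulated 50 Hz tone: non-stationary by construction
    fs = 1000
    sig = [(0.1 + 0.9 * i / fs) * math.sin(2 * math.pi * 50 * i / fs)
           for i in range(fs)]
    rms = short_time_rms(sig, window=100, hop=100)
    assert rms[-1] > rms[0]  # the rising envelope shows up in the feature
    ```

    Tracking such features against shaft speed, rather than averaging over the whole record, is the basic move that non-stationary methods elaborate on.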

  14. Customizable tool for ecological data entry, assessment, monitoring, and interpretation

    USDA-ARS?s Scientific Manuscript database

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...

  15. Identification of Tool Wear when Machining of Austenitic Steels and Titanium by Miniature Machining

    NASA Astrophysics Data System (ADS)

    Pilc, Jozef; Kameník, Roman; Varga, Daniel; Martinček, Juraj; Sadilek, Marek

    2016-12-01

    The application of miniature machining is currently increasing rapidly, mainly in the biomedical industry and in the machining of hard-to-machine materials. The machinability of materials with an increased level of toughness depends on factors that determine the final state of surface integrity. Because of this, it is necessary to achieve high precision (within microns) in miniature machining. To guarantee high machining precision, it is necessary to analyse tool wear intensity in direct interaction with the given machined materials. During a long-term cutting process, various cutting wedge deformations occur, leading in most cases to rapid wear and destruction of the cutting wedge. This article deals with experimental monitoring of tool wear intensity during miniature machining.

  16. A case management tool for occupational health nurses: development, testing, and application.

    PubMed

    Mannon, J A; Conrad, K M; Blue, C L; Muran, S

    1994-08-01

    1. Case management is a process of coordinating an individual client's health care services to achieve optimal, quality care delivered in a cost effective manner. The case manager establishes a provider network, recommends treatment plans that assure quality and efficacy while controlling costs, monitors outcomes, and maintains a strong communication link among all the parties. 2. Through development of audit tools such as the one presented in this article, occupational health nurses can document case management activities and provide employers with measurable outcomes. 3. The Case Management Activity Checklist was tested using data from 61 firefighters' musculoskeletal injury cases. 4. The activities on the checklist are a step by step process: case identification/case disposition; assessment; return to work plan; resource identification; collaborative communication; and evaluation.

  17. Opportunities and challenges for structural health monitoring of radioactive waste systems and structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giurgiutiu, Victor; Mendez Torres, Adrian E.

    2013-07-01

    Radioactive waste systems and structures (RWSS) are safety-critical facilities in need of monitoring over prolonged periods of time. Structural health monitoring (SHM) is an emerging technology that aims at monitoring the state of a structure through the use of networks of permanently mounted sensors. SHM technologies have been developed primarily within the aerospace and civil engineering communities. This paper addresses the issue of transitioning the SHM concept to the monitoring of RWSS and evaluates the opportunities and challenges associated with this process. Guided-wave SHM technologies utilizing structurally-mounted piezoelectric wafer active sensors (PWAS) have a wide range of applications based on both propagating-wave and standing-wave methodologies. Hence, opportunities exist for transitioning these SHM technologies into RWSS monitoring. However, there exist certain special operational conditions specific to RWSS, such as radiation fields, caustic environments, marine environments, and chemical, mechanical, and thermal stressors. In order to address the high discharge of used nuclear fuel (UNF) and the limited space in U.S. storage pools, the Department of Energy (DOE) has adopted a 'Strategy for the Management and Disposal of Used Nuclear Fuel and High-Level Radioactive Waste' (January 2013). This strategy endorses the key principles that underpin the recommendations of the Blue Ribbon Commission on America's Nuclear Future to develop a sustainable program for deploying an integrated system capable of transporting, storing, and disposing of UNF and high-level radioactive waste from civilian nuclear power generation, defense, national security, and other activities. This will require research to develop monitoring, diagnosis, and prognosis tools that can help establish a strong technical basis for extended storage and transportation of UNF.
    Monitoring of such structures is critical for assuring the safety and security of the nation's spent nuclear fuel until a national policy for closure of the nuclear fuel cycle is defined and implemented. In addition, such tools can provide invaluable and timely information for verifying the predicted mechanical performance of RWSS (e.g., concrete or steel barriers) during off-normal occurrences and accident events such as the tsunami and earthquake that affected the Fukushima Daiichi nuclear power plant. Verifying the condition, health, and degradation behavior of RWSS over time by applying nondestructive testing (NDT), and developing nondestructive evaluation (NDE) tools for new degradation processes, will become challenging. The paper discusses some of the challenges associated with verification and diagnosis for RWSS and identifies the SHM technologies most readily available for transitioning into RWSS applications. Fundamental research objectives that should be considered for the transition of SHM technologies (e.g., radiation-hardened piezoelectric materials) to RWSS applications are discussed. The paper ends with a summary, conclusions, and suggestions for further work. (authors)

  18. Dietary Adherence Monitoring Tool for Free-living, Controlled Feeding Studies

    USDA-ARS?s Scientific Manuscript database

    Objective: To devise a dietary adherence monitoring tool for use in controlled human feeding trials involving free-living study participants. Methods: A scoring tool was devised to measure and track dietary adherence for an 8-wk randomized trial evaluating the effects of two different dietary patter...

  19. Near infrared (NIR) spectroscopy for in-line monitoring of polymer extrusion processes.

    PubMed

    Rohe, T; Becker, W; Kölle, S; Eisenreich, N; Eyerer, P

    1999-09-13

    In recent years, near infrared (NIR) spectroscopy has become an analytical tool frequently used in many chemical production processes. In particular, on-line measurements are of interest to increase process stability and to document constant product quality. Application to polymer processing, e.g., polymer extrusion, could further increase product quality. Parameters of interest are the composition of the processed polymer, moisture, and the reaction status in reactive extrusion. To this end, a transmission sensor was developed for applying NIR spectroscopy to extrusion processes. This sensor comprises fibre-optic probes and a measuring cell that can be adapted to various extruders for in-line measurements. In contrast to infrared sensors, it uses only optical quartz components. Extrusion processes at temperatures up to 300 degrees C and pressures up to 37 MPa have been investigated. Application of multivariate data analysis (e.g., partial least squares, PLS) demonstrated the performance of the system with respect to process monitoring: in the case of polymer blending, deviations between predicted and actual polymer composition were quite low (in the range of +/-0.25%). The complete system is thus suitable for harsh industrial environments and could lead to improved polymer extrusion processes.
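
    The calibration idea behind such predictions can be sketched in miniature. The real system regresses on whole NIR spectra with PLS; here a single hypothetical absorbance band and ordinary least squares stand in for the multivariate model, and all numbers are invented.

    ```python
    # Toy in-line calibration: predict blend composition from one NIR
    # absorbance value via ordinary least squares (a stand-in for PLS).

    def fit_line(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum((a - mx) * (b - my) for a, b in zip(x, y)) /
                 sum((a - mx) ** 2 for a in x))
        return slope, my - slope * mx

    absorbance = [0.10, 0.20, 0.30, 0.40]  # hypothetical NIR band, a.u.
    blend_pct  = [5.0, 10.0, 15.0, 20.0]   # known blend compositions, %
    m, c = fit_line(absorbance, blend_pct)
    predicted = m * 0.25 + c               # new in-line reading
    assert abs(predicted - 12.5) < 1e-6    # interpolates the calibration
    ```

    In production, the same fit-then-predict loop runs on full spectra at extruder speed, which is what allows deviations of the order of +/-0.25% to be flagged in-line.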

  20. Biomanufacturing process analytical technology (PAT) application for downstream processing: Using dissolved oxygen as an indicator of product quality for a protein refolding reaction.

    PubMed

    Pizarro, Shelly A; Dinges, Rachel; Adams, Rachel; Sanchez, Ailen; Winter, Charles

    2009-10-01

    Process analytical technology (PAT) is an initiative from the US FDA combining analytical and statistical tools to improve manufacturing operations and ensure regulatory compliance. This work describes the use of a continuous monitoring system for a protein refolding reaction to provide consistency in product quality and process performance across batches. A small-scale bioreactor (3 L) is used to understand the impact of aeration on refolding recombinant human vascular endothelial growth factor (rhVEGF) in a reducing environment. A reverse-phase HPLC (RP-HPLC) assay is used to assess product quality. The goal in understanding the oxygen needs of the reaction and its impact on quality is to make a product that is efficiently refolded to its native, active form with minimal oxidative degradation from batch to batch. Because this refolding process is heavily dependent on oxygen, the percent dissolved oxygen (DO) profile is explored as a PAT tool to regulate process performance at commercial manufacturing scale. A dynamic gassing-out approach using constant mass transfer (k(L)a) is used for scale-up of the aeration parameters to manufacturing-scale tanks (2,000 L, 15,000 L). The resulting DO profiles of the refolding reaction show similar trends across scales, and these are analyzed using RP-HPLC. The desired product quality attributes are then achieved through alternating air and nitrogen sparging triggered by changes in the monitored DO profile. This approach mitigates the impact of differences in equipment or feedstock components between runs, and is directly in line with the key goal of PAT to "actively manage process variability using a knowledge-based approach." (c) 2009 Wiley Periodicals, Inc.
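
    The dynamic gassing-out relationship underlying the kLa scale-up can be written down directly. After switching from nitrogen to air sparging, dissolved oxygen re-oxygenates as DO(t) = DO_sat * (1 - exp(-kLa * t)), so kLa follows from a timed DO reading. The numbers below are illustrative, not from the study.

    ```python
    # Dynamic gassing-out: forward model and kLa back-calculation.
    import math

    def do_profile(do_sat, kla, t):
        """DO re-oxygenation curve after the nitrogen-to-air switch."""
        return do_sat * (1.0 - math.exp(-kla * t))

    def estimate_kla(do_sat, do_t, t):
        """Invert the profile: kLa from one timed DO measurement."""
        return -math.log(1.0 - do_t / do_sat) / t

    do_sat = 100.0                    # % air saturation
    kla = 0.02                        # 1/s, assumed transfer coefficient
    do_after_60s = do_profile(do_sat, kla, 60.0)
    assert abs(estimate_kla(do_sat, do_after_60s, 60.0) - kla) < 1e-9
    ```

    Matching kLa (rather than, say, stirrer speed) across the 3 L, 2,000 L, and 15,000 L vessels is what keeps the DO profiles, and hence the oxygen exposure of the refold, comparable across scales.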

  1. A method to assess how interactive water simulation tools influence transdisciplinary decision-making processes in water management

    NASA Astrophysics Data System (ADS)

    Leskens, Johannes

    2015-04-01

    In modern water management, transdisciplinary work sessions are often organized in which various stakeholders participate to jointly define problems, choose measures, and divide responsibilities for taking action. The stakeholders involved are, for example, policy analysts or decision-makers from municipalities, water boards or provinces, representatives of pressure groups, and researchers from knowledge institutes. Parallel to this increasing attention for transdisciplinary work sessions, we see a growing availability of interactive IT tools that can be applied during these sessions. For example, dynamic flood risk maps have recently become available that allow users during a work session to instantaneously assess the impact of storm surges or dam breaches, displayed on digital maps. Other examples are serious games, realistic visualizations, and participatory simulations. However, the question is whether and how these interactive IT tools contribute to better decision-making. To assess this, we take the process of knowledge construction during a work session as a measure of the quality of decision-making. Knowledge construction can be defined as the process in which the ideas, perspectives, and opinions of different stakeholders, each having their own expertise and experience, are confronted with each other and new shared meanings towards water management issues are created. We present an assessment method to monitor the process of knowledge construction during work sessions in water management in which interactive IT tools are being used. The assessment method is based on a literature review focusing on studies in which knowledge construction was monitored in contexts other than water management. To test the applicability of the assessment method, we applied it during a multi-stakeholder work session in Westland, located in the southwest of the Netherlands. The discussions during the work session were recorded on camera.
    All statements expressed by the various members of the stakeholder session were classified according to our assessment method. We can draw the following preliminary conclusions. First, the case study showed that the method was useful for tracing the knowledge construction process over time, in terms of the content and cognitive level of statements and the interaction, attention, and response between stakeholders. It was observed that the various aspects of knowledge construction were all influenced by the use of the 3Di model. The model focused discussions on technical issues of flood risk management, non-flood specialists were able to participate in discussions and in suggesting solutions, and more topics could be evaluated compared with non-interactive flood maps. Second, the method is considered useful as a benchmark for different interactive IT tools. The method is also considered useful for gaining insight into how to optimally set up multi-stakeholder meetings in which interactive IT tools are being used. Further, the method can provide model developers insight into how to better meet the technical requirements of interactive IT tools to support the knowledge construction process during multi-stakeholder meetings.

  2. Characterization of delamination and transverse cracking in graphite/epoxy laminates by acoustic emission

    NASA Technical Reports Server (NTRS)

    Garg, A.; Ishaei, O.

    1983-01-01

    Efforts to characterize and differentiate between two major failure processes in graphite/epoxy composites, transverse cracking and Mode I delamination, are described. Representative laminates were tested in uniaxial tension and flexure. The failure processes were monitored and identified by acoustic emission (AE). The effect of moisture on AE was also investigated. Each damage process was found to have a distinctive AE output that is significantly affected by moisture conditions. It is concluded that AE can serve as a useful tool for detecting and identifying failure modes in composite structures in laboratory and in-service environments.

  3. State of the art in on-line techniques coupled to flow injection analysis FIA/on-line- a critical review

    PubMed Central

    Puchades, R.; Maquieira, A.; Atienza, J.; Herrero, M. A.

    1990-01-01

    Flow injection analysis (FIA) has emerged as an increasingly used laboratory tool in chemical analysis. Employment of the technique for on-line sample treatment and on-line measurement in chemical process control is a growing trend. This article reviews the recent applications of FIA. Most papers refer to on-line sample treatment. Although FIA is very well suited to continuous on-line process monitoring, few examples have been found in this area; most of them have been applied to water treatment or fermentation processes. PMID:18925271

  4. Lean Six Sigma applied to a process innovation in a Mexican health institute's imaging department.

    PubMed

    Garcia-Porres, J; Ortiz-Posadas, M R; Pimentel-Aguilar, A B

    2008-01-01

    Delivery of services to a patient has to be given with an acceptable measure of quality that can be monitored through the patient's satisfaction. The objective of this work was to innovate processes by eliminating waste and non-value-added work at the Imaging Department of the National Institute of Respiratory Diseases (INER, for its Spanish acronym) in Mexico City, to decrease the time a patient spends in a study and increase satisfaction. This innovation will be carried out using Lean Six Sigma tools, applied in a pilot program.

  5. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoring programs

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

    Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues, and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g., survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency, and interoperability of information across geopolitical boundaries and organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  6. The evolution of monitoring system: the INFN-CNAF case study

    NASA Astrophysics Data System (ADS)

    Bovina, Stefano; Michelotto, Diego

    2017-10-01

    Over the past two years, the operations at CNAF, the ICT center of the Italian Institute for Nuclear Physics, have undergone significant changes. The adoption of configuration management tools, such as Puppet, and the constant growth of dynamic and cloud infrastructures have led us to investigate a new monitoring approach. The present work deals with the centralization of the monitoring service at CNAF through a scalable and highly configurable monitoring infrastructure. The selection of tools was made taking into account the following user requirements: (I) adaptability to dynamic infrastructures, (II) ease of configuration and maintenance, with the capability to provide more flexibility, (III) compatibility with the existing monitoring system, and (IV) re-usability and ease of access to information and data. The CNAF monitoring infrastructure and its components are described below: Sensu as the monitoring router, InfluxDB as the time-series database storing data gathered from sensors, Uchiwa as the monitoring dashboard, and Grafana as a tool to create dashboards and visualize time-series metrics.

  7. Updating Parameters for Volcanic Hazard Assessment Using Multi-parameter Monitoring Data Streams And Bayesian Belief Networks

    NASA Astrophysics Data System (ADS)

    Odbert, Henry; Aspinall, Willy

    2014-05-01

    Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat.
We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
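
    The elementary operation a BBN performs at each node is a Bayes update. The toy below revises the probability of volcanic unrest given one monitoring observation; the prior and likelihoods are invented for illustration, and a real network chains many such conditional nodes together.

    ```python
    # One-node Bayesian update: P(unrest | observation) from a prior and
    # the likelihood of the observation under each hypothesis.

    def bayes_update(prior, p_obs_given_h, p_obs_given_not_h):
        evidence = p_obs_given_h * prior + p_obs_given_not_h * (1.0 - prior)
        return p_obs_given_h * prior / evidence

    p_unrest = 0.10                  # prior belief in unrest (assumed)
    p_unrest = bayes_update(p_unrest,
                            p_obs_given_h=0.8,      # P(high SO2 | unrest)
                            p_obs_given_not_h=0.1)  # P(high SO2 | quiet)
    # posterior: 0.8*0.1 / (0.8*0.1 + 0.1*0.9) = 0.08/0.17, about 0.47
    ```

    The update only moves belief as far as the likelihood ratio warrants, which is exactly the "only to the extent warranted by the strength of the available evidence" property the abstract emphasizes.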

  8. Combining Volcano Monitoring Timeseries Analyses with Bayesian Belief Networks to Update Hazard Forecast Estimates

    NASA Astrophysics Data System (ADS)

    Odbert, Henry; Hincks, Thea; Aspinall, Willy

    2015-04-01

    Volcanic hazard assessments must combine information about the physical processes of hazardous phenomena with observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat.
We show how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.

  9. Electrochemical Biosensors: A Solution to Pollution Detection with Reference to Environmental Contaminants.

    PubMed

    Hernandez-Vargas, Gustavo; Sosa-Hernández, Juan Eduardo; Saldarriaga-Hernandez, Sara; Villalba-Rodríguez, Angel M; Parra-Saldivar, Roberto; Iqbal, Hafiz M N

    2018-03-24

    The increasing environmental pollution with particular reference to emerging contaminants, toxic heavy elements, and other hazardous agents is a serious concern worldwide. Considering this global issue, there is an urgent need to design and develop strategic measuring techniques with higher efficacy and precision to detect a broader spectrum of numerous contaminants. The development of precise instruments can further help in real-time and in-process monitoring of the generation and release of environmental pollutants from different industrial sectors. Moreover, real-time monitoring can also reduce the excessive consumption of several harsh chemicals and reagents with an added advantage of on-site determination of contaminant composition prior to discharge into the environment. With key scientific advances, electrochemical biosensors have gained considerable attention to solve this problem. Electrochemical biosensors can be an excellent fit as an analytical tool for monitoring programs to implement legislation. Herein, we reviewed the current trends in the use of electrochemical biosensors as novel tools to detect various contaminant types including toxic heavy elements. A particular emphasis was given to screen-printed electrodes, nanowire sensors, and paper-based biosensors and their role in the pollution detection processes. Towards the end, the work is wrapped up with concluding remarks and future perspectives. In summary, electrochemical biosensors and related areas such as bioelectronics, and (bio)-nanotechnology seem to be growing areas that will have a marked influence on the development of new bio-sensing strategies in future studies.

  10. Global Agricultural Monitoring (GLAM) using MODAPS and LANCE Data Products

    NASA Astrophysics Data System (ADS)

    Anyamba, A.; Pak, E. E.; Majedi, A. H.; Small, J. L.; Tucker, C. J.; Reynolds, C. A.; Pinzon, J. E.; Smith, M. M.

    2012-12-01

    The Global Inventory Modeling and Mapping Studies / Global Agricultural Monitoring (GIMMS GLAM) system is a web-based geographic application that offers Moderate Resolution Imaging Spectroradiometer (MODIS) imagery and user interface tools to query data and plot MODIS NDVI time series. The system processes near real-time and science-quality Terra and Aqua MODIS 8-day composited datasets. These datasets are derived from the MOD09 and MYD09 surface reflectance products, which are generated and provided by the NASA/GSFC Land and Atmosphere Near Real-time Capability for EOS (LANCE) and the NASA/GSFC MODIS Adaptive Processing System (MODAPS). The GIMMS GLAM system is developed and provided by the NASA/GSFC GIMMS group for the U.S. Department of Agriculture / Foreign Agricultural Service / International Production Assessment Division (USDA/FAS/IPAD) Global Agricultural Monitoring (GLAM) project. The USDA/FAS/IPAD mission is to provide objective, timely, and regular assessments of the global agricultural production outlook and of conditions affecting global food security. The system was developed to improve USDA/FAS/IPAD capabilities for making operational quantitative estimates of crop production and yield based on satellite-derived data. The GIMMS GLAM system offers 1) web map imagery including Terra and Aqua MODIS 8-day composited NDVI, NDVI percent anomaly, and SWIR-NIR-Red band combinations; 2) web map overlays including administrative and 0.25-degree Land Information System (LIS) shape boundaries and cropland cover masks; and 3) user interface tools to select features, query data, plot, and download MODIS NDVI time series.
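
    The two quantities the interface maps reduce to short formulas: NDVI from red and near-infrared surface reflectance, and the percent anomaly of a current NDVI against a long-term mean for the same compositing period. The reflectance and baseline values below are illustrative only.

    ```python
    # NDVI and NDVI percent anomaly, the core quantities behind the
    # GLAM map layers.

    def ndvi(nir, red):
        return (nir - red) / (nir + red)

    def percent_anomaly(current, climatology_mean):
        return 100.0 * (current - climatology_mean) / climatology_mean

    current = ndvi(nir=0.45, red=0.09)        # dense vegetation, ~0.67
    baseline = 0.60                           # assumed long-term mean
    anom = percent_anomaly(current, baseline) # ~ +11%: greener than normal
    ```

    Positive anomalies flag above-normal vegetation vigor and negative ones potential crop stress, which is what makes the anomaly layer useful for production outlook assessments.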

  11. Optical coherence tomography as a novel tool for in-line monitoring of a pharmaceutical film-coating process.

    PubMed

    Markl, Daniel; Hannesschläger, Günther; Sacher, Stephan; Leitner, Michael; Khinast, Johannes G

    2014-05-13

    Optical coherence tomography (OCT) is a contact-free, non-destructive, high-resolution imaging technique based on low-coherence interferometry. This study investigates the application of spectral-domain OCT as an in-line quality control tool for monitoring pharmaceutical film-coated tablets. OCT images of several commercially available film-coated tablets of different shapes, formulations, and coating thicknesses were captured off-line using two OCT systems with centre wavelengths of 830 nm and 1325 nm. Based on the off-line image evaluation, another OCT system operating at a shorter wavelength was selected to study the feasibility of OCT as an in-line monitoring method. Since motion artefacts can occur in spectral-domain OCT as a result of tablet or sensor-head movement, a basic understanding of the relationship between tablet speed and motion effects is essential for correctly quantifying and qualifying the tablet coating. Experimental data were acquired by moving the sensor head of the OCT system across a static tablet bed. Although examining the homogeneity of the coating became more difficult with increasing transverse speed of the tablets, determination of the coating thickness remained highly accurate at speeds up to 0.7 m/s. The presented OCT setup enables in-line investigation of intra- and inter-tablet coating uniformity during the coating process.
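    Two textbook relationships underpin quantitative coating measurement with OCT: the measured optical path length must be divided by the coating's group refractive index to obtain physical thickness, and the achievable axial resolution is set by the source's centre wavelength and bandwidth. A minimal sketch, assuming a Gaussian source spectrum (the refractive-index value in practice must come from the actual coating formulation):

```python
import math

def physical_thickness(optical_path, group_index):
    """Convert an OCT optical path length to physical coating thickness;
    units follow the input (e.g., micrometres)."""
    return optical_path / group_index

def axial_resolution(center_wavelength, bandwidth, group_index=1.0):
    """Axial resolution of a Gaussian-spectrum OCT source:
    (2 ln 2 / pi) * lambda0^2 / delta_lambda, divided by the group
    index when imaging inside the coating material."""
    return (2.0 * math.log(2) / math.pi) * center_wavelength**2 / bandwidth / group_index
```

    The bandwidth term explains the wavelength trade-off discussed above: at a fixed bandwidth, a shorter centre wavelength yields finer axial resolution.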

  12. Coral Reef Surveillance: Infrared-Sensitive Video Surveillance Technology as a New Tool for Diurnal and Nocturnal Long-Term Field Observations.

    PubMed

    Dirnwoeber, Markus; Machan, Rudolf; Herler, Juergen

    2012-10-31

    Direct field observations of fine-scaled biological processes and interactions in the benthic community of corals and associated reef organisms (e.g., feeding, reproduction, mutualistic or agonistic behavior, behavioral responses to changing abiotic factors) usually involve a disturbing intervention. Modern digital camcorders without inflexible land- or ship-based cable connections, such as the GoPro camera, enable undisturbed, unmanned, stationary close-up observations. Such observations, however, are also very time-limited (~3 h), and full 24-h recordings throughout day and night, including nocturnal observations without artificial daylight illumination, are not possible. Herein we introduce the application of modern standard video surveillance technology, with the main objective of providing a tool for monitoring coral reefs or other sessile and mobile organisms for periods of 24 h and longer. This system includes nocturnal close-up observation with miniature infrared (IR)-sensitive cameras and separate high-power IR LEDs. Integrating this easy-to-set-up, portable remote-sensing equipment into coral reef research is expected to significantly advance our understanding of fine-scaled biotic processes on coral reefs. Rare events and long-lasting processes can easily be recorded, in situ experiments can be monitored live on land, and nocturnal IR observations reveal undisturbed behavior. The options and equipment choices in IR-sensitive surveillance technology are numerous and subject to steadily increasing technical supply and quality at decreasing prices. Accompanied by short video examples, this report introduces a radio-transmission system for simultaneous recording and real-time monitoring of multiple cameras with synchronized timestamps, and a surface-independent underwater recording system.

  13. Coral Reef Surveillance: Infrared-Sensitive Video Surveillance Technology as a New Tool for Diurnal and Nocturnal Long-Term Field Observations

    PubMed Central

    Dirnwoeber, Markus; Machan, Rudolf; Herler, Juergen

    2014-01-01

    Direct field observations of fine-scaled biological processes and interactions in the benthic community of corals and associated reef organisms (e.g., feeding, reproduction, mutualistic or agonistic behavior, behavioral responses to changing abiotic factors) usually involve a disturbing intervention. Modern digital camcorders without inflexible land- or ship-based cable connections, such as the GoPro camera, enable undisturbed, unmanned, stationary close-up observations. Such observations, however, are also very time-limited (~3 h), and full 24-h recordings throughout day and night, including nocturnal observations without artificial daylight illumination, are not possible. Herein we introduce the application of modern standard video surveillance technology, with the main objective of providing a tool for monitoring coral reefs or other sessile and mobile organisms for periods of 24 h and longer. This system includes nocturnal close-up observation with miniature infrared (IR)-sensitive cameras and separate high-power IR LEDs. Integrating this easy-to-set-up, portable remote-sensing equipment into coral reef research is expected to significantly advance our understanding of fine-scaled biotic processes on coral reefs. Rare events and long-lasting processes can easily be recorded, in situ experiments can be monitored live on land, and nocturnal IR observations reveal undisturbed behavior. The options and equipment choices in IR-sensitive surveillance technology are numerous and subject to steadily increasing technical supply and quality at decreasing prices. Accompanied by short video examples, this report introduces a radio-transmission system for simultaneous recording and real-time monitoring of multiple cameras with synchronized timestamps, and a surface-independent underwater recording system. PMID:24829763

  14. Developing enterprise tools and capacities for large-scale natural resource monitoring: A visioning workshop

    USGS Publications Warehouse

    Bayer, Jennifer M.; Weltzin, Jake F.; Scully, Rebecca A.

    2017-01-01

    Objectives of the workshop were: 1) identify resources that support natural resource monitoring programs working across the data life cycle; 2) prioritize desired capacities and tools to facilitate monitoring design and implementation; 3) identify standards and best practices that improve discovery, accessibility, and interoperability of data across programs and jurisdictions; and 4) contribute to an emerging community of practice focused on natural resource monitoring.

  15. Design tool for inventory and monitoring

    Treesearch

    Charles T. Scott; Renate Bush

    2009-01-01

    Forest survey planning typically begins by determining the area to be sampled and the attributes to be measured. All too often the data are collected but underutilized because they did not address the critical management questions. The Design Tool for Inventory and Monitoring (DTIM) is being developed by the National Inventory and Monitoring Applications Center in...

  16. Biological Effects–Based Tools for Monitoring Impacted Surface Waters in the Great Lakes: A Multiagency Program in Support of the Great Lakes Restoration Initiative

    EPA Science Inventory

    There is increasing demand for the implementation of effects-based monitoring and surveillance (EBMS) approaches in the Great Lakes Basin to complement traditional chemical monitoring. Herein, we describe an ongoing multiagency effort to develop and implement EBMS tools, particul...

  17. Web-Based Mathematics Progress Monitoring in Second Grade

    ERIC Educational Resources Information Center

    Salaschek, Martin; Souvignier, Elmar

    2014-01-01

    We examined a web-based mathematics progress monitoring tool for second graders. The tool monitors the learning progress of two competences, number sense and computation. A total of 414 students from 19 classrooms in Germany were checked every 3 weeks from fall to spring. Correlational analyses indicate that alternate-form reliability was adequate…

  18. Optimal distribution of borehole geophones for monitoring CO2-injection-induced seismicity

    NASA Astrophysics Data System (ADS)

    Huang, L.; Chen, T.; Foxall, W.; Wagoner, J. L.

    2016-12-01

    The U.S. DOE initiative National Risk Assessment Partnership (NRAP) aims to develop quantitative risk assessment methodologies for carbon capture, utilization, and storage (CCUS). As part of the tasks of the Strategic Monitoring Group of NRAP, we develop a tool for the optimal design of a borehole geophone distribution for monitoring CO2-injection-induced seismicity. The tool consists of a number of steps, including building a geophysical model for a given CO2 injection site, defining target monitoring regions within CO2 injection/migration zones, generating synthetic seismic data, specifying acceptable uncertainties in the input data, and determining the optimal distribution of borehole geophones. We use a synthetic geophysical model as an example to demonstrate the capability of our new tool to design an optimal, cost-effective passive seismic monitoring network using borehole geophones. The model is built on the geologic features found at the Kimberlina CCUS pilot site in the southern San Joaquin Valley, California. This tool can provide CCUS operators with a guideline for cost-effective microseismic monitoring of geologic carbon storage and utilization.
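    The abstract does not spell out the optimisation algorithm itself. As a purely illustrative stand-in (not the NRAP tool's method), a greedy placement could repeatedly add the candidate geophone position that most reduces the mean distance from target event locations to their nearest sensor, a crude proxy for location uncertainty; all names and the cost function here are assumptions:

```python
import math

def greedy_geophone_placement(candidates, targets, n_sensors):
    """Greedily pick n_sensors positions from `candidates` (coordinate
    tuples) so that target seismic event locations are, on average, as
    close as possible to their nearest geophone."""
    chosen = []
    for _ in range(n_sensors):
        best, best_cost = None, float("inf")
        for cand in candidates:
            if cand in chosen:
                continue
            trial = chosen + [cand]
            # mean distance from each target event to its nearest sensor
            cost = sum(min(math.dist(t, s) for s in trial)
                       for t in targets) / len(targets)
            if cost < best_cost:
                best, best_cost = cand, cost
        chosen.append(best)
    return chosen
```

    A real design tool would replace this distance proxy with location-uncertainty estimates derived from the site's velocity model and the synthetic seismic data, as the workflow above describes.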

  19. Self-Monitoring Symptoms in Glaucoma: A Feasibility Study of a Web-Based Diary Tool

    PubMed Central

    McDonald, Leanne; Glen, Fiona C.; Taylor, Deanna J.

    2017-01-01

    Purpose. Glaucoma patients annually spend only a few hours in an eye clinic but more than 5000 waking hours engaged in everything else. We propose that patients could self-monitor changes in visual symptoms, providing valuable between-clinic information; we test the hypothesis that this is feasible using a web-based diary tool. Methods. Ten glaucoma patients with a range of visual field loss took part in an eight-week pilot study. After completing a series of baseline tests, volunteers were prompted to monitor symptoms every three days and complete a diary about their vision during daily life using a bespoke web-based diary tool. Response to an end-of-study questionnaire about the usefulness of the exercise was the main outcome measure. Results. Eight of the 10 patients rated the monitoring scheme "valuable" or "very valuable." The completion rate for items was excellent (96%). Themes from a qualitative synthesis of the diary entries related to behavioural aspects of glaucoma. One patient concluded that a constant focus on monitoring symptoms led to negative feelings. Conclusions. A web-based diary tool for monitoring self-reported glaucoma symptoms is practically feasible. The tool must be carefully designed to ensure that participants benefit and that it does not increase anxiety. PMID:28546876

  20. Monitoring Object Library Usage and Changes

    NASA Technical Reports Server (NTRS)

    Owen, R. K.; Craw, James M. (Technical Monitor)

    1995-01-01

    The NASA Ames Numerical Aerodynamic Simulation program Aeronautics Consolidated Supercomputing Facility (NAS/ACSF) supercomputing center serves over 1600 users and has numerous analysts with root access. Several tools have been developed to monitor object library usage and changes. Some of the tools perform "noninvasive" monitoring, while other tools implement run-time logging even for object-only libraries. The run-time logging identifies who is using what, and when. The benefits are that real usage can be measured, unused libraries can be discontinued, and training and optimization efforts can be focused on those numerical methods that are actually used. An overview of the tools will be given and the results will be discussed.
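    The abstract does not describe how the run-time logging was implemented. As a language-agnostic illustration of the "who, when, what" idea, a wrapper that appends a usage record on every call might look like the following sketch (the log-file name and function names are hypothetical, not the NAS tools' interface):

```python
import functools
import getpass
import time

def log_usage(logfile):
    """Decorator: append a 'who, when, what' record each time the
    wrapped routine is invoked -- run-time usage logging in miniature."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            with open(logfile, "a") as f:
                f.write("%s %s %s.%s\n" % (
                    time.strftime("%Y-%m-%d %H:%M:%S"),
                    getpass.getuser(),          # who
                    fn.__module__, fn.__name__  # what
                ))
            return fn(*args, **kwargs)          # then run as normal
        return inner
    return wrap
```

    Applied to the entry points of a library's bindings, such a log shows which routines are actually exercised and by whom, supporting exactly the kind of usage audit described above: unused routines surface as absences in the log.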
