Sample records for statistical process monitoring

  1. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    PubMed

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A

    2015-12-01

    Cocrystals are crystalline structures composed of two or more compounds that are solid at room temperature and held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability, and stability, while retaining the bioactivity of the active pharmaceutical ingredient. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to follow the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amount of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  2. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... Statistical Process Controls for Blood Establishments; Public Workshop. AGENCY: Food and Drug Administration ... workshop entitled "Statistical Process Controls for Blood Establishments." The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  3. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal batches in production, and 2 test batches were monitored with PC score, DModX and Hotelling's T² control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. The application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. This established process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
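
    For readers unfamiliar with the control charts named above, the following minimal sketch (not the authors' model) shows how Hotelling's T² and a DModX-like residual statistic can be derived from a PCA model fitted to normal-operating-condition data. The function names, the empirical 99th-percentile control limits, and the use of plain NumPy are illustrative assumptions.

```python
import numpy as np

def fit_mspc(X_noc, n_components=2):
    """Fit a PCA-based MSPC model on normal-operating-condition data (rows = observations)."""
    mu, sigma = X_noc.mean(axis=0), X_noc.std(axis=0, ddof=1)
    Z = (X_noc - mu) / sigma
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                       # loadings (variables x components)
    T = Z @ P                                     # scores of the training data
    lam = S[:n_components] ** 2 / (len(Z) - 1)    # variance captured by each component
    return {"mu": mu, "sigma": sigma, "P": P, "lam": lam,
            "T2_lim": np.percentile((T ** 2 / lam).sum(axis=1), 99),        # empirical limits
            "SPE_lim": np.percentile(((Z - T @ P.T) ** 2).sum(axis=1), 99)}

def monitor(model, x_new):
    """Hotelling's T2 and the squared prediction error (a DModX-like residual) for one sample."""
    z = (x_new - model["mu"]) / model["sigma"]
    t = z @ model["P"]
    t2 = float((t ** 2 / model["lam"]).sum())
    spe = float(((z - t @ model["P"].T) ** 2).sum())
    return t2, spe, t2 > model["T2_lim"], spe > model["SPE_lim"]
```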

  4. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control techniques are normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. This paper compares the advantages and disadvantages of the two methods and introduces the necessity and feasibility of their integration and fusion. An approach is then put forward that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information of the wheel dressing process, the reason for machining quality fluctuation was obtained. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.

  5. Monitoring Satellite Data Ingest and Processing for the Atmosphere Science Investigator-led Processing Systems (SIPS)

    NASA Astrophysics Data System (ADS)

    Witt, J.; Gumley, L.; Braun, J.; Dutcher, S.; Flynn, B.

    2017-12-01

    The Atmosphere SIPS (Science Investigator-led Processing Systems) team at the Space Science and Engineering Center (SSEC), which is funded through a NASA contract, creates Level 2 cloud and aerosol products from the VIIRS instrument aboard the S-NPP satellite. In order to monitor the ingest and processing of files, we have developed an extensive monitoring system to observe every step in the process. The status grid is used for real-time monitoring and shows the current state of the system, including what files we have and whether or not we are meeting our latency requirements. Our snapshot tool displays the state of the system in the past. It displays which files were available at a given hour and is used for historical and backtracking purposes. In addition to these grid-like tools we have created histograms and other statistical graphs for tracking processing and ingest metrics, such as total processing time, job queue time, and latency statistics.

  6. Particle monitoring and control in vacuum processing equipment

    NASA Astrophysics Data System (ADS)

    Borden, Peter G., Dr.; Gregg, John

    1989-10-01

    Particle contamination during vacuum processes has emerged as the largest single source of yield loss in VLSI manufacturing. While a number of tools have been available to help understand the sources and nature of this contamination, only recently has it been possible to monitor free particle levels within vacuum equipment in real time. As a result, a better picture is available of how particle contamination can affect a variety of processes. This paper reviews some of the work that has been done to monitor particles in vacuum loadlocks and in processes such as etching, sputtering and ion implantation. The aim has been to make free particles in vacuum equipment a measurable process parameter. Achieving this allows particles to be controlled using statistical process control. It will be shown that free particle levels in load locks correlate to wafer surface counts, device yield and process conditions, but that these levels are considerably higher during production than when dummy wafers are run to qualify a system. It will also be shown how real-time free particle monitoring can be used to monitor and control cleaning cycles, how major episodic events can be detected, and how data can be gathered in a format suitable for statistical process control.

  7. Hybrid statistical testing for nuclear material accounting data and/or process monitoring data in nuclear safeguards

    DOE PAGES

    Burr, Tom; Hamada, Michael S.; Ticknor, Larry; ...

    2015-01-01

    The aim of nuclear safeguards is to ensure that special nuclear material is used for peaceful purposes. Historically, nuclear material accounting (NMA) has provided the quantitative basis for monitoring for nuclear material loss or diversion, and process monitoring (PM) data is collected by the operator to monitor the process. PM data typically support NMA in various ways, often by providing a basis to estimate some of the in-process nuclear material inventory. We develop options for combining PM residuals and NMA residuals (residual = measurement - prediction), using a hybrid of period-driven and data-driven hypothesis testing. The modified statistical tests can be used on time series of NMA residuals (the NMA residual is the familiar material balance), or on a combination of PM and NMA residuals. The PM residuals can be generated on a fixed time schedule or as events occur.
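
    As a generic illustration only (not the authors' specific hybrid tests), the sketch below combines two standardized residual streams in two complementary ways: a period-driven chi-square test over a balance period and a data-driven CUSUM applied as residuals arrive. The function names, the reference value k, and the thresholds are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def period_test(nma_res, pm_res, sd_nma, sd_pm, alpha=0.05):
    """Period-driven check: pooled sum of squared standardized residuals vs. a chi-square limit."""
    z = np.concatenate([np.asarray(nma_res) / sd_nma, np.asarray(pm_res) / sd_pm])
    stat = float((z ** 2).sum())
    return stat, stat > chi2.ppf(1 - alpha, df=z.size)

def cusum_alarm(z_stream, k=0.5, h=5.0):
    """Data-driven check: one-sided CUSUM over a stream of standardized residuals."""
    s, alarms = 0.0, []
    for i, z in enumerate(z_stream):
        s = max(0.0, s + z - k)
        if s > h:
            alarms.append(i)   # index where the cumulative evidence crossed the threshold
            s = 0.0
    return alarms
```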

  8. Control charts for monitoring accumulating adverse event count frequencies from single and multiple blinded trials.

    PubMed

    Gould, A Lawrence

    2016-12-30

    Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
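
    To make the CUSUM idea concrete, here is a minimal frequentist sketch of a one-sided CUSUM for Poisson event counts, tuned to detect a shift from an in-control rate lam0 to an elevated rate lam1. It is not the Bayesian construction described in the article; the rates, threshold h and restart convention are illustrative assumptions.

```python
import numpy as np

def poisson_cusum(counts, lam0, lam1, h):
    """One-sided CUSUM for Poisson counts, sensitive to a rate shift from lam0 to lam1 > lam0.

    Uses the log-likelihood-ratio increment  x * log(lam1 / lam0) - (lam1 - lam0)."""
    k = np.log(lam1 / lam0)
    s, signals = 0.0, []
    for t, x in enumerate(counts):
        s = max(0.0, s + x * k - (lam1 - lam0))
        if s > h:
            signals.append(t)   # evidence that the event rate has increased
            s = 0.0             # restart after a signal (one common convention)
    return signals

# Example: weekly adverse event counts, in-control mean 2 per week, alert when the rate ~doubles
print(poisson_cusum([1, 3, 2, 2, 5, 6, 7, 4], lam0=2.0, lam1=4.0, h=4.0))
```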

  9. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, being a twin screw high shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T² and Q residual statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
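
    A common (though not necessarily the authors') way to build such contribution plots is to decompose the Q statistic into per-variable squared residuals at the flagged time point; variables with large contributions are candidate causes, bearing in mind the smearing effect between correlated variables noted elsewhere in this list. The sketch below assumes a loading matrix P and an observation z that has already been centred and scaled.

```python
import numpy as np

def q_contributions(z, P):
    """Per-variable contributions to the Q (squared prediction error) statistic.

    z : one observation, centred and scaled with the normal-operating-condition statistics
    P : (n_variables x n_components) PCA loading matrix from the monitoring model
    """
    e = z - (z @ P) @ P.T      # residual after projecting onto the PCA model plane
    return e ** 2              # Q = sum of these; large entries point to suspect variables

# Example: rank variables at the sample where Q exceeded its limit
# suspects = np.argsort(q_contributions(z_fault, P))[::-1][:5]
```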

  10. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demands of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end we have developed an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make process behavior easy to monitor, predict and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  11. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    ERIC Educational Resources Information Center

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…
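
    As background for the Shewhart mean and standard deviation charts mentioned above, here is a generic sketch (not tied to the article's rating data) of X-bar and S chart limits computed from rational subgroups. The unbiasing-constant formulas are the standard Shewhart ones; the function name and data layout are assumptions.

```python
import numpy as np
from math import gamma, sqrt

def xbar_s_limits(subgroups):
    """Shewhart X-bar and S chart centre lines and 3-sigma limits from rational subgroups.

    subgroups : 2-D array, one row per subgroup (e.g., ratings of the same set of responses per occasion)
    """
    X = np.asarray(subgroups, dtype=float)
    n = X.shape[1]
    xbar, s = X.mean(axis=1), X.std(axis=1, ddof=1)
    xbarbar, sbar = xbar.mean(), s.mean()
    c4 = sqrt(2.0 / (n - 1)) * gamma(n / 2.0) / gamma((n - 1) / 2.0)    # unbiasing constant
    A3 = 3.0 / (c4 * sqrt(n))
    B3 = max(0.0, 1 - 3 * sqrt(1 - c4 ** 2) / c4)
    B4 = 1 + 3 * sqrt(1 - c4 ** 2) / c4
    return {"xbar_CL": xbarbar, "xbar_UCL": xbarbar + A3 * sbar, "xbar_LCL": xbarbar - A3 * sbar,
            "s_CL": sbar, "s_UCL": B4 * sbar, "s_LCL": B3 * sbar}
```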

  12. Statistically Qualified Neuro-Analytic system and Method for Process Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    1998-11-04

    An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.

  13. Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring

    DOE PAGES

    Farrar, Charles R.; Allen, David W.; Park, Gyuhae; ...

    2006-01-01

    The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.

  14. Errors in patient specimen collection: application of statistical process control.

    PubMed

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
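
    The article points to a downloadable spreadsheet rather than code, but the underlying chart is conceptually simple; a minimal sketch of a p-chart with sample-size-dependent 3-sigma limits for monthly mislabeled-sample proportions is shown below. The function name and the 3-sigma convention are assumptions, not the article's exact spreadsheet logic.

```python
import numpy as np

def p_chart(errors, samples):
    """p-chart for the per-period proportion of mislabeled/miscollected samples.

    errors, samples : arrays of event counts and total samples drawn per period.
    Returns proportions, the centre line, and 3-sigma limits (limits vary with sample size)."""
    errors, samples = np.asarray(errors, float), np.asarray(samples, float)
    p = errors / samples
    pbar = errors.sum() / samples.sum()                  # centre line
    sigma = np.sqrt(pbar * (1 - pbar) / samples)
    ucl, lcl = pbar + 3 * sigma, np.maximum(0.0, pbar - 3 * sigma)
    return p, pbar, ucl, lcl, (p > ucl) | (p < lcl)      # last element flags out-of-control periods
```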

  15. Statistical process control in Deep Space Network operation

    NASA Technical Reports Server (NTRS)

    Hodder, J. A.

    2002-01-01

    This report describes how the Deep Space Mission System (DSMS) Operations Program Office at the Jet Propulsion Laboratory (JPL) uses Statistical Process Control (SPC) to monitor performance and evaluate initiatives for improving processes on the National Aeronautics and Space Administration's (NASA) Deep Space Network (DSN).

  16. Risk-based Methodology for Validation of Pharmaceutical Batch Processes.

    PubMed

    Wiles, Frederick

    2013-01-01

    In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.
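
    As a small illustration of the capability and individuals/moving-range calculations the article builds on (a sketch, not the article's full risk-based methodology), the snippet below computes I-MR chart limits and a Cpk-style capability index for a single critical quality attribute. The specification limits, the constants d2 = 1.128 and D4 = 3.267 for a moving range of 2, and the function name are the standard textbook choices assumed here.

```python
import numpy as np

def imr_and_cpk(x, lsl, usl):
    """Individuals / moving-range control limits and a Cpk-style capability index for one CQA."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))                        # moving ranges of consecutive batches
    mrbar = mr.mean()
    sigma_w = mrbar / 1.128                        # within-batch sigma estimate via d2 (n = 2)
    centre = x.mean()
    i_limits = (centre - 2.66 * mrbar, centre + 2.66 * mrbar)   # 2.66 = 3 / d2
    mr_ucl = 3.267 * mrbar                                      # D4 for n = 2
    cpk = min(usl - centre, centre - lsl) / (3 * sigma_w)
    return {"I_limits": i_limits, "MR_UCL": mr_ucl, "Cpk": cpk}
```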

  17. Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing.

    PubMed

    Roy, Kevin; Undey, Cenk; Mistretta, Thomas; Naugle, Gregory; Sodhi, Manbir

    2014-01-01

    Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add to delays and corrective actions may require additional setup times. Moreover, this conventional approach does not take interactive effects of process variables into account and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers.

  18. Statistical Process Control in the Practice of Program Evaluation.

    ERIC Educational Resources Information Center

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  19. Feasibility Study on the Use of On-line Multivariate Statistical Process Control for Safeguards Applications in Natural Uranium Conversion Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ladd-Lively, Jennifer L

    2014-01-01

    The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP. However, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case or normal operating conditions of the PCA model were established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall this study was successful at meeting the stated objective.

  20. Nuclear Explosion and Infrasound Event Resources of the SMDC Monitoring Research Program

    DTIC Science & Technology

    2008-09-01

    Excerpt from the 2008 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies. Figure 7 (caption): dozens of detected infrasound signals from... The work investigates alternative detection schemes at the two infrasound arrays based on frequency-wavenumber (fk) processing and the F-statistic, and compares the results of the infrasound signal-detection processing schemes.

  1. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.

    PubMed

    Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia

    2012-11-23

    In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students' attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics -28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes.

  2. Statistical package for improved analysis of hillslope monitoring data collected as part of the Board of Forestry's long-term monitoring program

    Treesearch

    Jack Lewis; Jim Baldwin

    1997-01-01

    The State of California has embarked upon a Long-Term Monitoring Program whose primary goal is to assess the effectiveness of the Forest Practice Rules and Review Process in protecting the beneficial uses of waters from the impacts of timber operations on private timberlands. The Board of Forestry's Monitoring Study Group concluded that hillslope monitoring should...

  3. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring

    PubMed Central

    2012-01-01

    Background In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students’ attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students’ attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students’ achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. Methods A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics −28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Results Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. Conclusions The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students’ attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes. PMID:23173770

  4. Diagnosis of abnormal patterns in multivariate microclimate monitoring: a case study of an open-air archaeological site in Pompeii (Italy).

    PubMed

    Merello, Paloma; García-Diego, Fernando-Juan; Zarzo, Manuel

    2014-08-01

    Chemometrics has been applied successfully since the 1990s for the multivariate statistical control of industrial processes. A new area of interest for these tools is the microclimatic monitoring of cultural heritage. Sensors record climatic parameters over time and statistical data analysis is performed to obtain valuable information for preventive conservation. A case study of an open-air archaeological site is presented here. A set of 26 temperature and relative humidity data-loggers was installed in four rooms of Ariadne's house (Pompeii). If climatic values are recorded versus time at different positions, the resulting data structure is equivalent to records of physical parameters registered at several points of a continuous chemical process. However, there is an important difference in this case: continuous processes are controlled to reach a steady state, whilst open-air sites undergo tremendous fluctuations. Although data from continuous processes are usually column-centred prior to applying principal components analysis, it turned out that another pre-treatment (row-centred data) was more convenient for the interpretation of components and to identify abnormal patterns. The detection of typical trajectories was more straightforward by dividing the whole monitored period into several sub-periods, because the marked climatic fluctuations throughout the year affect the correlation structures. The proposed statistical methodology is of interest for the microclimatic monitoring of cultural heritage, particularly in the case of open-air or semi-confined archaeological sites. Copyright © 2014 Elsevier B.V. All rights reserved.
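
    To illustrate the pre-treatment distinction discussed above (a sketch assuming a time-by-sensor data layout, not the study's actual code), the two centring options differ only in the axis along which the mean is removed before PCA:

```python
import numpy as np

def column_centre(X):
    """Usual pre-treatment for continuous processes: remove each sensor's mean over time."""
    return X - X.mean(axis=0)

def row_centre(X):
    """Pre-treatment preferred in the case study: remove each time point's mean across sensors,
    so principal components describe contrasts between sensor positions rather than the common
    daily and seasonal swings shared by all sensors."""
    return X - X.mean(axis=1, keepdims=True)

# X has shape (n_time_points, n_sensors); PCA is then applied to the centred matrix.
```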

  5. Improvement of statistical methods for detecting anomalies in climate and environmental monitoring systems

    NASA Astrophysics Data System (ADS)

    Yakunin, A. G.; Hussein, H. M.

    2018-01-01

    The article shows how well-known statistical methods, widely used for financial problems and in a number of other fields of science and technology, can be effectively applied, after minor modification, to problems in climate and environmental monitoring systems such as detecting anomalies in the form of abrupt changes in signal level, positive and negative outliers, and violations of the cycle form in periodic processes.

  6. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.

  7. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated employing principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) was employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content measured by a Kjeldahl test on these samples. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.

  8. Does daily nurse staffing match ward workload variability? Three hospitals' experiences.

    PubMed

    Gabbay, Uri; Bukchin, Michael

    2009-01-01

    Nurse shortage and rising healthcare resource burdens mean that appropriate workforce use is imperative. This paper aims to evaluate whether daily nurse staffing meets ward workload needs. Nurse attendance and daily nurses' workload capacity in three hospitals were evaluated. Statistical process control was used to evaluate intra-ward nurse workload capacity and day-to-day variations. Statistical process control is a statistics-based method for process monitoring that uses charts with a predefined target measure and control limits. Standardization was performed for inter-ward analysis by converting ward-specific crude measures to ward-specific relative measures, dividing observed by expected values. Two charts, for acceptable and tolerable daily nurse workload intensity, were defined. Appropriate staffing indicators were defined as those exceeding predefined rates of days within acceptable and tolerable limits (50 percent and 80 percent, respectively). A total of 42 percent of the overall days fell within acceptable control limits and 71 percent within tolerable control limits. Appropriate staffing indicators were met in only 33 percent of wards regarding acceptable nurse workload intensity and in only 45 percent of wards regarding tolerable workloads. The study did not differentiate crude nurse attendance, and it did not take patient severity into account since crude bed occupancy was used. Double statistical process control charts and certain staffing indicators were used, which is open to debate. Wards that met appropriate staffing indicators prove the method's feasibility. Wards that did not meet appropriate staffing indicators prove the importance of and need for process evaluation and monitoring. The methods presented for monitoring daily staffing appropriateness are simple to implement, either for intra-ward day-to-day variation using nurse workload capacity statistical process control charts or for inter-ward evaluation using a standardized measure of nurse workload intensity. The real challenge will be to develop planning systems and implement corrective interventions such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method using available data for daily staffing appropriateness evaluation, which is easy to implement and operate, is presented. The statistical process control method enables intra-ward evaluation, while standardization by converting crude into relative measures enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. This method is not limited to crude measures; rather, it uses weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).
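
    A much-simplified sketch of the standardization step is given below: daily workload capacity is expressed as observed/expected, and the share of days inside an "acceptable" and a wider "tolerable" band around the ward mean is compared with the 50%/80% thresholds. The 2-sigma and 3-sigma band widths and the function name are assumptions for illustration, not the study's actual limits.

```python
import numpy as np

def staffing_indicators(observed_capacity, expected_capacity):
    """Standardize daily nurse workload capacity as observed/expected, then report the share of days
    inside an acceptable (2-sigma) and a tolerable (3-sigma) band around the ward mean."""
    r = np.asarray(observed_capacity, float) / np.asarray(expected_capacity, float)
    cl, s = r.mean(), r.std(ddof=1)
    in_acceptable = float(np.mean(np.abs(r - cl) <= 2 * s))
    in_tolerable = float(np.mean(np.abs(r - cl) <= 3 * s))
    return {"share_acceptable": in_acceptable, "share_tolerable": in_tolerable,
            "appropriate": in_acceptable >= 0.50 and in_tolerable >= 0.80}
```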

  9. Development of a Dmt Monitor for Statistical Tracking of Gravitational-Wave Burst Triggers Generated from the Omega Pipeline

    NASA Astrophysics Data System (ADS)

    Li, Jun-Wei; Cao, Jun-Wei

    2010-04-01

    One challenge in large-scale scientific data analysis is to monitor data in real-time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suit of data monitoring tools (DMT) has been developed, yielding good extensibility to new data type and high flexibility to a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (OW) burst triggers that are generated from a specific OW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information as reference of trigger post-processing and interferometer maintenance.

  10. Algorithm for computing descriptive statistics for very large data sets and the exa-scale era

    NASA Astrophysics Data System (ADS)

    Beekman, Izaak

    2017-11-01

    An algorithm for Single-point, Parallel, Online, Converging Statistics (SPOCS) is presented. It is suited for in situ analysis that traditionally would be relegated to post-processing, and can be used to monitor statistical convergence and estimate the error/residual in the quantity, which is useful for uncertainty quantification as well. Today, data may be generated at an overwhelming rate by numerical simulations and proliferating sensing apparatuses in experiments and engineering applications. Monitoring descriptive statistics in real time lets costly computations and experiments be gracefully aborted if an error has occurred, and monitoring the level of statistical convergence allows them to be run for the shortest amount of time required to obtain good results. This algorithm extends work by Pébay (Sandia Report SAND2008-6212). Pébay's algorithms are recast into a converging delta formulation, with provably favorable properties. The mean, variance, covariances and arbitrary higher order statistical moments are computed in one pass. The algorithm is tested using Sillero, Jiménez, & Moser's (2013, 2014) publicly available UPM high Reynolds number turbulent boundary layer data set, demonstrating numerical robustness, efficiency and other favorable properties.
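
    To show the kind of single-pass update such algorithms are built on, here is a sketch of the classic Welford/Pébay update and pairwise merge for the mean and variance only; it is not the SPOCS formulation itself, which also covers covariances and higher moments and tracks convergence.

```python
class OnlineStats:
    """One-pass (online) mean and variance using Welford/Pebay-style update formulas."""

    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        """Incorporate one new sample without storing the data stream."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)     # uses the updated mean

    def merge(self, other):
        """Combine two partial results (e.g., from different MPI ranks) without revisiting data."""
        n = self.n + other.n
        if n == 0:
            return self
        delta = other.mean - self.mean
        self.m2 += other.m2 + delta * delta * self.n * other.n / n
        self.mean += delta * other.n / n
        self.n = n
        return self

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else float("nan")
```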

  11. Indicator organisms in meat and poultry slaughter operations: their potential use in process control and the role of emerging technologies.

    PubMed

    Saini, Parmesh K; Marks, Harry M; Dreyfuss, Moshe S; Evans, Peter; Cook, L Victor; Dessai, Uday

    2011-08-01

    Measuring commonly occurring, nonpathogenic organisms on poultry products may be used for designing statistical process control systems that could result in reductions of pathogen levels. The extent of pathogen level reduction that could be obtained from actions resulting from monitoring these measurements over time depends upon the degree of understanding cause-effect relationships between processing variables, selected output variables, and pathogens. For such measurements to be effective for controlling or improving processing to some capability level within the statistical process control context, sufficiently frequent measurements would be needed to help identify processing deficiencies. Ultimately the correct balance of sampling and resources is determined by those characteristics of deficient processing that are important to identify. We recommend strategies that emphasize flexibility, depending upon sampling objectives. Coupling the measurement of levels of indicator organisms with practical emerging technologies and suitable on-site platforms that decrease the time between sample collections and interpreting results would enhance monitoring process control.

  12. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine whether the industrial processes study program of the Technological University of Chihuahua, one year after it was certified by CACEI, continues to achieve the established indicators and ISO 9001:2008, implementing quality tools; the essential indicators to monitor are determined, flow charts are…

  13. DATA QUALITY OBJECTIVES AND STATISTICAL DESIGN SUPPORT FOR DEVELOPMENT OF A MONITORING PROTOCOL FOR RECREATIONAL WATERS

    EPA Science Inventory

    The purpose of this report is to describe the outputs of the Data Quality Objectives (DQOs) Process and discussions about developing a statistical design that will be used to implement the research study of recreational beach waters.

  14. Nekton Interaction Monitoring System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-03-15

    The software provides a real-time processing system for sonar to detect and track animals, and to extract water column biomass statistics in order to facilitate continuous monitoring of an underwater environment. The Nekton Interaction Monitoring System (NIMS) extracts and archives tracking and backscatter statistics data from a real-time stream of data from a sonar device. NIMS also sends real-time tracking messages over the network that can be used by other systems to generate other metrics or to trigger instruments such as an optical video camera. A web-based user interface provides remote monitoring and control. NIMS currently supports three popular sonar devices: M3 multi-beam sonar (Kongsberg), EK60 split-beam echo-sounder (Simrad) and BlueView acoustic camera (Teledyne).

  15. Shortcomings of low-cost imaging systems for viewing computed radiographs.

    PubMed

    Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N

    2000-01-01

    To assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (a PC with a common display card and color monitor), and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing improved the perception of low-contrast details significantly, irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. Review at the radiological workstation was superior to review on the PC with image processing. Lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared to the color display; however, the color monitor was more strongly affected by high ambient illumination.

  16. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    PubMed

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address such a problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Different from traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Standardized quality-assessment system to evaluate pressure ulcer care in the nursing home.

    PubMed

    Bates-Jensen, Barbara M; Cadogan, Mary; Jorge, Jennifer; Schnelle, John F

    2003-09-01

    To demonstrate reliability and feasibility of a standardized protocol to assess and score quality indicators relevant to pressure ulcer (PU) care processes in nursing homes (NHs). Descriptive. Eight NHs. One hundred ninety-one NH residents for whom the PU Resident Assessment Protocol of the Minimum Data Set was initiated. Nine quality indicators (two related to screening and prevention of PU, two focused on assessment, and five addressing management) were scored using medical record data, direct human observation, and wireless thigh monitor observation data. Feasibility and reliability of medical record, observation, and thigh monitor protocols were determined. The percentage of participants who passed each of the indicators, indicating care consistent with practice guidelines, ranged from 0% to 98% across all indicators. In general, participants in NHs passed fewer indicators and had more problems with medical record accuracy before a PU was detected (screening/prevention indicators) than they did once an ulcer was documented (assessment and management indicators). Reliability of the medical record protocol showed kappa statistics ranging from 0.689 to 1.00 and percentage agreement from 80% to 100%. Direct observation protocols yielded kappa statistics of 0.979 and 0.928. Thigh monitor protocols showed kappa statistics ranging from 0.609 to 0.842. Training was variable, with the observation protocol requiring 1 to 2 hours, medical records requiring joint review of 20 charts with average time to complete the review of 20 minutes, and the thigh monitor data requiring 1 week for training in data preparation and interpretation. The standardized quality assessment system generated scores for nine PU quality indicators with good reliability and provided explicit scoring rules that permit reproducible conclusions about PU care. The focus of the indicators on care processes that are under the control of NH staff made the protocol useful for external survey and internal quality improvement purposes, and the thigh monitor observational technology provided a method for monitoring repositioning care processes that were otherwise difficult to monitor and manage.
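
    Since the protocol's reliability is reported with kappa statistics and percentage agreement, a small generic sketch of Cohen's kappa for two raters is included here for reference; it is standard textbook material, not code from the study.

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same cases into discrete categories."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    cats = np.unique(np.concatenate([a, b]))
    idx = {c: i for i, c in enumerate(cats)}
    table = np.zeros((len(cats), len(cats)))
    for x, y in zip(a, b):
        table[idx[x], idx[y]] += 1          # confusion matrix: rows = rater A, columns = rater B
    n = table.sum()
    p_obs = np.trace(table) / n             # observed agreement
    p_exp = np.sum(table.sum(axis=1) * table.sum(axis=0)) / n ** 2   # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# e.g. cohens_kappa([1, 1, 0, 1, 0], [1, 0, 0, 1, 0]) -> observed agreement 0.8, kappa ~ 0.62
```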

  18. Information processing requirements for on-board monitoring of automatic landing

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Karmarkar, J. S.

    1977-01-01

    A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.

  19. [Monitoring method for macroporous resin column chromatography process of salvianolic acids based on near infrared spectroscopy].

    PubMed

    Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-07-01

    To study and establish a monitoring method for macroporous resin column chromatography process of salvianolic acids by using near infrared spectroscopy (NIR) as a process analytical technology (PAT).The multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (including one normal operation batch and one abnormal operation batch) were used to verify the monitoring performance of this model. The results showed that MSPC model had a good monitoring ability for the column chromatography process. Meanwhile, NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) by using partial least squares (PLS) algorithm. The verification results demonstrated that this model had satisfactory prediction performance. The combined application of the above two models could effectively achieve real-time monitoring for macroporous resin column chromatography process of salvianolic acids, and can be used to conduct on-line analysis of key quality indexes. This established process monitoring method could provide reference for the development of process analytical technology for traditional Chinese medicines manufacturing. Copyright© by the Chinese Pharmaceutical Association.

  20. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security of intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to bring portal dosimetry measurements under control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation were established; they showed both slow, weak drifts and strong, fast drifts, and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting the ionisation chamber measurements with those performed with the EPID, and showed that statistical process control monitoring of the data provided a security guarantee. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  1. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

    In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology was introduced. The main steps of the technique include: ① in-line collection of the process spectra of the different process steps; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology in chemical and biological medicines were reviewed briefly. Based on a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes with NIRS-based multivariate process trajectories, several important practical problems that need urgent solutions are proposed, and the application prospects of NIRS-based process trajectory technology are fully discussed. Copyright© by the Chinese Pharmaceutical Association.
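
    Step ② above (unfolding) is the part most easily shown in code; a minimal sketch, assuming the in-line spectra are stored as a (batch x time x wavelength) array, is given below together with simple mean ± 3σ trajectory limits for step ③. The array layout, the wavelength-averaged summary and the function names are assumptions.

```python
import numpy as np

def batchwise_unfold(X):
    """Unfold a (batch x time x wavelength) array into (batch x time*wavelength) for batch-level MSPC."""
    B, T, W = X.shape
    return X.reshape(B, T * W)

def trajectory_limits(X_noc, k=3.0):
    """Mean process trajectory and +/- k*sigma limits at each time point, using a
    wavelength-averaged summary of each spectrum, from normal-operating-condition batches."""
    traj = X_noc.mean(axis=2)                      # (batch x time) summary trajectories
    mean, sd = traj.mean(axis=0), traj.std(axis=0, ddof=1)
    return mean, mean - k * sd, mean + k * sd
```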

  2. The impact on post-operative shoulder function of intraoperative nerve monitoring of cranial nerve XI during modified radical neck dissection.

    PubMed

    Lanišnik, Boštjan; Žitnik, Lidija; Levart, Primož; Žargi, Miha; Rodi, Zoran

    2016-12-01

    Intraoperative monitoring of cranial nerve XI (CN XI) may decrease shoulder disability following modified radical neck dissection. A prospective study was designed comparing results of the Constant Shoulder Score (CSS), the Shoulder Pain and Disability Index (SPADI) and the EMG score of the trapezius muscle (mT) before and after surgery. One side of the neck was monitored during surgery with an intraoperative nerve monitor. EMG scores of the mT 6 months postoperatively were statistically better on the monitored side than on the non-monitored side of the neck (p = 0.041), while the differences in CSS and SPADI were not statistically significant. Patients with better EMG scores of the mT at 6 weeks recuperated better and with a smaller decrease of the CSS. Intraoperative monitoring is beneficial at the beginning of the surgeon's learning curve and in the process of familiarization with the anatomical variations of CN XI.

  3. Monitoring of bread cooling by statistical analysis of laser speckle patterns

    NASA Astrophysics Data System (ADS)

    Lyubenova, Tanya; Stoykova, Elena; Nacheva, Elena; Ivanov, Branimir; Panchev, Ivan; Sainov, Ventseslav

    2013-03-01

    The phenomenon of laser speckle can be used for detection and visualization of physical or biological activity in various objects (e.g. fruits, seeds, coatings) through statistical description of speckle dynamics. The paper presents the results of non-destructive monitoring of bread cooling by co-occurrence matrix and temporal structure function analysis of speckle patterns recorded continuously over several days. In total, 72960 and 39680 images were recorded and processed for two similar bread samples, respectively. The experiments confirmed the expected steep decrease of activity related to the processes in the bread samples during the first several hours and revealed its oscillating character over the following days. Characterization of activity over the bread sample surface was also obtained.
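
    A minimal sketch of the temporal structure function used to quantify speckle activity, S(τ) = ⟨(I(t+τ) − I(t))²⟩ averaged over pixels and time; the image stack below is synthetic and only illustrates the computation.

```python
# Sketch: temporal structure function S(tau) = <(I(t+tau) - I(t))^2> of a
# synthetic speckle image stack, averaged over pixels and time.
import numpy as np

rng = np.random.default_rng(6)
frames = rng.random((200, 32, 32))                   # synthetic stack: (time, y, x)

def structure_function(stack, max_lag=20):
    return np.array([
        np.mean((stack[lag:] - stack[:-lag]) ** 2)   # average over time and pixels
        for lag in range(1, max_lag + 1)
    ])

s = structure_function(frames)
print(np.round(s[:5], 4))   # faster-decorrelating speckle gives larger S at small lags
```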

  4. System and method for statistically monitoring and analyzing sensed conditions

    DOEpatents

    Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA

    2011-01-04

    A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.

  5. System and method for statistically monitoring and analyzing sensed conditions

    DOEpatents

    Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA

    2011-01-25

    A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.

  6. System and method for statistically monitoring and analyzing sensed conditions

    DOEpatents

    Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA

    2010-07-13

    A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.

  7. Statistical process control charts for monitoring military injuries.

    PubMed

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
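
    The following sketch shows how Shewhart u-chart limits of the kind described above can be computed, with limits three standard deviations above and below a historical average rate and varying with the exposure of each period; the injury counts and person-years are made up, not Army surveillance data.

```python
# Sketch: Shewhart u-chart for injury rates (injuries per person-year) with
# 3-sigma limits that vary with each period's exposure.
import numpy as np

injuries = np.array([310, 295, 320, 350, 298, 305, 402, 290], dtype=float)
person_years = np.array([250, 240, 255, 260, 245, 250, 252, 248], dtype=float)

u = injuries / person_years                    # observed rate per quarter
u_bar = injuries.sum() / person_years.sum()    # historical centre line

ucl = u_bar + 3 * np.sqrt(u_bar / person_years)
lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / person_years), 0.0)

for q, (rate, lo, hi) in enumerate(zip(u, lcl, ucl), start=1):
    flag = "OUT" if (rate > hi or rate < lo) else "ok"
    print(f"Q{q}: rate = {rate:.2f}, limits = ({lo:.2f}, {hi:.2f})  {flag}")
```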

  8. Cumulative sum control charts for monitoring geometrically inflated Poisson processes: An application to infectious disease counts data.

    PubMed

    Rakitzis, Athanasios C; Castagliola, Philippe; Maravelakis, Petros E

    2018-02-01

    In this work, we study upper-sided cumulative sum control charts that are suitable for monitoring geometrically inflated Poisson processes. We assume that a process is properly described by a two-parameter extension of the zero-inflated Poisson distribution, which can be used for modeling count data with an excessive number of zero and non-zero values. Two different upper-sided cumulative sum-type schemes are considered, both suitable for the detection of increasing shifts in the average of the process. Aspects of their statistical design are discussed and their performance is compared under various out-of-control situations. Changes in both parameters of the process are considered. Finally, the monitoring of the monthly cases of poliomyelitis in the USA is given as an illustrative example.
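
    A minimal sketch of an upper-sided CUSUM for count data of the general kind discussed above: the statistic C_t = max(0, C_{t−1} + X_t − k) signals when it exceeds a decision interval h. The reference value k, decision interval h and the simulated counts are illustrative and are not the chart designs derived in the paper.

```python
# Sketch: upper-sided CUSUM for monthly counts, C_t = max(0, C_{t-1} + X_t - k),
# signalling when C_t > h.  Parameters and counts are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
counts = np.concatenate([rng.poisson(1.0, size=40),    # in-control counts
                         rng.poisson(3.0, size=20)])   # simulated upward shift

k, h = 2.0, 5.0
c = 0.0
for t, x in enumerate(counts, start=1):
    c = max(0.0, c + x - k)
    if c > h:
        print(f"signal at observation {t}, CUSUM = {c:.1f}")
        break
```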

  9. Contribution of large-scale forest inventories to biodiversity assessment and monitoring

    Treesearch

    Piermaria Corona; Gherardo Chirici; Ronald E. McRoberts; Susanne Winter; Anna Barbati

    2011-01-01

    Statistically-designed inventories and biodiversity monitoring programs are gaining relevance for biological conservation and natural resources management. Mandated periodic surveys provide unique opportunities to identify and satisfy natural resources management information needs. However, this is not an end in itself but rather is the beginning of a process that...

  10. Monitoring of Educational Performance Indicators in Higher Education: A Comparison of Perceptions

    ERIC Educational Resources Information Center

    Sencan, Hüner; Karabulut, A. Tugba

    2015-01-01

    The purpose of this study is to explore whether there is a statistically significant difference between the ideas of university administrators and faculty members regarding how strictly Educational Performance Indicators for Educators (EPIE) should be monitored in the educational process. The responses of university directors were compared with…

  11. Nonparametric Bayesian predictive distributions for future order statistics

    Treesearch

    Richard A. Johnson; James W. Evans; David W. Green

    1999-01-01

    We derive the predictive distribution for a specified order statistic, determined from a future random sample, under a Dirichlet process prior. Two variants of the approach are treated and some limiting cases studied. A practical application to monitoring the strength of lumber is discussed including choices of prior expectation and comparisons made to a Bayesian...

  12. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  13. Uncertainty quantification in structural health monitoring: Applications on cultural heritage buildings

    NASA Astrophysics Data System (ADS)

    Lorenzoni, Filippo; Casarin, Filippo; Caldon, Mauro; Islami, Kleidi; Modena, Claudio

    2016-01-01

    In recent decades, the need for effective seismic protection and vulnerability reduction of cultural heritage buildings and sites has led to a growing interest in structural health monitoring (SHM) as a knowledge-based assessment tool to quantify and reduce uncertainties regarding their structural performance. Monitoring can be successfully implemented in some cases as an alternative to interventions or to control the medium- and long-term effectiveness of already applied strengthening solutions. The research group at the University of Padua, in collaboration with public administrations, has recently installed several SHM systems on heritage structures. The paper reports the application of monitoring strategies implemented to avoid (or at least minimize) the execution of strengthening interventions/repairs and to control the response until a clear worsening or damage process is detected. Two emblematic case studies are presented and discussed: the Roman Amphitheatre (Arena) of Verona and the Conegliano Cathedral. Both are excellent examples of on-going monitoring activities, performed through static and dynamic approaches in combination with automated procedures to extract meaningful structural features from collected data. In parallel with the application of innovative monitoring techniques, statistical models and data processing algorithms have been developed and applied in order to reduce uncertainties and exploit monitoring results for an effective assessment and protection of historical constructions. Processing software for SHM was implemented to perform continuous real-time treatment of static data and identification of modal parameters based on the structural response to ambient vibrations. Statistical models were also developed to filter out environmental effects and thermal cycles from the extracted features.

  14. 42 CFR 493.1256 - Standard: Control procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for having control procedures that monitor the accuracy and precision of the complete analytic process..., include two control materials, including one that is capable of detecting errors in the extraction process... control materials having previously determined statistical parameters. (e) For reagent, media, and supply...

  15. DAMT - DISTRIBUTED APPLICATION MONITOR TOOL (HP9000 VERSION)

    NASA Technical Reports Server (NTRS)

    Keith, B.

    1994-01-01

    Typical network monitors measure status of host computers and data traffic among hosts. A monitor to collect statistics about individual processes must be unobtrusive and possess the ability to locate and monitor processes, locate and monitor circuits between processes, and report traffic back to the user through a single application program interface (API). DAMT, Distributed Application Monitor Tool, is a distributed application program that will collect network statistics and make them available to the user. This distributed application has one component (i.e., process) on each host the user wishes to monitor as well as a set of components at a centralized location. DAMT provides the first known implementation of a network monitor at the application layer of abstraction. Potential users only need to know the process names of the distributed application they wish to monitor. The tool locates the processes and the circuit between them, and reports any traffic between them at a user-defined rate. The tool operates without the cooperation of the processes it monitors. Application processes require no changes to be monitored by this tool. Neither does DAMT require the UNIX kernel to be recompiled. The tool obtains process and circuit information by accessing the operating system's existing process database. This database contains all information available about currently executing processes. Expanding the information monitored by the tool can be done by utilizing more information from the process database. Traffic on a circuit between processes is monitored by a low-level LAN analyzer that has access to the raw network data. The tool also provides features such as dynamic event reporting and virtual path routing. A reusable object approach was used in the design of DAMT. The tool has four main components; the Virtual Path Switcher, the Central Monitor Complex, the Remote Monitor, and the LAN Analyzer. All of DAMT's components are independent, asynchronously executing processes. The independent processes communicate with each other via UNIX sockets through a Virtual Path router, or Switcher. The Switcher maintains a routing table showing the host of each component process of the tool, eliminating the need for each process to do so. The Central Monitor Complex provides the single application program interface (API) to the user and coordinates the activities of DAMT. The Central Monitor Complex is itself divided into independent objects that perform its functions. The component objects are the Central Monitor, the Process Locator, the Circuit Locator, and the Traffic Reporter. Each of these objects is an independent, asynchronously executing process. User requests to the tool are interpreted by the Central Monitor. The Process Locator identifies whether a named process is running on a monitored host and which host that is. The circuit between any two processes in the distributed application is identified using the Circuit Locator. The Traffic Reporter handles communication with the LAN Analyzer and accumulates traffic updates until it must send a traffic report to the user. The Remote Monitor process is replicated on each monitored host. It serves the Central Monitor Complex processes with application process information. The Remote Monitor process provides access to operating systems information about currently executing processes. It allows the Process Locator to find processes and the Circuit Locator to identify circuits between processes. 
It also provides lifetime information about currently monitored processes. The LAN Analyzer consists of two processes. Low-level monitoring is handled by the Sniffer. The Sniffer analyzes the raw data on a single, physical LAN. It responds to commands from the Analyzer process, which maintains the interface to the Traffic Reporter and keeps track of which circuits to monitor. DAMT is written in C-language for HP-9000 series computers running HP-UX and Sun 3 and 4 series computers running SunOS. DAMT requires 1Mb of disk space and 4Mb of RAM for execution. This package requires MIT's X Window System, Version 11 Revision 4, with OSF/Motif 1.1. The HP-9000 version (GSC-13589) includes sample HP-9000/375 and HP-9000/730 executables which were compiled under HP-UX, and the Sun version (GSC-13559) includes sample Sun3 and Sun4 executables compiled under SunOS. The standard distribution medium for the HP version of DAMT is a .25 inch HP pre-formatted streaming magnetic tape cartridge in UNIX tar format. It is also available on a 4mm magnetic tape in UNIX tar format. The standard distribution medium for the Sun version of DAMT is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. DAMT was developed in 1992.

  16. Automated Monitoring with a BSP Fault-Detection Test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L.; Herzog, James P.

    2003-01-01

    The figure schematically illustrates a method and procedure for automated monitoring of an asset, as well as a hardware- and-software system that implements the method and procedure. As used here, asset could signify an industrial process, power plant, medical instrument, aircraft, or any of a variety of other systems that generate electronic signals (e.g., sensor outputs). In automated monitoring, the signals are digitized and then processed in order to detect faults and otherwise monitor operational status and integrity of the monitored asset. The major distinguishing feature of the present method is that the fault-detection function is implemented by use of a Bayesian sequential probability (BSP) technique. This technique is superior to other techniques for automated monitoring because it affords sensitivity, not only to disturbances in the mean values, but also to very subtle changes in the statistical characteristics (variance, skewness, and bias) of the monitored signals.
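
    The Bayesian sequential probability technique itself is not reproduced here; as a stand-in, the sketch below shows a classical sequential probability ratio test (SPRT) for a small mean shift, which illustrates the sequential-decision idea on which such fault-detection tests are built. All parameters and data are hypothetical.

```python
# Sketch: a classical sequential probability ratio test (SPRT) for a mean
# shift, used as a generic stand-in for a sequential fault-detection test;
# means, error rates and data are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
signal = np.concatenate([rng.normal(0.0, 1.0, 300),
                         rng.normal(0.5, 1.0, 100)])    # subtle mean shift

mu0, mu1, sigma = 0.0, 0.5, 1.0                         # nominal vs. faulted mean
alpha, beta = 0.01, 0.01                                # target error probabilities
upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))

llr = 0.0
for t, x in enumerate(signal, start=1):
    llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)   # log-likelihood ratio
    if llr >= upper:
        print(f"fault declared at sample {t}")
        break
    if llr <= lower:
        llr = 0.0            # accept "no fault" so far and restart monitoring
```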

  17. Monitoring the healing process of rat bones using Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Gamulin, O.; Serec, K.; Bilić, V.; Balarin, M.; Kosović, M.; Drmić, D.; Brčić, L.; Seiwerth, S.; Sikirić, P.

    2013-07-01

    The healing effect of BPC 157 on rat femoral head osteonecrosis was monitored by Raman spectroscopy. Three groups of rats were defined: an injured group treated with BPC 157 (10 μg/kg/daily ip), an injured control group (treated with saline, 5 ml/kg/daily ip), and an uninjured healthy group. The spectra were recorded and the healing effect assessed on samples harvested from animals which were sacrificed 3 and 6 weeks after being injured. The statistical analysis of the recorded spectra showed statistical differences between the BPC 157-treated, control, and healthy groups of animals. In particular, after 6 weeks the spectral resemblance between the healthy and BPC 157 samples indicated a positive BPC 157 influence on the healing process of rat femoral head.

  18. Statistical analysis of RHIC beam position monitors performance

    NASA Astrophysics Data System (ADS)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.

  19. Observing Ocean Ecosystems with Sonar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Maxwell, Adam R.; Ham, Kenneth D.

    2016-12-01

    We present a real-time processing system for sonar to detect and track animals, and to extract water column biomass statistics in order to facilitate continuous monitoring of an underwater environment. The Nekton Interaction Monitoring System (NIMS) is built to connect to an instrumentation network, where it consumes a real-time stream of sonar data and archives tracking and biomass data.

  20. LOWLY CHLORINATED DIBENZODIOXINS AS TEQ INDICATORS. A COMBINED APPROACH USING SPECTROSCOPIC MEASUREMENTS WITH DLR JET-REMPI AND STATISTICAL CORRELATIONS WITH WASTE COMBUSTOR EMISSIONS

    EPA Science Inventory

    Continuous monitoring of trace gas species in incineration processes can serve two purposes: (i) monitoring precursors of polychlorinated dibenzodioxin and polychlorinated dibenzofuran (PCDD/F) or other indicator species in the raw gas will enable use of their on-line signals for...

  1. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
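
    As an illustration of the entropy calculation described above, the sketch below computes the Shannon entropy of a synthetic monitoring signal over sliding windows using a histogram estimate of the amplitude distribution; the window length, bin count and signal are arbitrary choices, not those used for the Volcán de Colima or Soufrière Hills data.

```python
# Sketch: Shannon entropy of a monitoring signal over sliding windows, using a
# histogram estimate of the amplitude distribution.  The synthetic signal
# stands in for, e.g., real-time seismic amplitude data.
import numpy as np

def shannon_entropy(window, bins=16):
    counts, _ = np.histogram(window, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
quiet = rng.normal(0.0, 0.5, 2000)                                  # low-activity period
active = rng.normal(0.0, 0.5, 2000) + rng.exponential(2.0, 2000)    # more disordered period
signal = np.concatenate([quiet, active])

win = 500
entropy = [shannon_entropy(signal[i:i + win]) for i in range(0, len(signal) - win + 1, win)]
print([round(h, 2) for h in entropy])    # entropy rises in the more disordered period
```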

  2. ASCS online fault detection and isolation based on an improved MPCA

    NASA Astrophysics Data System (ADS)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

    Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of the subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage required for the subspace information. The MPCA model and the knowledge base are built on the new subspace. Fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling T 2 statistic are then realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation of the subspace based on the T 2 statistic, the relationship between the statistic indicator and the state variables is constructed, and constraint conditions are presented to check the validity of the fault isolation. To improve the robustness of fault isolation against unexpected disturbances, a statistical method is then adopted to relate single and multiple subspaces and increase the rate of correct fault isolation. Finally, fault detection and isolation based on the improved MPCA are used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method to reduce the required storage capacity and to demonstrate the robustness of the principal component model, and establishes the relationship between the state variables and the fault detection indicators for fault isolation.
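
    A minimal sketch of the two monitoring statistics named above, computed from an ordinary PCA model on simulated data: Hotelling T2 in the score space, the squared prediction error (SPE) in the residual space, and a per-variable residual contribution for fault isolation. The kernel-density subspace construction of the paper is not reproduced.

```python
# Sketch: Hotelling T^2 and SPE (Q) statistics from a plain PCA model, with
# per-variable residual contributions for fault isolation.  Data are simulated.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
X_train = rng.normal(size=(200, 8))                     # normal operating data
mu, sd = X_train.mean(axis=0), X_train.std(axis=0) + 1e-12
pca = PCA(n_components=3).fit((X_train - mu) / sd)

def monitor(x):
    z = (x - mu) / sd
    scores = pca.transform(z[None, :])[0]
    t2 = float(np.sum(scores ** 2 / pca.explained_variance_))    # Hotelling T^2
    residual = z - pca.inverse_transform(scores[None, :])[0]
    spe = float(residual @ residual)                             # squared prediction error
    return t2, spe, residual ** 2                                # residual contributions

x_fault = rng.normal(size=8)
x_fault[5] += 4.0                                                # fault injected in variable 5
t2, spe, contrib = monitor(x_fault)
print(f"T2 = {t2:.2f}, SPE = {spe:.2f}, most suspect variable: {contrib.argmax()}")
```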

  3. Phase-I monitoring of standard deviations in multistage linear profiles

    NASA Astrophysics Data System (ADS)

    Kalaei, Mahdiyeh; Soleimani, Paria; Niaki, Seyed Taghi Akhavan; Atashgar, Karim

    2018-03-01

    In most modern manufacturing systems, products are often the output of some multistage processes. In these processes, the stages are dependent on each other, where the output quality of each stage depends also on the output quality of the previous stages. This property is called the cascade property. Although there are many studies in multistage process monitoring, there are fewer works on profile monitoring in multistage processes, especially on the variability monitoring of a multistage profile in Phase-I for which no research is found in the literature. In this paper, a new methodology is proposed to monitor the standard deviation involved in a simple linear profile designed in Phase I to monitor multistage processes with the cascade property. To this aim, an autoregressive correlation model between the stages is considered first. Then, the effect of the cascade property on the performances of three types of T 2 control charts in Phase I with shifts in standard deviation is investigated. As we show that this effect is significant, a U statistic is next used to remove the cascade effect, based on which the investigated control charts are modified. Simulation studies reveal good performances of the modified control charts.

  4. A Simple Approach for Monitoring Business Service Time Variation

    PubMed Central

    2014-01-01

    Control charts are effective tools for signal detection in both manufacturing processes and service processes. Much of the data in service industries comes from processes having nonnormal or unknown distributions. The commonly used Shewhart variable control charts, which depend heavily on the normality assumption, are not appropriately used here. In this paper, we propose a new asymmetric EWMA variance chart (EWMA-AV chart) and an asymmetric EWMA mean chart (EWMA-AM chart) based on two simple statistics to monitor process variance and mean shifts simultaneously. Further, we explore the sampling properties of the new monitoring statistics and calculate the average run lengths when using both the EWMA-AV chart and the EWMA-AM chart. The performance of the EWMA-AV and EWMA-AM charts and that of some existing variance and mean charts are compared. A numerical example involving nonnormal service times from the service system of a bank branch in Taiwan is used to illustrate the applications of the EWMA-AV and EWMA-AM charts and to compare them with the existing variance (or standard deviation) and mean charts. The proposed EWMA-AV chart and EWMA-AM charts show superior detection performance compared to the existing variance and mean charts. The EWMA-AV chart and EWMA-AM chart are thus recommended. PMID:24895647

  5. A simple approach for monitoring business service time variation.

    PubMed

    Yang, Su-Fen; Arnold, Barry C

    2014-01-01

    Control charts are effective tools for signal detection in both manufacturing processes and service processes. Much of the data in service industries comes from processes having nonnormal or unknown distributions. The commonly used Shewhart variable control charts, which depend heavily on the normality assumption, are not appropriately used here. In this paper, we propose a new asymmetric EWMA variance chart (EWMA-AV chart) and an asymmetric EWMA mean chart (EWMA-AM chart) based on two simple statistics to monitor process variance and mean shifts simultaneously. Further, we explore the sampling properties of the new monitoring statistics and calculate the average run lengths when using both the EWMA-AV chart and the EWMA-AM chart. The performance of the EWMA-AV and EWMA-AM charts and that of some existing variance and mean charts are compared. A numerical example involving nonnormal service times from the service system of a bank branch in Taiwan is used to illustrate the applications of the EWMA-AV and EWMA-AM charts and to compare them with the existing variance (or standard deviation) and mean charts. The proposed EWMA-AV chart and EWMA-AM charts show superior detection performance compared to the existing variance and mean charts. The EWMA-AV chart and EWMA-AM chart are thus recommended.
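
    For readers unfamiliar with EWMA charts, the sketch below implements the textbook EWMA recursion and its exact time-varying limits for a simulated service-time mean shift; it is a generic EWMA mean chart, not the EWMA-AM or EWMA-AV statistics proposed in the paper.

```python
# Sketch: a textbook EWMA mean chart with exact time-varying limits, applied
# to simulated (non-normal) service times with a late mean shift.
import numpy as np

rng = np.random.default_rng(5)
service_times = np.concatenate([rng.exponential(2.0, 150),
                                rng.exponential(2.8, 50)])   # mean shift after t = 150

lam = 0.2
mu0 = service_times[:100].mean()                # in-control estimates from early data
sigma0 = service_times[:100].std(ddof=1)

z = mu0
for t, x in enumerate(service_times, start=1):
    z = lam * x + (1 - lam) * z
    width = 3 * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    if abs(z - mu0) > width:
        print(f"EWMA signal at observation {t}: z = {z:.2f}")
        break
```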

  6. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIGVISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  7. A statistical power analysis of woody carbon flux from forest inventory data

    Treesearch

    James A. Westfall; Christopher W. Woodall; Mark A. Hatfield

    2013-01-01

    At a national scale, the carbon (C) balance of numerous forest ecosystem C pools can be monitored using a stock change approach based on national forest inventory data. Given the potential influence of disturbance events and/or climate change processes, the statistical detection of changes in forest C stocks is paramount to maintaining the net sequestration status of...

  8. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process set up correctly. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC systems (Advanced Process Control) are being developed in the waferfab to automatically adjust and tune wafer processing, based on extensive process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This led us to put the spatial variability under control in real time with our SPC system (Statistical Process Control). This paper outlines the architecture of an integrated process control system for shape monitoring in 3D, implemented in the waferfab.

  9. Structural health monitoring feature design by genetic programming

    NASA Astrophysics Data System (ADS)

    Harvey, Dustin Y.; Todd, Michael D.

    2014-09-01

    Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.

  10. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, based on real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. The methodology was tested on a set of five batches in which disturbances of different nature were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remaining ones were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in terms of distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T 2 and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Citizen surveillance for environmental monitoring: combining the efforts of citizen science and crowdsourcing in a quantitative data framework.

    PubMed

    Welvaert, Marijke; Caley, Peter

    2016-01-01

    Citizen science and crowdsourcing have been emerging as methods to collect data for surveillance and/or monitoring activities. They could be gathered under the overarching term citizen surveillance. The discipline, however, still struggles to be widely accepted in the scientific community, mainly because these activities are not embedded in a quantitative framework. This results in an ongoing discussion on how to analyze and make useful inference from these data. When considering the data collection process, we illustrate how citizen surveillance can be classified according to the nature of the underlying observation process, measured in two dimensions: the degree of observer reporting intention and the control in observer detection effort. By classifying the observation process in these dimensions we distinguish between crowdsourcing, unstructured citizen science and structured citizen science. This classification helps determine the data processing and statistical treatment of these data for making inference. Using our framework, it is apparent that published studies are overwhelmingly associated with structured citizen science, and there are well-developed statistical methods for the resulting data. In contrast, methods for making useful inference from purely crowd-sourced data remain under development, with the challenges of accounting for the unknown observation process considerable. Our quantitative framework for citizen surveillance calls for an integration of citizen science and crowdsourcing and provides a way forward to solve the statistical challenges inherent to citizen-sourced data.

  12. INDUCTIVE SYSTEM HEALTH MONITORING WITH STATISTICAL METRICS

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    2005-01-01

    Model-based reasoning is a powerful method for performing system monitoring and diagnosis. Building models for model-based reasoning is often a difficult and time consuming process. The Inductive Monitoring System (IMS) software was developed to provide a technique to automatically produce health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or which require computer models that are too complex to use for real time monitoring. IMS processes nominal data sets collected either directly from the system or from simulations to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. In particular, a clustering algorithm forms groups of nominal values for sets of related parameters. This establishes constraints on those parameter values that should hold during nominal operation. During monitoring, IMS provides a statistically weighted measure of the deviation of current system behavior from the established normal baseline. If the deviation increases beyond the expected level, an anomaly is suspected, prompting further investigation by an operator or automated system. IMS has shown potential to be an effective, low cost technique to produce system monitoring capability for a variety of applications. We describe the training and system health monitoring techniques of IMS. We also present the application of IMS to a data set from the Space Shuttle Columbia STS-107 flight. IMS was able to detect an anomaly in the launch telemetry shortly after a foam impact damaged Columbia's thermal protection system.

  13. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
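
    A common way to turn successive NIR spectra into a homogeneity statistic is a moving-block standard deviation; the sketch below illustrates that idea on simulated spectra with an illustrative threshold, and is not the specific SCADA routine described in the paper.

```python
# Sketch: moving-block standard deviation of successive NIR spectra as a blend
# homogeneity statistic; the blend is judged homogeneous once the block-to-block
# variability falls below a threshold.  Spectra and threshold are simulated.
import numpy as np

rng = np.random.default_rng(8)
n_spectra, n_wl = 60, 200
mixing = np.linspace(1.0, 0.02, n_spectra)[:, None]            # variability shrinks as blending proceeds
spectra = 0.5 + mixing * rng.normal(size=(n_spectra, n_wl))    # simulated blend spectra

block = 5
msd = np.array([
    spectra[i:i + block].std(axis=0, ddof=1).mean()            # mean SD across wavelengths
    for i in range(n_spectra - block + 1)
])

threshold = 0.05                                               # illustrative homogeneity limit
first_ok = int(np.argmax(msd < threshold))
print(f"blend judged homogeneous from spectrum {first_ok + block - 1}")
```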

  14. Information Assurance Technology Analysis Center Information Assurance Tools Report Intrusion Detection

    DTIC Science & Technology

    1998-01-01

    Intrusion detection tools monitor statistics such as central processing unit (CPU) usage, disk input/output (I/O), memory usage, user activity, and number of attempted logins. The report surveys available tools in a tabular listing; recoverable entries include EMERALD, a commercial anomaly-detection and system-monitoring tool from SRI (porras@csl.sri.com, www.csl.sri.com/emerald/index.html), and Gabriel, a commercial system-monitoring tool. One listed tool, once its sensors are installed, starts to protect the network with minimal configuration and maximum intelligence.

  15. An innovative statistical approach for analysing non-continuous variables in environmental monitoring: assessing temporal trends of TBT pollution.

    PubMed

    Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos

    2011-03-01

    The current work presents an innovative statistical approach to modelling ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to quantify the size of the difference between two particular values. The ordinal variable examined in this study is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values correspond to a process of imposex development that can be considered continuous in both the biological and statistical senses and can be described by a latent, non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever there is a need to model variables that can only be assessed on an ordinal scale.
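
    The following sketch illustrates how a fitted ordered logit model yields P(VDS ≥ 2): the latent imposex variable is a linear function of covariates, and category probabilities follow from logistic CDFs evaluated at the cut-points. All coefficients, cut-points and covariate values are hypothetical, not the estimates of the Portuguese survey.

```python
# Sketch: category probabilities from an ordered logit model with hypothetical
# cut-points and coefficients; P(VDS >= 2) = 1 - F(c_1 - eta), where F is the
# logistic CDF and eta the latent imposex level.
import numpy as np

def logistic_cdf(x):
    return 1.0 / (1.0 + np.exp(-x))

cutpoints = np.array([-1.0, 0.5, 2.0, 3.5])   # thresholds between VDS categories 0..4 (hypothetical)
beta_size, beta_year = 0.08, -0.30            # hypothetical covariate effects

def p_vds_at_least(k, shell_size_mm, years_since_2003):
    eta = beta_size * shell_size_mm + beta_year * years_since_2003
    return 1.0 - logistic_cdf(cutpoints[k - 1] - eta)   # P(VDS >= k)

for year in range(6):
    p = p_vds_at_least(2, shell_size_mm=25.0, years_since_2003=year)
    print(2003 + year, round(p, 3))           # estimated risk of VDS >= 2 per year
```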

  16. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  17. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user- friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  18. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    PubMed

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.

  19. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions

    PubMed Central

    Hahn, Beth A.; Dreitz, Victoria J.; George, T. Luke

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer’s sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer’s sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales. PMID:29065128

  20. Control chart pattern recognition using RBF neural network with new training algorithm and practical features.

    PubMed

    Addeh, Abdoljalil; Khormali, Aminollah; Golilarz, Noorbakhsh Amiri

    2018-05-04

    Control chart patterns are the most commonly used statistical process control (SPC) tools to monitor process changes. When a control chart produces an out-of-control signal, this means that the process has changed. In this study, a new method based on an optimized radial basis function neural network (RBFNN) is proposed for control chart pattern (CCP) recognition. The proposed method consists of four main modules: feature extraction, feature selection, classification and learning algorithm. In the feature extraction module, shape and statistical features are used; various shape and statistical features have recently been presented for CCP recognition. In the feature selection module, the association rules (AR) method is employed to select the best set of shape and statistical features. In the classifier module, an RBFNN is used; because the learning algorithm has a high impact on network performance, a new learning algorithm based on the bees algorithm is used in the learning module. Most studies have considered only six patterns: Normal, Cyclic, Increasing Trend, Decreasing Trend, Upward Shift and Downward Shift. Since the three patterns Normal, Stratification and Systematic are very similar to each other and distinguishing them is very difficult, most studies have not considered Stratification and Systematic. To support continuous monitoring and control of the production process and exact identification of the type of problem encountered during production, eight patterns are investigated in this study. The proposed method is tested on a dataset containing 1600 samples (200 samples from each pattern) and the results showed that the proposed method has a very good performance. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
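
    As an illustration of the kinds of shape and statistical features used for CCP recognition, the sketch below extracts a few simple features (mean, standard deviation, least-squares slope, mean crossings) from simulated normal and increasing-trend windows; the exact feature set, association-rule selection and RBF network of the paper are not reproduced.

```python
# Sketch: simple statistical and shape features extracted from a control chart
# window prior to pattern classification.  The feature set is illustrative.
import numpy as np

def ccp_features(window):
    t = np.arange(len(window))
    slope, _ = np.polyfit(t, window, 1)                # least-squares trend (shape feature)
    mean, std = window.mean(), window.std(ddof=1)      # statistical features
    crossings = int(np.sum(np.diff(np.sign(window - mean)) != 0))
    return {"mean": mean, "std": std, "slope": slope, "mean_crossings": crossings}

rng = np.random.default_rng(9)
normal_pattern = rng.normal(0.0, 1.0, 60)
trend_pattern = rng.normal(0.0, 1.0, 60) + 0.08 * np.arange(60)   # increasing trend

print(ccp_features(normal_pattern))
print(ccp_features(trend_pattern))    # larger slope and shifted mean for the trend pattern
```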

  1. Data monitoring committees: Promoting best practices to address emerging challenges.

    PubMed

    Fleming, Thomas R; DeMets, David L; Roe, Matthew T; Wittes, Janet; Calis, Karim A; Vora, Amit N; Meisel, Alan; Bain, Raymond P; Konstam, Marvin A; Pencina, Michael J; Gordon, David J; Mahaffey, Kenneth W; Hennekens, Charles H; Neaton, James D; Pearson, Gail D; Andersson, Tomas Lg; Pfeffer, Marc A; Ellenberg, Susan S

    2017-04-01

    Data monitoring committees are responsible for safeguarding the interests of study participants and assuring the integrity and credibility of clinical trials. The independence of data monitoring committees from sponsors and investigators is essential in achieving this mission. Creative approaches are needed to address ongoing and emerging challenges that potentially threaten data monitoring committees' independence and effectiveness. An expert panel of representatives from academia, industry and government sponsors, and regulatory agencies discussed these challenges and proposed best practices and operating principles for effective functioning of contemporary data monitoring committees. Prospective data monitoring committee members need better training. Options could include didactic instruction as well as apprenticeships to provide real-world experience. Data monitoring committee members should be protected against legal liability arising from their service. While avoiding breaches in confidentiality of interim data remains a high priority, data monitoring committees should have access to unblinded efficacy and safety data throughout the trial to enable informed judgments about risks and benefits. Because overly rigid procedures can compromise their independence, data monitoring committees should have the flexibility necessary to best fulfill their responsibilities. Data monitoring committee charters should articulate principles that guide the data monitoring committee process rather than list a rigid set of requirements. Data monitoring committees should develop their recommendations by consensus rather than through voting processes. The format for the meetings of the data monitoring committee should maintain the committee's independence and clearly establish the leadership of the data monitoring committee chair. The independent statistical group at the Statistical Data Analysis Center should have sufficient depth of knowledge about the study at hand and experience with trials in general to ensure that the data monitoring committee has access to timely, reliable, and readily interpretable insights about emerging evidence in the clinical trial. Contracts engaging data monitoring committee members for industry-sponsored trials should have language customized to the unique responsibilities of data monitoring committee members rather than use language appropriate to consultants for product development. Regulatory scientists would benefit from experiencing data monitoring committee service that does not conflict with their regulatory responsibilities.

  2. 1H NMR-based metabolic profiling for evaluating poppy seed rancidity and brewing.

    PubMed

    Jawień, Ewa; Ząbek, Adam; Deja, Stanisław; Łukaszewicz, Marcin; Młynarz, Piotr

    2015-12-01

    Poppy seeds are widely used in household and commercial confectionery. The aim of this study was to demonstrate the application of metabolic profiling for industrial monitoring of the molecular changes which occur during rancidity of minced poppy seeds and brewing of raw seeds. Both forms of poppy seeds were obtained from a confectionery company. Proton nuclear magnetic resonance (1H NMR) was applied as the analytical method of choice, together with multivariate statistical data analysis. Metabolic fingerprinting was applied as a bioprocess control tool to follow the trajectory of rancidity and the progress of brewing. Low molecular weight compounds were found to be statistically significant biomarkers of these bioprocesses. Changes in the concentrations of chemical compounds were explained relative to the biochemical processes and external conditions. The results provide valuable and comprehensive information for a better understanding of the biology of rancidity and brewing, while demonstrating the potential of NMR spectroscopy combined with multivariate data analysis for quality control in food industries involved in the processing of oilseeds.

  3. Persistent Surveillance of Transient Events with Unknown Statistics

    DTIC Science & Technology

    2016-12-18

    different bird species by a documentary maker is shown in Fig. 1. Additional examples of scenarios following this setting include robots patrolling the...persistent monitoring application in which a documentary maker would like to monitor three different species of birds appearing in three discrete, species...specific locations. Bird sightings at each location follow a stochastic process with a rate that is initially unknown to the documentary maker and must

  4. Square2 - A Web Application for Data Monitoring in Epidemiological and Clinical Studies

    PubMed

    Schmidt, Carsten Oliver; Krabbe, Christine; Schössow, Janka; Albers, Martin; Radke, Dörte; Henke, Jörg

    2017-01-01

    Valid scientific inferences from epidemiological and clinical studies require high data quality. Data generating departments therefore aim to detect data irregularities as early as possible in order to guide quality management processes. In addition, after the completion of data collections the obtained data quality must be evaluated. This can be challenging in complex studies due to a wide scope of examinations, numerous study variables, multiple examiners, devices, and examination centers. This paper describes a Java EE web application used to monitor and evaluate data quality in institutions with complex and multiple studies, named Square2. It uses the Java libraries Apache MyFaces 2, extended by BootsFaces for layout and style. RServe and REngine manage calls to R server processes. All study data and metadata are stored in PostgreSQL. R is the statistics backend and LaTeX is used for the generation of print-ready PDF reports. A GUI manages the entire workflow. Square2 covers all steps in the data monitoring workflow, including the setup of studies and their structure, the handling of metadata for data monitoring purposes, selection of variables, upload of data, statistical analyses, and the generation as well as inspection of quality reports. To take into account data protection issues, Square2 comprises an extensive user rights and roles concept.

  5. Multivariate statistical process control (MSPC) using Raman spectroscopy for in-line culture cell monitoring considering time-varying batches synchronized with correlation optimized warping (COW).

    PubMed

    Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic

    2017-02-01

    Multivariate statistical process control (MSPC) is increasingly popular as a way to handle the large multivariate data sets produced by analytical instruments such as Raman spectroscopy for the monitoring of complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized data sets, resulting in time-varying batches. Unsynchronized data sets are common in cell culture monitoring because spectroscopic measurements are generally recorded alternately, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for the application of multivariate analysis such as multi-way principal component analysis (MPCA) for MSPC monitoring. Correlation optimized warping (COW) is a popular alignment method with satisfactory performance, but it had not previously been applied to synchronize the acquisition times of spectroscopic data sets in an MSPC application. In this paper we propose, for the first time, to use COW to synchronize batches of varying duration analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operating condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T² and Q statistics. As illustrated by the results, the MSPC model was able to identify abnormal operating conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We showed that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition with higher efficacy than traditional diagnosis, which would save time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.
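
    A minimal sketch of the monitoring step under stated assumptions: a PCA model is fitted to normal operating condition data, and new observations are checked against Hotelling's T² and Q (squared prediction error) limits. The synthetic data, the number of components and the simple Q limit below are illustrative; the study itself builds multi-way PCA models on COW-synchronized Raman batch trajectories.

    ```python
    # Hedged sketch: PCA-based monitoring with Hotelling's T^2 and Q (SPE) charts.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    X_noc = rng.normal(size=(50, 20))          # 50 normal-operation observations, 20 variables
    mu, sd = X_noc.mean(0), X_noc.std(0, ddof=1)
    Xc = (X_noc - mu) / sd                     # autoscale using NOC statistics

    # PCA via SVD; keep k components.
    k = 3
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T                               # loadings
    lam = (S[:k] ** 2) / (Xc.shape[0] - 1)     # variances of retained components

    def t2_q(x):
        xs = (x - mu) / sd
        t = xs @ P                             # scores
        t2 = np.sum(t ** 2 / lam)
        resid = xs - t @ P.T
        return t2, resid @ resid               # Hotelling T^2 and Q (squared prediction error)

    # Approximate control limits (alpha = 0.01).
    n, a = X_noc.shape[0], k
    T2_lim = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.99, a, n - a)
    Q_noc = np.array([t2_q(x)[1] for x in X_noc])
    Q_lim = Q_noc.mean() + 3 * Q_noc.std(ddof=1)   # simple 3-sigma limit as a placeholder

    x_new = rng.normal(size=20) + np.r_[np.zeros(15), 3 * np.ones(5)]  # deviating observation
    t2, q = t2_q(x_new)
    print(f"T2={t2:.1f} (limit {T2_lim:.1f}), Q={q:.1f} (limit {Q_lim:.1f})")
    ```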

  6. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    NASA Astrophysics Data System (ADS)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-04-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier to industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defect avoidance is fundamental. Because of this, there is a need to develop novel in situ monitoring tools able to keep the stability of the process under control on a layer-by-layer basis and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, also known as process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.

  7. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into variable-specific meanings and directions of interaction provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions, as sketched below.
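
    A minimal sketch of the two building blocks named above, under stated assumptions: a generalized norm computed from a measurement window, followed by a simple piecewise-linear scaling to the range [-2, 2] using reference points taken from training data. The norm orders, reference-point choices and data are illustrative, not the exact methodology of the paper.

    ```python
    # Hedged sketch: generalized norms as stress features, scaled nonlinearly to [-2, 2].
    import numpy as np

    rng = np.random.default_rng(2)

    def generalized_norm(x, p):
        """(mean(|x_i|^p))^(1/p): p=2 gives the rms value, large p approaches the peak value."""
        return np.mean(np.abs(x) ** p) ** (1.0 / p)

    def scale(value, lo, center, hi):
        """Piecewise-linear map of [lo, hi] to [-2, 2], with `center` mapped to 0."""
        if value <= center:
            return -2.0 + 2.0 * (value - lo) / (center - lo)
        return 2.0 * (value - center) / (hi - center)

    # Reference points estimated from normal-operation measurement windows.
    train = rng.normal(0, 1, (50, 1024))
    ref = np.array([generalized_norm(w, 4) for w in train])
    lo, center, hi = ref.min(), np.median(ref), ref.max()

    # A new window with a higher stress level yields a scaled index above the normal range.
    new_window = rng.normal(0, 1.5, 1024)
    index = scale(generalized_norm(new_window, 4), lo, center, hi)
    print(f"scaled stress index: {index:.2f}")
    ```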

  8. Using Knowledge Base for Event-Driven Scheduling of Web Monitoring Systems

    NASA Astrophysics Data System (ADS)

    Kim, Yang Sok; Kang, Sung Won; Kang, Byeong Ho; Compton, Paul

    Web monitoring systems report any changes to their target web pages by revisiting them frequently. As they operate under significant resource constraints, it is essential to minimize revisits while ensuring minimal delay and maximum coverage. Various statistical scheduling methods have been proposed to resolve this problem; however, they are static and cannot easily cope with events in the real world. This paper proposes a new scheduling method that manages unpredictable events. An MCRDR (Multiple Classification Ripple-Down Rules) document classification knowledge base was reused to detect events and to initiate a prompt web monitoring process independent of a static monitoring schedule. Our experiment demonstrates that the approach improves monitoring efficiency significantly.

  9. Batch statistical process control of a fluid bed granulation process using in-line spatial filter velocimetry and product temperature measurements.

    PubMed

    Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T

    2011-04-18

    Fluid bed granulation is a batch process characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration, using a partial least squares (PLS) approach. Via the continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts displaying the average batch trajectory and upper and lower control limits were developed. Next, these control charts were used to monitor 4 new test batches in real time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts and by computation of contribution plots for deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models were developed relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to the density and the flowability, respectively, as Y-matrices. The scores of the 4 test batches were used to examine the predictive ability of the model. Copyright © 2011 Elsevier B.V. All rights reserved.
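
    A minimal sketch of the batch-monitoring idea under stated assumptions: process measurements from reference batches are regressed against batch time with PLS, and the score trajectory of a new batch is compared with the average trajectory plus or minus three standard deviations. The simulated variables and limits below are illustrative, not the granulation data or model settings of the study.

    ```python
    # Hedged sketch: batch SPC with PLS score control charts.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(3)
    n_batches, n_time, n_vars = 10, 100, 3      # e.g. PSD fractions plus product temperature

    def simulate_batch(shift=0.0):
        t = np.linspace(0, 1, n_time)[:, None]
        base = np.hstack([20 + 80 * t, 50 - 20 * t, 30 + 10 * np.sqrt(t)])
        return base + shift + rng.normal(0, 1, (n_time, n_vars))

    ref = np.stack([simulate_batch() for _ in range(n_batches)])    # (batch, time, variable)

    # Fit PLS of process variables (X) against batch time (Y) on unfolded reference data.
    X = ref.reshape(-1, n_vars)
    Y = np.tile(np.arange(n_time), n_batches).astype(float)[:, None]
    pls = PLSRegression(n_components=2).fit(X, Y)

    # Score control chart: mean trajectory and 3-sigma limits per time point.
    scores = pls.transform(X)[:, 0].reshape(n_batches, n_time)
    mean_traj, sd_traj = scores.mean(0), scores.std(0, ddof=1)

    new_batch = simulate_batch(shift=4.0)                           # deviating test batch
    new_scores = pls.transform(new_batch)[:, 0]
    out = np.abs(new_scores - mean_traj) > 3 * sd_traj
    print("first out-of-control time index:", np.argmax(out) if out.any() else None)
    ```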

  10. Statistical loads data for Boeing 737-400 aircraft in commercial operations

    DOT National Transportation Integrated Search

    1998-08-01

    The primary objective of this research is to support the FAA Airborne Data Monitoring Systems Research Program by developing new and improved methods and criteria for processing and presenting large commercial transport airplane flight and ground loa...

  11. Statistical loads data for BE-1900D aircraft in commuter operations

    DOT National Transportation Integrated Search

    2000-04-01

    The primary objective of this research is to support the FAA Airborne Data Monitoring Systems Research Program by developing new and improved methods and criteria for processing and presenting commuter airplane flight and ground loads usage data. The...

  12. Development of ecological indicator guilds for land management

    USGS Publications Warehouse

    Krzysik, A.J.; Balbach, H.E.; Duda, J.J.; Emlen, J.M.; Freeman, D.C.; Graham, J.H.; Kovacic, D.A.; Smith, L.M.; Zak, J.C.

    2005-01-01

    Agency land-use must be efficiently and cost-effectively monitored to assess conditions and trends in ecosystem processes and natural resources relevant to mission requirements and legal mandates. Ecological Indicators represent important land management tools for tracking ecological changes and preventing irreversible environmental damage in disturbed landscapes. The overall objective of the research was to develop both individual and integrated sets (i.e., statistically derived guilds) of Ecological Indicators to: quantify habitat conditions and trends, track and monitor ecological changes, provide early warning or threshold detection, and provide guidance for land managers. The derivation of Ecological Indicators was based on statistical criteria, ecosystem relevance, reliability and robustness, economy and ease of use for land managers, multi-scale performance, and stress response criteria. The basis for the development of statistically based Ecological Indicators was the identification of ecosystem metrics that analytically tracked a landscape disturbance gradient.

  13. M-TraCE: a new tool for high-resolution computation and statistical elaboration of backward trajectories on the Italian domain

    NASA Astrophysics Data System (ADS)

    Vitali, Lina; Righini, Gaia; Piersanti, Antonio; Cremona, Giuseppe; Pace, Giandomenico; Ciancarella, Luisella

    2017-12-01

    Air backward trajectory calculations are commonly used in a variety of atmospheric analyses, in particular for source attribution evaluation. The accuracy of backward trajectory analysis is mainly determined by the quality and the spatial and temporal resolution of the underlying meteorological data set, especially in cases of complex terrain. This work describes a new tool for the calculation and the statistical elaboration of backward trajectories. To take advantage of the high-resolution meteorological database of the Italian national air quality model MINNI, a dedicated set of procedures was implemented under the name of M-TraCE (MINNI module for Trajectories Calculation and statistical Elaboration) to calculate and process the backward trajectories of air masses reaching a site of interest. Some outcomes from the application of the developed methodology to the Italian Network of Special Purpose Monitoring Stations are shown to assess its strengths for the meteorological characterization of air quality monitoring stations. M-TraCE has demonstrated its capabilities to provide a detailed statistical assessment of transport patterns and the region of influence of the site under investigation, which is fundamental for correctly interpreting pollutant measurements and ascertaining the official classification of the monitoring site based on meta-data information. Moreover, M-TraCE has shown its usefulness in supporting other assessments, i.e., spatial representativeness of a monitoring site, focussing specifically on the analysis of the effects due to meteorological variables.

  14. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    PubMed

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis of the annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands was conducted using different statistical approaches. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods the prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single p values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
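
    A minimal sketch of the P chart approach under stated assumptions: yearly prevalence proportions are plotted against three-sigma binomial limits around the pooled prevalence. The counts below are invented for illustration; the study used risk-adjusted prevalence per hospital and survey year.

    ```python
    # Hedged sketch: a P chart for institution-level pressure ulcer prevalence.
    import numpy as np

    at_risk = np.array([180, 200, 190, 210, 175, 195, 205, 185, 200, 190, 188])  # patients at risk per year
    ulcers  = np.array([ 30,  34,  28,  40,  25,  31,  33,  22,  24,  20,  18])  # grade 2-4 nosocomial ulcers

    p = ulcers / at_risk
    p_bar = ulcers.sum() / at_risk.sum()                 # centre line (pooled prevalence)
    sigma = np.sqrt(p_bar * (1 - p_bar) / at_risk)       # per-year standard error
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)

    for year, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1998):
        flag = "special-cause" if (pi > hi or pi < lo) else "common-cause"
        print(f"{year}: prevalence={pi:.3f} limits=({lo:.3f}, {hi:.3f}) -> {flag}")
    ```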

  15. ROC analysis for diagnostic accuracy of fracture by using different monitors.

    PubMed

    Liang, Zhigang; Li, Kuncheng; Yang, Xiaolin; Du, Xiangying; Liu, Jiabin; Zhao, Xin; Qi, Xiangdong

    2006-09-01

    The purpose of this study was to compare diagnostic accuracy using two types of monitors. Four radiologists with 10 years' experience interpreted the films of 77 fracture cases twice, using the ViewSonic P75f+ and BARCO MGD221 monitors, with a time interval of 3 weeks; each time the radiologists used one type of monitor to interpret the images. The image browser was the Unisight software provided by Atlastiger Company (Shanghai, China), and the interpretation results were analyzed with the LABMRMC software. In receiver operating characteristic studies scoring the presence or absence of fracture, images interpreted on the monochromatic monitors showed a statistically significant difference from those interpreted on the color monitors. A significant difference was thus observed between the results obtained with the two kinds of monitors. Color monitors cannot serve as substitutes for monochromatic monitors in the process of interpreting computed radiography (CR) images with fractures.

  16. [Statistical Process Control (SPC) can help prevent treatment errors without increasing costs in radiotherapy].

    PubMed

    Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C

    2010-01-01

    Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, the root cause was identified and eliminated to prevent set-up errors. Set-up errors were measured for the medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions, and the upper control limits were then calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using subgroups of 3 patients, three times each shift, and these values were plotted on a control chart in real time. The control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on an X chart, helped monitor the stability of the set-up process; if and when stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified, and corrective action was taken before proceeding with the treatment. The SPC protocol focuses on controlling the variability due to assignable causes instead of focusing on patient-to-patient variability, which normally does not exist. Compared to weekly sampling of the set-up error in every patient, which may only ensure that the sampled sessions were set up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces control costs. Copyright © 2009 SECA. Published by Elsevier Espana. All rights reserved.
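
    A minimal sketch of such a chart under stated assumptions: an X-bar chart for set-up error with subgroups of 3 patients, using the standard A2 constant for subgroup size 3. The millimetre values are simulated, not the clinic's measurements.

    ```python
    # Hedged sketch: an X-bar chart for patient set-up error using subgroups of 3.
    import numpy as np

    rng = np.random.default_rng(4)
    subgroups = rng.normal(loc=1.0, scale=0.8, size=(30, 3))   # 30 subgroups of 3 set-up errors (mm)

    xbar = subgroups.mean(axis=1)
    ranges = subgroups.max(axis=1) - subgroups.min(axis=1)     # subgroup ranges
    x_dbar, r_bar = xbar.mean(), ranges.mean()

    A2 = 1.023                                                  # Shewhart constant for subgroup size 3
    ucl, lcl = x_dbar + A2 * r_bar, x_dbar - A2 * r_bar

    out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
    print(f"centre={x_dbar:.2f} mm, limits=({lcl:.2f}, {ucl:.2f}) mm, "
          f"out-of-control subgroups: {out_of_control.tolist()}")
    ```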

  17. Novel Kalman Filter Algorithm for Statistical Monitoring of Extensive Landscapes with Synoptic Sensor Data

    PubMed Central

    Czaplewski, Raymond L.

    2015-01-01

    Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of study variables and auxiliary sensor variables. A National Forest Inventory (NFI) illustrates application within an official statistics program. Practical recommendations regarding remote sensing and statistical issues are offered. This algorithm has the potential to increase the value of synoptic sensor data for statistical monitoring of large geographic areas. PMID:26393588
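
    A minimal sketch of the estimator family under stated assumptions: a scalar Kalman filter that blends a random-walk prediction of a forest attribute with noisy wall-to-wall sensor observations. The numbers and the one-variable model are purely illustrative; the published algorithm is designed to stay numerically robust with many study and auxiliary variables.

    ```python
    # Hedged sketch: scalar Kalman filter combining a model prediction with sensor data.
    import numpy as np

    rng = np.random.default_rng(5)
    truth = 100 + np.cumsum(rng.normal(0, 1, 20))      # true attribute over 20 periods

    x, P = 100.0, 25.0        # initial state estimate and its variance
    Q, R = 1.0, 16.0          # process-noise and sensor-noise variances

    for t, z in enumerate(truth + rng.normal(0, np.sqrt(R), 20)):   # sensor observations
        # Predict: the attribute is assumed to follow a random walk.
        x_pred, P_pred = x, P + Q
        # Update with the sensor observation z.
        K = P_pred / (P_pred + R)                      # Kalman gain
        x = x_pred + K * (z - x_pred)
        P = (1 - K) * P_pred
        print(f"t={t:2d}  sensor={z:7.2f}  filtered={x:7.2f}  var={P:5.2f}")
    ```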

  18. Monitoring concept for structural integration of PZT-fiber arrays in metal sheets: a numerical and experimental study

    NASA Astrophysics Data System (ADS)

    Drossel, Welf-Guntram; Schubert, Andreas; Putz, Matthias; Koriath, Hans-Joachim; Wittstock, Volker; Hensel, Sebastian; Pierer, Alexander; Müller, Benedikt; Schmidt, Marek

    2018-01-01

    The technique of joining by forming allows the structural integration of piezoceramic fibers into locally microstructured metal sheets without any elastic interlayers. High-volume production of the joining partners results in statistical deviations from the nominal dimensions. A numerical simulation of geometric process sensitivity shows that these deviations have a highly significant influence on the resulting fiber stresses after the joining-by-forming operation and demonstrates the necessity of a monitoring concept. On this basis, the electromechanical behavior of piezoceramic array transducers is investigated experimentally before, during and after the joining process. The piezoceramic array transducer consists of an arrangement of five electrically interconnected piezoceramic fibers. The findings show that the impedance spectrum depends on the fiber stresses and can be used for in-process monitoring during the joining process. Based on the impedance values, the preload state of the interconnected piezoceramic fibers can be specifically controlled and a fiber overload prevented.

  19. Statistical process control using optimized neural networks: a case study.

    PubMed

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart indicates that the process has changed by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated; based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system based on the cuckoo optimization algorithm (COA) is introduced to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  20. 40 CFR 51.369 - Improving repair effectiveness.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... technical questions that arise in the repair process, and answer questions related to the legal requirements of State and Federal law with regard to emission control device tampering, engine switching, or... vehicles for retest. Performance monitoring shall include statistics on the number of vehicles submitted...

  1. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, K.C.; Hoyer, K.K.; Humenik, K.E.

    1995-10-17

    A method and system for monitoring an industrial process and a sensor are disclosed. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 17 figs.
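
    A minimal sketch of the final analysis step under stated assumptions: a sequential probability ratio test (SPRT) applied to the residual of a sensor-versus-model difference signal, testing a zero-mean hypothesis against a shifted mean. The thresholds, noise levels and restart rule below are textbook choices, not the patented parameter values.

    ```python
    # Hedged sketch: SPRT on residuals for sensor/process degradation detection.
    import numpy as np

    rng = np.random.default_rng(6)
    sigma, shift = 1.0, 1.5                   # residual std under H0; mean drift under H1
    alpha, beta = 0.01, 0.01                  # false-alarm and missed-alarm probabilities
    A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))

    residuals = np.r_[rng.normal(0, sigma, 60), rng.normal(shift, sigma, 40)]  # fault after t=60

    alarm_at, llr = None, 0.0
    for t, r in enumerate(residuals):
        # Log-likelihood ratio increment for N(shift, sigma^2) versus N(0, sigma^2).
        llr += (shift / sigma**2) * (r - shift / 2)
        if llr >= A:
            alarm_at = t
            break
        if llr <= B:
            llr = 0.0                          # accept H0 and restart the test
    print("alarm at sample:", alarm_at)
    ```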

  2. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, K.C.; Hoyer, K.K.; Humenik, K.E.

    1997-05-13

    A method and system are disclosed for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 17 figs.

  3. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, Kenneth C.; Hoyer, Kristin K.; Humenik, Keith E.

    1995-01-01

    A method and system for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.

  4. System for monitoring an industrial process and determining sensor status

    DOEpatents

    Gross, Kenneth C.; Hoyer, Kristin K.; Humenik, Keith E.

    1997-01-01

    A method and system for monitoring an industrial process and a sensor. The method and system include generating a first and second signal characteristic of an industrial process variable. One of the signals can be an artificial signal generated by an auto regressive moving average technique. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the two pairs of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.

  5. Urban land use monitoring from computer-implemented processing of airborne multispectral data

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.

    1976-01-01

    Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August, 1972. Computer analysis of these spectral data indicate that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses from analysis of type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.

  6. Monitoring the Earth System Grid Federation through the ESGF Dashboard

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Bell, G. M.; Drach, B.; Williams, D.; Aloisio, G.

    2012-12-01

    The Climate Model Intercomparison Project, phase 5 (CMIP5) is a global effort coordinated by the World Climate Research Programme (WCRP) involving tens of modeling groups spanning 19 countries. It is expected that the CMIP5 distributed data archive will total upwards of 3.5 petabytes, stored across several ESGF Nodes on four continents (North America, Europe, Asia, and Australia). The Earth System Grid Federation (ESGF) provides the IT infrastructure to support CMIP5. In this regard, the monitoring of the distributed ESGF infrastructure represents a crucial task, carried out by the ESGF Dashboard. The ESGF Dashboard is a software component of the ESGF stack, responsible for collecting key information about the status of the federation in terms of: 1) Network topology (peer-group composition), 2) Node type (host/services mapping), 3) Registered users (including their Identity Providers), 4) System metrics (e.g., round-trip time, service availability, CPU, memory, disk, processes, etc.), 5) Download metrics (both at the Node and federation level). The last class of information is very important since it provides strong insight into the CMIP5 experiment: the data usage statistics. In this regard, CMCC and LLNL have developed a data analytics management system for the analysis of both node-level and federation-level data usage statistics. It provides data usage statistics aggregated by project, model, experiment, variable, realm, peer node, time, ensemble, dataset name (including version), etc. The back-end of the system is able to infer the data usage information of the entire federation by carrying out, at node level, an 18-step reconciliation process on the peer node databases (i.e. node manager and publisher DB), which provides a 15-dimension data warehouse with local statistics, and, at global level, an aggregation process which federates the data usage statistics into a 16-dimension data warehouse with federation-level statistics. The front-end of the Dashboard system exploits a web desktop approach, which joins the pervasiveness of a web application with the flexibility of a desktop one.

  7. Robust Strategy for Rocket Engine Health Monitoring

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    2001-01-01

    Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.

  8. Emergency preparedness: community-based short-term eruption forecasting at Campi Flegrei

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Marzocchi, Warner; Civetta, Lucia; Del Pezzo, Edoardo; Papale, Paolo

    2010-05-01

    A key element in emergency preparedness is to define, in advance, tools to assist decision makers and emergency management groups during crises. Such tools must be prepared ahead of time, accounting for all of the expertise and scientific knowledge accumulated through time. During a pre-eruptive phase, the key to sound short-term eruption forecasting is the analysis of the monitoring signals. This involves the capability (i) to recognize anomalous signals and to relate single or combined anomalies to physical processes, assigning them probability values, and (ii) to quickly provide an answer to the observed phenomena even when unexpected. Here we present a more than four-year-long process devoted to defining the pre-eruptive Event Tree (ET) for Campi Flegrei. A community of about 40 experts in volcanology and volcano monitoring, participating in two Italian projects on Campi Flegrei funded by the Italian Civil Protection, was constituted and trained during periodic meetings on the statistical methods and on the model BET_EF (Marzocchi et al., 2008) that forms the statistical package for ET definition. Model calibration was carried out through public elicitation sessions, preceded and followed by dedicated meetings and web forum discussions on the monitoring parameters, their accuracy and relevance, and their potential meanings. The calibrated ET allows anomalies in the monitored parameters to be recognized and interpreted, assigning probability values to each set of data. This process de-personalizes the difficult task of interpreting multi-parametric sets of data during on-going emergencies, and provides a view of the observed variations that accounts for the averaged, weighted opinion of the scientific community. An additional positive outcome of the described ET calibration process is that it provides a picture of the degree of confidence of the expert community in the capability of the many different monitored quantities to reveal significant variations in the state of the volcano. This picture is particularly useful since it can be used to guide future improvements to the monitoring network, as well as research investments aimed at substantially improving the capability to forecast short-term volcanic hazard.

  9. Citizen science based monitoring of Greylag goose (Anser anser) in Bavaria (Germany): combining count data and bag data to estimate long-term trends between 1988/89 and 2010/11.

    PubMed

    Grauer, Andreas; König, Andreas; Bunnefeld, Nils

    2015-01-01

    Numbers of large grazing birds (geese, swans, cranes) have increased all over Europe, but monitoring these species, e.g. for management purposes, can be time consuming and costly. In Bavaria, sedentary Greylag geese (Anser anser) are monitored during the winter by two different citizen-based monitoring schemes: the International Waterbird Census (IWC) and hunting bag statistics. We compared the results of both schemes for the seasons 1988/89 to 2010/11 by analysing annual indices calculated using the software TRIM (TRends and Indices for Monitoring data). We identified similar, highly significant rates of increase in both data sets for the entire region of Bavaria (IWC 14% [13-15%], bag 13% [12-14%]). Furthermore, in all of the seven Bavarian regions, trends in annual indices of both data sets correlated significantly. The quality of both datasets as indicators of Greylag goose abundance in Bavaria was undermined neither by the weaknesses typically associated with citizen-based monitoring nor by the problems generally assumed for IWC and bag data. We also show that bag data are, under the German system of collecting bag statistics, a reliable indicator of species' distribution, especially for detecting newly colonized areas. Therefore, wildlife managers may want to consider bag data from citizen-science-led monitoring programmes as evidence supporting decision-making processes. We also discuss requirements for any bag monitoring scheme being established to monitor trends in species' distribution and abundance.

  10. Monitoring Statistics Which Have Increased Power over a Reduced Time Range.

    ERIC Educational Resources Information Center

    Tang, S. M.; MacNeill, I. B.

    1992-01-01

    The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)

  11. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    PubMed

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies have pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data in several contexts, such as fatigue and increasing muscle force. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. This small sample size induces errors in the estimated HOS parameters, hindering real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to small-sample-size effects and more accurate in sEMG PDF shape screening applications.
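
    A minimal sketch of the small-sample problem that motivates the functional statistics, under stated assumptions: a Monte Carlo check of how sample size affects the spread of skewness estimates for a log-normal distribution. The distribution parameters and replication counts are illustrative.

    ```python
    # Hedged sketch: small-sample variability of the skewness estimator.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    true_skew = float(stats.lognorm(s=0.5).stats(moments="s"))

    for n in (50, 200, 1000, 5000):
        est = [stats.skew(rng.lognormal(0, 0.5, n)) for _ in range(500)]
        print(f"n={n:5d}  mean skew={np.mean(est):.3f}  sd={np.std(est):.3f}  (true {true_skew:.3f})")
    ```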

  12. Online neural monitoring of statistical learning

    PubMed Central

    Batterink, Laura J.; Paller, Ken A.

    2017-01-01

    The extraction of patterns in the environment plays a critical role in many types of human learning, from motor skills to language acquisition. This process is known as statistical learning. Here we propose that statistical learning has two dissociable components: (1) perceptual binding of individual stimulus units into integrated composites and (2) storing those integrated representations for later use. Statistical learning is typically assessed using post-learning tasks, such that the two components are conflated. Our goal was to characterize the online perceptual component of statistical learning. Participants were exposed to a structured stream of repeating trisyllabic nonsense words and a random syllable stream. Online learning was indexed by an EEG-based measure that quantified neural entrainment at the frequency of the repeating words relative to that of individual syllables. Statistical learning was subsequently assessed using conventional measures in an explicit rating task and a reaction-time task. In the structured stream, neural entrainment to trisyllabic words was higher than in the random stream, increased as a function of exposure to track the progression of learning, and predicted performance on the RT task. These results demonstrate that monitoring this critical component of learning via rhythmic EEG entrainment reveals a gradual acquisition of knowledge whereby novel stimulus sequences are transformed into familiar composites. This online perceptual transformation is a critical component of learning. PMID:28324696

  13. Monitoring of bone regeneration process by means of texture analysis

    NASA Astrophysics Data System (ADS)

    Kokkinou, E.; Boniatis, I.; Costaridou, L.; Saridis, A.; Panagiotopoulos, E.; Panayiotakis, G.

    2009-09-01

    An image analysis method is proposed for monitoring the regeneration of the tibial bone. For this purpose, 130 digitized radiographs of 13 patients, who had undergone tibial lengthening by the Ilizarov method, were studied. For each patient, 10 radiographs, taken at 10 successive postoperative time points, were available. Employing available software, 3 Regions Of Interest (ROIs), corresponding to the (a) upper, (b) central, and (c) lower aspects of the gap where bone regeneration was expected to occur, were determined on each radiograph. Employing custom-developed algorithms, (i) a number of textural features were generated from each of the ROIs, and (ii) a texture-feature-based regression model was designed for the quantitative monitoring of the bone regeneration process. Statistically significant differences (p < 0.05) were found between the initial and final textural feature values, generated from the first and last postoperative radiographs, respectively. A quadratic polynomial regression equation fitted the data adequately (r2 = 0.9, p < 0.001). The suggested method may contribute to the monitoring of the tibial bone regeneration process.

  14. Surveillance of industrial processes with correlated parameters

    DOEpatents

    White, Andrew M.; Gross, Kenny C.; Kubic, William L.; Wigeland, Roald A.

    1996-01-01

    A system and method for surveillance of an industrial process. The system and method includes a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions.
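
    A minimal sketch of the distance-based check under stated assumptions: Mahalanobis distances of new multivariate observations are compared with a chi-square alarm limit derived from normal-operation data. The patented method additionally removes serial correlation and feeds the distances into a probability ratio test; the simple limit used here is an illustrative substitute.

    ```python
    # Hedged sketch: Mahalanobis-distance surveillance of correlated process parameters.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    d = 4
    cov = np.array([[1.0, 0.6, 0.3, 0.0],
                    [0.6, 1.0, 0.4, 0.1],
                    [0.3, 0.4, 1.0, 0.2],
                    [0.0, 0.1, 0.2, 1.0]])
    train = rng.multivariate_normal(np.zeros(d), cov, size=500)   # normal-operation history

    mu = train.mean(0)
    S_inv = np.linalg.inv(np.cov(train, rowvar=False))
    limit = stats.chi2.ppf(0.999, df=d)            # alarm limit on the squared distance

    def mahalanobis_sq(x):
        delta = x - mu
        return delta @ S_inv @ delta

    # New observations: the last two have a shift on the third (correlated) sensor.
    new = rng.multivariate_normal(np.zeros(d), cov, size=5)
    new[-2:, 2] += 3.0
    for x in new:
        m2 = mahalanobis_sq(x)
        print(f"D^2 = {m2:6.2f}  alarm = {bool(m2 > limit)}")
    ```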

  15. Remote sensing new model for monitoring the east Asian migratory locust infections based on its breeding circle

    NASA Astrophysics Data System (ADS)

    Han, Xiuzhen; Ma, Jianwen; Bao, Yuhai

    2006-12-01

    Current operational locust monitoring systems focus mainly on post-hazard monitoring and assessment, so finding an effective way to perform early warning and prediction has greater practical value. Based on two years (2001 and 2002) of continuous field sampling and statistics covering the three phases of egg hatching, nymph growth and adult locusts, together with spectral measurements and synchronous remote sensing data processing, we propose a three-stage remote sensing approach to monitoring locust hazards. Based on this view, remote sensing monitoring was designed in three stages: (1) during the egg hatching phase, remote sensing retrieves land surface temperature (LST) and soil moisture; (2) during the nymph growth phase, when locust feeding increases greatly, remote sensing computes the vegetation index, leaf area index and vegetation cover and analyses their changes; (3) during the adult phase, when locusts move and assemble towards ponds, water ditches and areas with less than 75% vegetation cover, remote sensing combined with field data monitors and predicts potential assembly areas for adult locusts. In this way the advantages of remote sensing technology are exploited effectively, and the approach also provides technical support for the locust monitoring system. The idea and techniques used in this study can also serve as a reference for monitoring other plant diseases and insect pests.

  16. Tracking signal test to monitor an intelligent time series forecasting model

    NASA Astrophysics Data System (ADS)

    Deng, Yan; Jaraiedi, Majid; Iskander, Wafik H.

    2004-03-01

    Extensive research has been conducted on the subject of intelligent time series forecasting, including many variations on the use of neural networks. However, investigation of model adequacy over time, after the training process is completed, remains to be fully explored. In this paper we demonstrate how a smoothed-error tracking signal test can be incorporated into a neuro-fuzzy model to monitor the forecasting process and serve as a statistical measure for keeping the forecasting model up to date. The proposed monitoring procedure is effective in detecting nonrandom changes due to model inadequacy, lack of unbiasedness in the estimation of model parameters, or deviations from existing patterns. This powerful detection device will result in improved forecast accuracy in the long run. An example data set is used to demonstrate the application of the proposed method.
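
    A minimal sketch of a smoothed-error tracking signal under stated assumptions: the ratio of the exponentially smoothed forecast error to the exponentially smoothed absolute error, with a common textbook smoothing constant and alarm threshold rather than the values of the cited neuro-fuzzy model.

    ```python
    # Hedged sketch: smoothed-error tracking signal for monitoring forecast bias.
    import numpy as np

    rng = np.random.default_rng(9)
    actual = 50 + rng.normal(0, 2, 60)
    actual[30:] += 6                      # a level shift the forecaster has not learned yet
    forecast = np.full(60, 50.0)          # a (deliberately) static forecast

    alpha, E, M = 0.2, 0.0, 1e-6          # smoothing constant, smoothed error, smoothed |error|
    for t, (a, f) in enumerate(zip(actual, forecast)):
        e = a - f
        E = alpha * e + (1 - alpha) * E
        M = alpha * abs(e) + (1 - alpha) * M
        ts = E / M                        # tracking signal in [-1, 1]
        if abs(ts) > 0.5:
            print(f"model needs re-training: tracking signal {ts:+.2f} at t={t}")
            break
    ```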

  17. What's to Be Done About Laboratory Quality? Process Indicators, Laboratory Stewardship, the Outcomes Problem, Risk Assessment, and Economic Value: Responding to Contemporary Global Challenges.

    PubMed

    Meier, Frederick A; Badrick, Tony C; Sikaris, Kenneth A

    2018-02-17

    For 50 years, structure, process, and outcomes measures have assessed health care quality. For clinical laboratories, structural quality has generally been assessed by inspection. For assessing process, quality indicators (QIs), statistical monitors of steps in the clinical laboratory total testing, have proliferated across the globe. Connections between structural and process laboratory measures and patient outcomes, however, have rarely been demonstrated. To inform further development of clinical laboratory quality systems, we conducted a selective but worldwide review of publications on clinical laboratory quality assessment. Some QIs, like seven generic College of American Pathologists Q-Tracks monitors, have demonstrated significant process improvement; other measures have uncovered critical opportunities to improve test selection and result management. The College of Pathologists of Australasia Key Indicator Monitoring and Management System has deployed risk calculations, introduced from failure mode effects analysis, as surrogate measures for outcomes. Showing economic value from clinical laboratory testing quality is a challenge. Clinical laboratories should converge on fewer (7-14) rather than more (21-35) process monitors; monitors should cover all steps of the testing process under laboratory control and include especially high-risk specimen-quality QIs. Clinical laboratory stewardship, the combination of education interventions among clinician test orderers and report consumers with revision of test order formats and result reporting schemes, improves test ordering, but improving result reception is more difficult. Risk calculation reorders the importance of quality monitors by balancing three probabilities: defect frequency, weight of potential harm, and detection difficulty. The triple approach of (1) a more focused suite of generic consensus quality indicators, (2) more active clinical laboratory testing stewardship, and (3) integration of formal risk assessment, rather than competing with economic value, enhances it.

  18. Cardiovascular oscillations at the bedside: early diagnosis of neonatal sepsis using heart rate characteristics monitoring

    PubMed Central

    Moorman, J. Randall; Delos, John B.; Flower, Abigail A.; Cao, Hanqing; Kovatchev, Boris P.; Richman, Joshua S.; Lake, Douglas E.

    2014-01-01

    We have applied principles of statistical signal processing and non-linear dynamics to analyze heart rate time series from premature newborn infants in order to assist in the early diagnosis of sepsis, a common and potentially deadly bacterial infection of the bloodstream. We began with the observation of reduced variability and transient decelerations in heart rate interval time series for hours up to days prior to clinical signs of illness. We find that measurements of standard deviation, sample asymmetry and sample entropy are highly related to imminent clinical illness. We developed multivariable statistical predictive models, and an interface to display the real-time results to clinicians. Using this approach, we have observed numerous cases in which incipient neonatal sepsis was diagnosed and treated without any clinical illness at all. This review focuses on the mathematical and statistical time series approaches used to detect these abnormal heart rate characteristics and present predictive monitoring information to the clinician. PMID:22026974
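
    A minimal sketch of one of the heart rate characteristics named above, under stated assumptions: sample entropy computed from an RR-interval series, with the common choices m = 2 and r = 0.2 times the series standard deviation. The simulated interval series are illustrative, not neonatal data, and the pairwise-counting implementation is a simplified version of the standard algorithm.

    ```python
    # Hedged sketch: sample entropy of an RR-interval series.
    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.2):
        x = np.asarray(x, dtype=float)
        r = r_frac * x.std()
        def count_matches(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            # Chebyshev distance between all template pairs, excluding self-matches.
            d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            return ((d <= r).sum() - len(templates)) / 2
        B, A = count_matches(m), count_matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    rng = np.random.default_rng(10)
    healthy = 0.45 + 0.05 * rng.normal(size=300)        # variable RR intervals (s)
    low_var = 0.45 + 0.01 * rng.normal(size=300)        # reduced variability
    low_var[::50] -= 0.08                               # transient decelerations
    print("SampEn, healthy series        :", round(sample_entropy(healthy), 3))
    print("SampEn, reduced-variability series:", round(sample_entropy(low_var), 3))
    ```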

  19. Statistical process control for electron beam monitoring.

    PubMed

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess statistical process control (SPC) of electron beam monitoring in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to keep these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected, they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
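
    A minimal sketch of the capability calculation under stated assumptions: Cp and Cpk of daily beam-output deviations against a ±2% specification. The simulated readings are illustrative; the study reports process capability ratios between 1.6 and 9.3 at the 2% specification level.

    ```python
    # Hedged sketch: process capability of a daily beam-output check against +/-2% limits.
    import numpy as np

    rng = np.random.default_rng(11)
    output = rng.normal(loc=0.2, scale=0.25, size=250)   # daily dose deviation from baseline (%)

    lsl, usl = -2.0, 2.0                                 # specification limits (%)
    mu, sigma = output.mean(), output.std(ddof=1)

    cp  = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
    ```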

  20. Manufacturing Research: Self-Directed Control

    DTIC Science & Technology

    1991-01-01

    reduce this sensitivity. SDO is performing Taguchi’s parameter design. 1-13 Statistical Process Control: SPC techniques will be used to monitor the process...Florida, R.E. Krieger Pub. Co., 1988. Dehnad, Khosrow, Quality Control, Robust Design, and the Taguchi Method, Pacific Grove, California, Wadsworth... control system. This turns out to be a non-trivial exercise. A human operator can see an event occur (such as the vessel pressurizing above its setpoint

  1. A sub-microwatt piezo-floating-gate sensor for long-term fatigue monitoring in biomechanical implants.

    PubMed

    Lajnef, Nizar; Chakrabartty, Shantanu; Elvin, Niell; Elvin, Alex

    2006-01-01

    In this paper we describe an implementation of a novel fatigue monitoring sensor based on the integration of piezoelectric transduction with floating gate avalanche injection. The miniaturized sensor enables continuous battery-less monitoring and time-to-failure predictions of biomechanical implants. Measured results from a fabricated prototype in a 0.5 microm CMOS process indicate that the device can compute cumulative statistics of electrical signals generated by the piezoelectric transducer, while consuming less than 1 microW of power. The ultra-low power operation makes the sensor attractive for integration with poly-vinylidene difluoride (PVDF) based transducers that have already proven to be biocompatible.

  2. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  3. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    PubMed

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limits, that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need to have a more predictable process prompted the need to control variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data, compared to the historical average, and, in addition, the variation was reduced. The model is subject to limitations: For example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.

  4. Application of Multiregressive Linear Models, Dynamic Kriging Models and Neural Network Models to Predictive Maintenance of Hydroelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Lucifredi, A.; Mazzieri, C.; Rossi, M.

    2000-05-01

    Since the operational conditions of a hydroelectric unit can vary within a wide range, the monitoring system must be able to distinguish between the variations of the monitored variable caused by variations of the operating conditions and those due to the onset and progression of failures and misoperations. The paper aims to identify the best technique to be adopted for the monitoring system. Three different methods have been implemented and compared. Two of them use statistical techniques: the first, the linear multiple regression, expresses the monitored variable as a linear function of the process parameters (independent variables), while the second, the dynamic kriging technique, is a modified technique of multiple linear regression representing the monitored variable as a linear combination of the process variables in such a way as to minimize the variance of the estimate error. The third is based on neural networks. Tests have shown that the monitoring system based on the kriging technique is not affected by some problems common to the other two models, e.g., the requirement of a large amount of data for tuning (both for training the neural network and for defining the optimum plane of the multiple regression), not only in the system start-up phase but also after a trivial maintenance operation involving the substitution of machinery components that directly affect the observed variable, and the need for different models to describe satisfactorily the different operating ranges of the plant. The monitoring system based on the kriging statistical technique avoids these difficulties: it does not require a large amount of data to be tuned and is immediately operational (given two points, the third can be immediately estimated), and the model follows the system without adapting itself to it. The results of the experimentation performed seem to indicate that a model based on a neural network or on a linear multiple regression is not optimal, and that a different approach is necessary to reduce the amount of work during the learning phase, using, when available, all the information stored during the initial phase of the plant to build the reference baseline and processing the available raw information where necessary. A mixed approach using the kriging statistical technique and neural network techniques could optimise the result.

  5. Watershed Scale Stable Isotope Distribution and Implications on Soil Organic Carbon Loss Monitoring under Hydrologic Uncertainty

    NASA Astrophysics Data System (ADS)

    Ahmed, I.; Karim, A.; Boutton, T. W.; Strom, K.; Fox, J.

    2013-12-01

    The thematic focus of this 3-year multidisciplinary USDA-CBG collaborative applied research is integrated monitoring of soil organic carbon (SOC) loss from multi-use lands using state-of-the-art stable isotope science under uncertain hydrologic influences. In this study, SOC loss and water runoff are being monitored on a 150 square kilometer watershed in Houston, Texas, using natural rainfall events, and total organic carbon/nitrogen concentration (TOC/TN) and stable isotope ratio (δ13C, δ15N) measurements with different land-use types. The work presents the interdisciplinary research results to uncover statistically valid and scientifically sound ways to monitor SOC loss by (i) application of Bayesian Markov Chain Monte Carlo statistical models to assess the relationship between rainfall-runoff and SOC release during soil erosion in space and time, (ii) capturing the episodic nature of rainfall events and its role in the spatial distribution of SOC loss from water erosion, (iii) stable isotope composition guided fingerprinting (source and quantity) of SOC by considering various types of erosion processes common in a heterogeneous watershed, to be able to tell what percentage of SOC is lost from various land-use types (Fox and Papanicolaou, 2008), (iv) creating an integrated watershed scale statistical soil loss monitoring model driven by spatial and temporal correlation of flow and stable isotope composition (Ahmed et al., 2013a,b), and (v) creation of an integrated decision support system (DSS) for sustainable management of SOC under hydrologic uncertainty to assist the end users. References: Ahmed, I., Karim, A., Boutton, T.W., and Strom, K.B. (2013a). 'Monitoring Soil Organic Carbon Loss from Erosion Using Stable Isotopes.' Proc., Soil Carbon Sequestration, International Conference, May 26-29, Reykjavik, Iceland. Ahmed, I., Boutton, T.W., Strom, K.B., Karim, A., and Irvin-Smith, N. (2013b). 'Soil carbon distribution and loss monitoring in the urbanized Buffalo Bayou watershed, Houston, Texas.' Proc., 4th Annual All Investigators Meeting of the North American Carbon Program, February 4-7, Albuquerque, NM. Fox, J.F. and Papanicolaou, A.N. (2008). An un-mixing model to study watershed erosion processes. Advances in Water Resources, 31, 96-108. * Corresponding author's e-mail: ifahmed@pvamu.edu

  6. Surveillance of industrial processes with correlated parameters

    DOEpatents

    White, A.M.; Gross, K.C.; Kubic, W.L.; Wigeland, R.A.

    1996-12-17

    A system and method for surveillance of an industrial process are disclosed. The system and method includes a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions. 10 figs.
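
    A minimal Python sketch of the general idea, not the patented method: score each new multivariate sensor reading by its Mahalanobis distance from an in-control reference sample and raise an alarm when the distance exceeds a chi-square threshold. The reference data and alarm level are hypothetical.

```python
# Hedged sketch (not the patented method): flag multivariate sensor readings whose
# Mahalanobis distance from a reference (in-control) sample exceeds a chi-square
# threshold, after the reference mean and covariance are estimated.
import numpy as np
from scipy.stats import chi2

def mahalanobis_alarm(reference, new_obs, alpha=0.001):
    ref = np.asarray(reference, dtype=float)
    mu = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))
    d = np.asarray(new_obs, dtype=float) - mu
    d2 = float(d @ cov_inv @ d)                  # squared Mahalanobis distance
    threshold = chi2.ppf(1 - alpha, df=ref.shape[1])
    return d2, d2 > threshold

rng = np.random.default_rng(0)
reference = rng.normal(size=(500, 3))            # in-control history from 3 sensors
print(mahalanobis_alarm(reference, [4.0, 0.1, -0.2]))
```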

  7. The construction of control chart for PM10 functional data

    NASA Astrophysics Data System (ADS)

    Shaadan, Norshahida; Jemain, Abdul Aziz; Deni, Sayang Mohd

    2014-06-01

    In this paper, a statistical procedure to construct a control chart for monitoring air quality (PM10) using functional data is proposed. A set of daily indices that represent the daily PM10 curves were obtained using Functional Principal Component Analysis (FPCA). By means of an iterative charting procedure, a reference data set that represented a stable PM10 process was obtained. The data were then used as a reference for monitoring future data. The application of the procedure was conducted using a seven-year (2004-2010) period of recorded data from the Klang air quality monitoring station located in the Klang Valley region of Peninsular Malaysia. The study showed that the control chart provided a useful visualization tool for monitoring air quality and was capable of detecting abnormalities in the process. As in the case of Klang station, the results showed that with reference to 2004-2008, the air quality (PM10) in 2010 was better than that in 2009.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benahmed, A.; Elkarch, H.

    This new portable radiological environmental monitor consists of 2 main components, a gamma ionization chamber and an FPGA-based electronic enclosure linked to user-friendly software for processing and analysis. The HPIC ion chamber is the heart of this radiation measurement system and operates in a range from 0 to 100 mR/h, so that the sensitivity at the output is 20 mV/μR/h, with a nearly flat energy response from 0.07 to 10 MeV. This paper presents a contribution to developing a new nuclear measurement data acquisition system based on the ALTERA Cyclone III FPGA Starter Kit, and user-friendly software to run real-time control and data processing. It was developed to replace the older radiation monitor RSS-112 PIC installed in CNESTEN's laboratory in order to improve some of its functionalities related to acquisition time and data memory capacity. The associated acquisition software was developed on the LabVIEW virtual instrument platform from National Instruments and offers a variety of system setups for radiation environmental monitoring. It allows the display of both statistical data and the dose rate: the statistical data view shows a summary of current data, current time/date and dose integrator values, and the dose rate view displays the current dose rate in large numbers for viewing from a distance, as well as the date and time. The prototype version of this new instrument and its data processing software has been successfully tested and validated for viewing and monitoring the environmental radiation of the Moroccan nuclear center. (authors)

  9. Application of the statistical process control method for prospective patient safety monitoring during the learning phase: robotic kidney transplantation with regional hypothermia (IDEAL phase 2a-b).

    PubMed

    Sood, Akshay; Ghani, Khurshid R; Ahlawat, Rajesh; Modi, Pranjal; Abaza, Ronney; Jeong, Wooju; Sammon, Jesse D; Diaz, Mireya; Kher, Vijay; Menon, Mani; Bhandari, Mahendra

    2014-08-01

    Traditional evaluation of the learning curve (LC) of an operation has been retrospective. Furthermore, LC analysis does not permit patient safety monitoring. To prospectively monitor patient safety during the learning phase of robotic kidney transplantation (RKT) and determine when it could be considered learned using the techniques of statistical process control (SPC). From January through May 2013, 41 patients with end-stage renal disease underwent RKT with regional hypothermia at one of two tertiary referral centers adopting RKT. Transplant recipients were classified into three groups based on the robotic training and kidney transplant experience of the surgeons: group 1, robot trained with limited kidney transplant experience (n=7); group 2, robot trained and kidney transplant experienced (n=20); and group 3, kidney transplant experienced with limited robot training (n=14). We employed prospective monitoring using SPC techniques, including cumulative summation (CUSUM) and Shewhart control charts, to perform LC analysis and patient safety monitoring, respectively. Outcomes assessed included post-transplant graft function and measures of surgical process (anastomotic and ischemic times). CUSUM and Shewhart control charts are time trend analytic techniques that allow comparative assessment of outcomes following a new intervention (RKT) relative to those achieved with established techniques (open kidney transplant; target value) in a prospective fashion. CUSUM analysis revealed an initial learning phase for group 3, whereas groups 1 and 2 had no to minimal learning time. The learning phase for group 3 varied depending on the parameter assessed. Shewhart control charts demonstrated no compromise in functional outcomes for groups 1 and 2. Graft function was compromised in one patient in group 3 (p<0.05) secondary to reasons unrelated to RKT. In multivariable analysis, robot training was significantly associated with improved task-completion times (p<0.01). Graft function was not adversely affected by either the lack of robotic training (p=0.22) or kidney transplant experience (p=0.72). The LC and patient safety of a new surgical technique can be assessed prospectively using CUSUM and Shewhart control chart analytic techniques. These methods allow determination of the duration of mentorship and identification of adverse events in a timely manner. A new operation can be considered learned when outcomes achieved with the new intervention are at par with outcomes following established techniques. Statistical process control techniques allowed for robust, objective, and prospective monitoring of robotic kidney transplantation and can similarly be applied to other new interventions during the introduction and adoption phase. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
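
    A minimal tabular CUSUM sketch in Python (not the study's analysis; the timing data and the target value are invented) shows how an upward drift away from the benchmark achieved with an established technique accumulates until an alarm is raised.

```python
# Minimal tabular CUSUM sketch (not the study's code) for monitoring, e.g., anastomotic
# times of a new technique against a target achieved with the established technique.
import numpy as np

def tabular_cusum(x, target, k=0.5, h=5.0):
    """k (allowance) and h (decision interval) are in units of the process std."""
    x = np.asarray(x, dtype=float)
    sigma = x.std(ddof=1)
    c_plus = c_minus = 0.0
    alarms = []
    for i, xi in enumerate(x):
        z = (xi - target) / sigma
        c_plus = max(0.0, c_plus + z - k)    # accumulates upward drift
        c_minus = max(0.0, c_minus - z - k)  # accumulates downward drift
        if c_plus > h or c_minus > h:
            alarms.append(i)
    return alarms

# Hypothetical times (minutes) drifting above a target of 45 minutes:
print(tabular_cusum([48, 52, 55, 60, 63, 66, 70], target=45))
```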

  10. Batch process fault detection and identification based on discriminant global preserving kernel slow feature analysis.

    PubMed

    Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping

    2018-05-16

    As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch process. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on the modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), by closely integrating discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of the batch process as well as preserve global and local geometrical structure information of observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variable after a fault is detected, which is derived from the idea of variable pseudo-sample trajectory projection in DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates.

    PubMed

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-02-23

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with eventual changes of the mechanical properties of joints. In particular, two process parameters were considered for tests: the welding tool rotation speed and the welding tool traverse speed. The quality of joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of joints and the process parameters, also proving the capability of Infrared Thermography for on-line monitoring of the quality of joints.

  12. Effect of Friction Stir Process Parameters on the Mechanical and Thermal Behavior of 5754-H111 Aluminum Plates

    PubMed Central

    Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico

    2016-01-01

    A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with eventual changes of the mechanical properties of joints. In particular, two process parameters were considered for tests: the welding tool rotation speed and the welding tool traverse speed. The quality of joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of joints and the process parameters, also proving the capability of Infrared Thermography for on-line monitoring of the quality of joints. PMID:28773246

  13. Heterogeneous recurrence monitoring and control of nonlinear stochastic processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Hui, E-mail: huiyang@usf.edu; Chen, Yun

    Recurrence is one of the most common phenomena in natural and engineering systems. Process monitoring of dynamic transitions in nonlinear and nonstationary systems is more concerned with aperiodic recurrences and recurrence variations. However, little has been done to investigate the heterogeneous recurrence variations and link with the objectives of process monitoring and anomaly detection. Notably, nonlinear recurrence methodologies are based on homogeneous recurrences, which treat all recurrence states in the same way as black dots, and non-recurrence is white in recurrence plots. Heterogeneous recurrences are more concerned about the variations of recurrence states in terms of state properties (e.g., values and relative locations) and the evolving dynamics (e.g., sequential state transitions). This paper presents a novel approach of heterogeneous recurrence analysis that utilizes a new fractal representation to delineate heterogeneous recurrence states in multiple scales, including the recurrences of both single states and multi-state sequences. Further, we developed a new set of heterogeneous recurrence quantifiers that are extracted from fractal representation in the transformed space. To that end, we integrated multivariate statistical control charts with heterogeneous recurrence analysis to simultaneously monitor two or more related quantifiers. Experimental results on nonlinear stochastic processes show that the proposed approach not only captures heterogeneous recurrence patterns in the fractal representation but also effectively monitors the changes in the dynamics of a complex system.

  14. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    PubMed

    Kalkman, C J

    1995-01-01

    Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  15. Statistical approaches used to assess and redesign surface water-quality-monitoring networks.

    PubMed

    Khalil, B; Ouarda, T B M J

    2009-11-01

    An up-to-date review of the statistical approaches utilized for the assessment and redesign of surface water quality monitoring (WQM) networks is presented. The main technical aspects of network design are covered in four sections, addressing monitoring objectives, water quality variables, sampling frequency and spatial distribution of sampling locations. This paper discusses various monitoring objectives and related procedures used for the assessment and redesign of long-term surface WQM networks. The appropriateness of each approach for the design, contraction or expansion of monitoring networks is also discussed. For each statistical approach, its advantages and disadvantages are examined from a network design perspective. Possible methods to overcome disadvantages and deficiencies in the statistical approaches that are currently in use are recommended.

  16. Online neural monitoring of statistical learning.

    PubMed

    Batterink, Laura J; Paller, Ken A

    2017-05-01

    The extraction of patterns in the environment plays a critical role in many types of human learning, from motor skills to language acquisition. This process is known as statistical learning. Here we propose that statistical learning has two dissociable components: (1) perceptual binding of individual stimulus units into integrated composites and (2) storing those integrated representations for later use. Statistical learning is typically assessed using post-learning tasks, such that the two components are conflated. Our goal was to characterize the online perceptual component of statistical learning. Participants were exposed to a structured stream of repeating trisyllabic nonsense words and a random syllable stream. Online learning was indexed by an EEG-based measure that quantified neural entrainment at the frequency of the repeating words relative to that of individual syllables. Statistical learning was subsequently assessed using conventional measures in an explicit rating task and a reaction-time task. In the structured stream, neural entrainment to trisyllabic words was higher than in the random stream, increased as a function of exposure to track the progression of learning, and predicted performance on the reaction time (RT) task. These results demonstrate that monitoring this critical component of learning via rhythmic EEG entrainment reveals a gradual acquisition of knowledge whereby novel stimulus sequences are transformed into familiar composites. This online perceptual transformation is a critical component of learning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. STATISTICAL RELATIONAL LEARNING AND SCRIPT INDUCTION FOR TEXTUAL INFERENCE

    DTIC Science & Technology

    2017-12-01

    Report documentation excerpt (performing organization: E 23rd St, Austin, TX 78712; sponsoring/monitoring agency: Air Force Research ...). Recoverable citation fragments: "...Processing (EMNLP), Austin, TX, 2016"; Pichotta, K. and Mooney, R.J., "Using Sentence-Level LSTM Language Models for Script Inference," Proceedings of the ... on Uphill Battles in Language Processing, Austin, TX, 2016; Rajani, N., and Mooney, R.J., "Stacked Ensembles of Information Extractors for ..."

  18. Monitoring Butterfly Abundance: Beyond Pollard Walks

    PubMed Central

    Pellet, Jérôme; Bried, Jason T.; Parietti, David; Gander, Antoine; Heer, Patrick O.; Cherix, Daniel; Arlettaz, Raphaël

    2012-01-01

    Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasted habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which questions the reliability of count-based indices to estimate and compare specific population abundance. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends in butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and inconveniences of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and resources availability. PMID:22859980

  19. Self-powered monitoring of repeated head impacts using time-dilation energy measurement circuit.

    PubMed

    Feng, Tao; Aono, Kenji; Covassin, Tracey; Chakrabartty, Shantanu

    2015-04-01

    Due to the current epidemic levels of sport-related concussions (SRC) in the U.S., there is a pressing need for technologies that can facilitate long-term and continuous monitoring of head impacts. Existing helmet-sensor technology is inconsistent, inaccurate, and is not economically or logistically practical for large-scale human studies. In this paper, we present the design of a miniature, battery-less, self-powered sensor that can be embedded inside sport helmets and can continuously monitor and store different spatial and temporal statistics of the helmet impacts. At the core of the proposed sensor is a novel time-dilation circuit that allows measurement of a wide-range of impact energies. In this paper an array of linear piezo-floating-gate (PFG) injectors has been used for self-powered sensing and storage of linear and rotational head-impact statistics. The stored statistics are then retrieved using a plug-and-play reader and has been used for offline data analysis. We report simulation and measurement results validating the functionality of the time-dilation circuit for different levels of impact energies. Also, using prototypes of linear PFG integrated circuits fabricated in a 0.5 μm CMOS process, we demonstrate the functionality of the proposed helmet-sensors using controlled drop tests.

  20. Nitrifying biomass characterization and monitoring during bioaugmentation in a membrane bioreactor.

    PubMed

    D'Anteo, Sibilla; Mannucci, Alberto; Meliani, Matteo; Verni, Franco; Petroni, Giulio; Munz, Giulio; Lubello, Claudio; Mori, Gualtiero; Vannini, Claudia

    2015-01-01

    A membrane bioreactor (MBR), fed with domestic wastewater, was bioaugmented with nitrifying biomass selected in a side-stream MBR fed with a synthetic high nitrogen-loaded influent. The evolution of the microbial communities was monitored and comparatively analysed through an extensive bio-molecular investigation (16S rRNA gene library construction and terminal-restriction fragment length polymorphism techniques) followed by statistical analyses. As expected, a highly specialized nitrifying biomass was selected in the side-stream reactor fed with high-strength ammonia synthetic wastewater. The bioaugmentation process caused an increase of nitrifying bacteria of the genera Nitrosomonas (up to more than 30%) and Nitrobacter in the inoculated MBR reactor. The overall structure of the microbial community changed in the mainstream MBR as a result of bioaugmentation. The effect of bioaugmentation in the shift of the microbial community was also verified through statistical analysis.

  1. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in processes) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.

  2. The New Alternative DSM-5 Model for Personality Disorders: Issues and Controversies

    ERIC Educational Resources Information Center

    Porter, Jeffrey S.; Risler, Edwin

    2014-01-01

    Purpose: Assess the new alternative "Diagnostic and Statistical Manual of Mental Disorders", fifth edition (DSM-5) model for personality disorders (PDs) as it is seen by its creators and critics. Method: Follow the DSM revision process by monitoring the American Psychiatric Association website and the publication of pertinent journal…

  3. Direct Observation of Teacher and Student Behavior in School Settings: Trends, Issues and Future Directions

    ERIC Educational Resources Information Center

    Lewis, Timothy J.; Scott, Terrance M.; Wehby, Joseph H.; Wills, Howard P.

    2014-01-01

    Across the modern history of the field of special education and emotional/behavioral disorders (EBD), direct observation of student and educator behavior has been an essential component of the diagnostic process, student progress monitoring, and establishing functional and statistical relationships within research. This article provides an…

  4. Human Mobility Monitoring in Very Low Resolution Visual Sensor Network

    PubMed Central

    Bo Bo, Nyan; Deboeverie, Francis; Eldib, Mohamed; Guan, Junzhi; Xie, Xingzhe; Niño, Jorge; Van Haerenborgh, Dirk; Slembrouck, Maarten; Van de Velde, Samuel; Steendam, Heidi; Veelaert, Peter; Kleihorst, Richard; Aghajan, Hamid; Philips, Wilfried

    2014-01-01

    This paper proposes an automated system for monitoring mobility patterns using a network of very low resolution visual sensors (30 × 30 pixels). The use of very low resolution sensors reduces privacy concerns, cost, computation requirements and power consumption. The core of our proposed system is a robust people tracker that uses low resolution videos provided by the visual sensor network. The distributed processing architecture of our tracking system allows all image processing tasks to be done on the digital signal controller in each visual sensor. In this paper, we experimentally show that reliable tracking of people is possible using very low resolution imagery. We also compare the performance of our tracker against a state-of-the-art tracking method and show that our method outperforms it. Moreover, the mobility statistics of tracks such as total distance traveled and average speed derived from trajectories are compared with those derived from ground truth given by Ultra-Wide Band sensors. The results of this comparison show that the trajectories from our system are accurate enough to obtain useful mobility statistics. PMID:25375754

  5. Novel Kalman filter algorithm for statistical monitoring of extensive landscapes with synoptic sensor data

    Treesearch

    Raymond L. Czaplewski

    2015-01-01

    Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of...
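
    As a hedged illustration of the general Kalman filter recursion, not Czaplewski's estimator, the following scalar sketch combines a prior landscape estimate with a new sensor-derived observation, weighting each by its variance; all numbers are hypothetical.

```python
# A generic scalar Kalman filter sketch (not Czaplewski's algorithm): combine a model
# prediction with a new remotely sensed observation, weighting by their variances.
def kalman_update(x_prior, p_prior, z, r, q=0.0):
    p_pred = p_prior + q                 # predict step: add process noise variance
    k = p_pred / (p_pred + r)            # Kalman gain from prediction vs observation variance
    x_post = x_prior + k * (z - x_prior) # update the estimate toward the observation
    p_post = (1 - k) * p_pred            # posterior variance shrinks after the update
    return x_post, p_post

# Example: prior landscape estimate 0.42 (var 0.02), sensor-based estimate 0.50 (var 0.01)
print(kalman_update(0.42, 0.02, 0.50, 0.01))
```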

  6. Monitoring of antisolvent crystallization of sodium scutellarein by combined FBRM-PVM-NIR.

    PubMed

    Liu, Xuesong; Sun, Di; Wang, Feng; Wu, Yongjiang; Chen, Yong; Wang, Longhu

    2011-06-01

    Antisolvent crystallization can be used as an alternative to cooling or evaporation for the separation and purification of solid product in the pharmaceutical industry. To improve the process understanding of antisolvent crystallization, the use of in-line tools is vital. In this study, the process analytical technology (PAT) tools including focused beam reflectance measurement (FBRM), particle video microscope (PVM), and near-infrared spectroscopy (NIRS) were utilized to monitor antisolvent crystallization of sodium scutellarein. FBRM was used to monitor chord count and chord length distribution of sodium scutellarein particles in the crystallizer, and PVM, as an in-line video camera, provided pictures imaging particle shape and dimension. In addition, a quantitative PLS model was established by in-line NIRS to detect the concentration of sodium scutellarein in the solvent, and good calibration statistics were obtained (r² = 0.976) with a residual predictive deviation value of 11.3. The discussion of the sensitivities, strengths, and weaknesses of the PAT tools may be helpful in the selection of suitable PAT techniques. These in-line techniques eliminate the need for sample preparation and offer a time-saving approach to understand and monitor the antisolvent crystallization process. Copyright © 2011 Wiley-Liss, Inc.

  7. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
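
    The exponentially weighted moving average (EWMA) chart mentioned above can be sketched as follows; this is illustrative Python, not the clinic's configuration, and the daily output deviations are invented.

```python
# Sketch of an exponentially weighted moving average (EWMA) chart (illustrative, not the
# clinic's configuration) for daily output values expressed as percent deviations.
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0):
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    z = np.zeros_like(x)
    limits = np.zeros_like(x)
    prev = mu
    for i, xi in enumerate(x):
        z[i] = lam * xi + (1 - lam) * prev
        prev = z[i]
        # Time-varying EWMA variance: sigma^2 * (lam / (2 - lam)) * (1 - (1 - lam)^(2i))
        var_i = sigma**2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * (i + 1)))
        limits[i] = L * np.sqrt(var_i)
    out_of_control = np.abs(z - mu) > limits
    return z, mu, limits, out_of_control

z, mu, limits, out = ewma_chart([0.2, 0.1, 0.4, 0.6, 0.9, 1.1, 1.3])
print(out)
```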

  8. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
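
    A hedged Python sketch of the Pareto step (SProCoP itself is written in R and runs from Skyline; the metric names and values below are placeholders): rank quality control metrics by their share of total variance so attention goes to the metrics with the largest spread.

```python
# Hedged sketch of a Pareto summary (not SProCoP's R code): rank QC metrics by their
# variance contribution so attention goes to the metrics with the largest spread.
import numpy as np

def pareto_summary(metrics):
    """metrics: dict mapping metric name -> list of standardized QC values."""
    variances = {name: np.var(vals, ddof=1) for name, vals in metrics.items()}
    total = sum(variances.values())
    ranked = sorted(variances.items(), key=lambda kv: kv[1], reverse=True)
    cumulative = 0.0
    for name, var in ranked:
        cumulative += var / total
        print(f"{name:>20s}  share={var / total:5.1%}  cumulative={cumulative:5.1%}")

pareto_summary({
    "retention time": [0.1, -0.3, 0.2, 0.5],
    "peak asymmetry": [1.2, -0.9, 1.5, -1.1],
    "mass accuracy": [0.05, 0.02, -0.04, 0.01],
})
```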

  9. Risk Identification in a Smart Monitoring System Used to Preserve Artefacts Based on Textile Materials

    NASA Astrophysics Data System (ADS)

    Diaconescu, V. D.; Scripcariu, L.; Mătăsaru, P. D.; Diaconescu, M. R.; Ignat, C. A.

    2018-06-01

    Exhibited artefacts based on textile materials can be affected by environmental conditions. A smart monitoring system that commands an adaptive automatic environment control system is proposed for indoor exhibition spaces containing various textile artefacts. All exhibited objects are monitored by many multi-sensor nodes containing temperature, relative humidity and light sensors. Data collected periodically from the entire sensor network are stored in a database and statistically processed in order to identify and classify the environment risk. Risk consequences are analyzed depending on the risk class and the smart system commands different control measures in order to stabilize the indoor environment conditions to the recommended values and prevent material degradation.

  10. Feature and Statistical Model Development in Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Kim, Inho

    All structures suffer wear and tear because of impact, excessive load, fatigue, corrosion, etc. in addition to inherent defects during their manufacturing processes and their exposure to various environmental effects. These structural degradations are often imperceptible, but they can severely affect the structural performance of a component, thereby severely decreasing its service life. Although previous studies of Structural Health Monitoring (SHM) have revealed extensive prior knowledge on the parts of SHM processes, such as the operational evaluation, data processing, and feature extraction, few studies have been conducted from a systematic perspective on statistical model development. The first part of this dissertation reviews ultrasonic guided wave-based structural health monitoring problems in terms of the characteristics of inverse scattering problems, such as ill-posedness and nonlinearity. The distinctive features and the selection of the analysis domain are investigated by analytically searching for the conditions under which solutions are unique despite ill-posedness, and are validated experimentally. Based on the distinctive features, a novel wave packet tracing (WPT) method for damage localization and size quantification is presented. This method involves creating time-space representations of the guided Lamb waves (GLWs), collected at a series of locations, with a spatially dense distribution along paths at pre-selected angles with respect to the direction normal to the direction of wave propagation. The fringe patterns due to wave dispersion, which depends on the phase velocity, are selected as the primary features that carry information regarding the wave propagation and scattering. The following part of this dissertation presents a novel damage-localization framework using a fully automated process. In order to construct the statistical model for autonomous damage localization, deep-learning techniques, such as the restricted Boltzmann machine and the deep belief network, are trained and utilized to interpret nonlinear far-field wave patterns. Next, a novel bridge scour estimation approach that combines the advantages of both empirical and data-driven models is developed. Two field datasets from the literature are used, and a Support Vector Machine (SVM), a machine-learning algorithm, is used to fuse the field data samples and classify the data with physical phenomena. The Fast Non-dominated Sorting Genetic Algorithm (NSGA-II) is evaluated on the model performance objective functions to search for Pareto optimal fronts.

  11. Advances in industrial biopharmaceutical batch process monitoring: Machine-learning methods for small data problems.

    PubMed

    Tulsyan, Aditya; Garvin, Christopher; Ündey, Cenk

    2018-04-06

    Biopharmaceutical manufacturing comprises of multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real-time. The state-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM, until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios. © 2018 Wiley Periodicals, Inc.

  12. In-line monitoring of pellet coating thickness growth by means of visual imaging.

    PubMed

    Oman Kadunc, Nika; Sibanc, Rok; Dreu, Rok; Likar, Boštjan; Tomaževič, Dejan

    2014-08-15

    Coating thickness is the most important attribute of coated pharmaceutical pellets as it directly affects release profiles and stability of the drug. Quality control of the coating process of pharmaceutical pellets is thus of utmost importance for assuring the desired end product characteristics. A visual imaging technique is presented and examined as a process analytic technology (PAT) tool for noninvasive continuous in-line and real time monitoring of coating thickness of pharmaceutical pellets during the coating process. Images of pellets were acquired during the coating process through an observation window of a Wurster coating apparatus. Image analysis methods were developed for fast and accurate determination of pellets' coating thickness during a coating process. The accuracy of the results for pellet coating thickness growth obtained in real time was evaluated through comparison with an off-line reference method and a good agreement was found. Information about the inter-pellet coating uniformity was gained from further statistical analysis of the measured pellet size distributions. Accuracy and performance analysis of the proposed method showed that visual imaging is feasible as a PAT tool for in-line and real time monitoring of the coating process of pharmaceutical pellets. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Statistical Patterns in Natural Lightning

    NASA Astrophysics Data System (ADS)

    Zoghzoghy, F. G.; Cohen, M.; Said, R.; Inan, U. S.

    2011-12-01

    Every day millions of lightning flashes occur around the globe but the understanding of this natural phenomenon is still lacking. Fundamentally, lightning is nature's way of destroying charge separation in clouds and restoring electric neutrality. Thus, statistical patterns of lightning activity indicate the scope of these electric discharges and offer a surrogate measure of timescales for charge buildup in thunderclouds. We present a statistical method to investigate spatio-temporal correlations among lightning flashes using National Lightning Detection Network (NLDN) stroke data. By monitoring the distribution of lightning activity, we can observe the charging and discharging processes in a given thunderstorm. In particular, within a given storm, the flashes do not occur as a memoryless random process. We introduce the No Flash Zone (NFZ) which results from the suppressed probability of two consecutive neighboring flashes. This effect lasts for tens of seconds and can extend up to 15 km around the location of the initial flash, decaying with time. This suppression effect may be a function of variables such as storm location, storm phase, and stroke peak current. We develop a clustering algorithm, Storm-Locator, which groups strokes into flashes, storm cells, and thunderstorms, and enables us to study lightning and the NFZ in different geographical regions, and for different storms. The recursive algorithm also helps monitor the interaction among spatially displaced storm cells, and can provide more insight into the spatial and temporal impacts of lightning discharges.

  14. Monitoring Space Weather Hazards caused by geomagnetic disturbances with Space Hazard Monitor (SHM) systems

    NASA Astrophysics Data System (ADS)

    Xu, Z.; Gannon, J. L.; Peek, T. A.; Lin, D.

    2017-12-01

    One space weather hazard is geomagnetically induced currents (GICs) in electric power transmission systems, which are driven by the geoelectric field naturally induced during geomagnetic disturbances (GMDs). GICs are a potentially catastrophic threat to bulk power systems; for instance, the blackout in Quebec in March 1989 was caused by GMDs during a significant magnetic storm. To monitor GMDs, the autonomous Space Hazard Monitor (SHM) system was recently developed. The system includes magnetic field measurements from magnetometers and geoelectric field measurements from electrodes. In this presentation, we introduce the six SHM sites that have been deployed across the continental US. The data from the magnetometers are processed with the Multiple Observatory Geomagnetic Data Analysis Software (MOGDAS), and the statistical results are presented here. They reveal not only the impacts of space weather over the continental US but also the potential of improved instrumentation to provide better space weather monitoring.

  15. Trend indicators needed for effective recreation planning - a statistical blueprint for the 80's

    Treesearch

    H. Fred Kaiser; George H. Moeller

    1980-01-01

    Here we outline important elements in recreation planning and describe how the process is changing, using Federal land management agencies as our example. We outline some factors that will impact planning in the 80's and encourage establishment of a system to monitor trends in key factors that influence recreation behavior.

  16. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and to provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, it has remained in statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery has been reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
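
    A p-chart of the kind described can be sketched as follows; this is illustrative Python with hypothetical quarterly counts, not the maternity unit's chart, and uses 3-sigma binomial limits around a baseline rate.

```python
# Illustrative p-chart sketch (not the maternity unit's chart): quarterly prevalence of an
# event (cases / deliveries) with 3-sigma binomial control limits around the baseline rate.
import numpy as np

def p_chart(cases, deliveries, baseline=None):
    cases = np.asarray(cases, dtype=float)
    n = np.asarray(deliveries, dtype=float)
    p = cases / n
    p_bar = baseline if baseline is not None else cases.sum() / n.sum()
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)       # limits vary with quarterly volume
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)
    return p, lcl, ucl, (p > ucl) | (p < lcl)

# Hypothetical quarterly counts:
print(p_chart(cases=[12, 9, 7, 5], deliveries=[1000, 1050, 980, 1020]))
```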

  17. Integrating policy-based management and SLA performance monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu

    2001-10-01

    A policy-based management system provides the configuration capability for system administrators to focus on the requirements of customers. The service level agreement performance monitoring mechanism helps system administrators to verify the correctness of policies. However, it is difficult for a device to process the policies directly because policies are a management-level concept. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. Thus, the device can process the rules and collect performance statistics efficiently, and the policy-based management system can aggregate these statistics and report the service-level agreement performance monitoring information to the system administrator. The proposed policy-based management system achieves both the policy configuration and service-level agreement performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression of a source host IP group, a destination host IP group, etc. The action part is the parameters of services. We say that an address group is compact if it consists only of a range of IP addresses that can be denoted by a pair consisting of an IP address and a corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address group while a system administrator prefers to define an arbitrary range of IP addresses, the policy-based management system has to translate each policy into rules and bridge the gap between the policy and its rules. The proposed policy-based management system builds the relationships between VPNs and policies, and between policies and rules. Since the system administrator wants to monitor the system performance information of VPNs and policies, the proposed policy-based management system downloads the relationships among VPNs, policies and rules to the SNMP agents. The SNMP agents build the management information base (MIB) of all VPNs, policies and rules according to the relationships obtained from the management server. Thus, the proposed policy-based management system can get all performance monitoring information of VPNs and policies from agents. The proposed policy-based manager achieves two goals: a) provide a management environment in which the system administrator configures the network by considering only policy requirements, and b) require the device only to process packets and collect the required performance information. Together these make the proposed management system satisfy both the user and device requirements.
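
    The notion of a compact address group can be illustrated with Python's standard ipaddress module (a sketch of the idea, not the paper's system): an arbitrary policy range is split into blocks, each expressible as an address plus a mask, which is the form a device can process as a rule.

```python
# Hedged sketch of the "compact address group" idea (not the paper's system): split an
# arbitrary IP range from a policy into blocks, each expressible as address + mask.
import ipaddress

def to_compact_groups(first_ip, last_ip):
    first = ipaddress.ip_address(first_ip)
    last = ipaddress.ip_address(last_ip)
    return list(ipaddress.summarize_address_range(first, last))

# A policy range that is not itself compact becomes several device-level rules:
for block in to_compact_groups("192.168.1.10", "192.168.1.37"):
    print(block)  # e.g. 192.168.1.10/31, 192.168.1.12/30, ...
```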

  18. Processing speed can monitor stimulant-medication effects in adults with attention deficit disorder with hyperactivity.

    PubMed

    Nielsen, Niels Peter; Wiig, Elisabeth H; Bäck, Svante; Gustafsson, Jan

    2017-05-01

    Treatment responses to methylphenidate by adults with ADHD are generally monitored against DSM-IV/DSM-V symptomatology, rating scales or interviews during reviews. To evaluate the use of single- and dual-dimension processing-speed and efficiency measures to monitor the effects of pharmacological treatment with methylphenidate after a short period off medication. A Quick Test of Cognitive Speed (AQT) monitored the effects of immediate-release methylphenidate in 40 previously diagnosed and medicated adults with ADHD. Processing speed was evaluated with prior prescription medication, without medication after a 2-day period off ADHD medication, and with low-dose (10/20 mg) and high-dose (20/40 mg) methylphenidate hydrochloride (Medikinet IR). Thirty-three participants responded to the experimental treatments. One-way ANOVA with post-hoc analysis (Scheffe) indicated significant main effects for single dimension colour and form and dual-dimension colour-form naming. Post-hoc analysis indicated statistical differences between the no- and high-dose medication conditions for colour and form, measures of perceptual speed. For colour-form naming, a measure of cognitive speed, there was a significant difference between no- and low-dose medication and between no- and high-dose medications, but not between low- and high-dose medications. Results indicated that the AQT tests effectively monitored incremental effects of the methylphenidate dose on processing speed after a 2-day period off medication. Thus, perceptual (colour and form) and cognitive speed (two-dimensional colour-form naming) and processing efficiency (lowered shift costs) increased measurably with high-dose medication. These preliminary findings warrant validation with added measures of associated behavioural and cognitive changes.

  19. Real-time process monitoring in a semi-continuous fluid-bed dryer - microwave resonance technology versus near-infrared spectroscopy.

    PubMed

    Peters, Johanna; Teske, Andreas; Taute, Wolfgang; Döscher, Claas; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg

    2018-02-15

    The trend towards continuous manufacturing in the pharmaceutical industry is associated with an increasing demand for advanced control strategies. It is a mandatory requirement to obtain reliable real-time information on critical quality attributes (CQA) during every process step, as the decision on diversion of material needs to be made quickly and automatically. Where possible, production equipment should provide redundant systems for in-process control (IPC) measurements to ensure continuous process monitoring even if one of the systems is not available. In this paper, two methods for real-time monitoring of granule moisture in a semi-continuous fluid-bed drying unit are compared. While near-infrared (NIR) spectroscopy has already proven to be a suitable process analytical technology (PAT) tool for moisture measurements in fluid-bed applications, microwave resonance technology (MRT) until recently had difficulties monitoring moisture levels above 8%. The results indicate that the newly developed MRT sensor, operating at four resonances, is capable of competing with NIR spectroscopy. While NIR spectra were preprocessed by mean centering and first derivative before application of partial least squares (PLS) regression to build predictive models (RMSEP = 0.20%), microwave moisture values of two resonances sufficed to build a statistically close multiple linear regression (MLR) model (RMSEP = 0.07%) for moisture prediction. Thereby, it could be verified that moisture monitoring by MRT sensor systems could be a valuable alternative to NIR spectroscopy or could be used as a redundant system providing great ease of application. Copyright © 2017 Elsevier B.V. All rights reserved.
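
    A hedged sketch of the NIR branch of this comparison, using synthetic spectra rather than the authors' data: first-derivative preprocessing followed by a PLS model for moisture prediction, with RMSEP reported on a held-out set. All dimensions, wavelengths and noise levels are assumptions for illustration.

        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        moisture = rng.uniform(2, 12, 120)                    # % granule moisture
        wavelengths = np.linspace(1100, 2200, 200)
        spectra = (np.outer(moisture, np.exp(-((wavelengths - 1940) / 60) ** 2))
                   + rng.normal(0, 0.02, (120, 200)))         # synthetic NIR spectra

        # First-derivative preprocessing (Savitzky-Golay), then PLS (centering is built in)
        X = savgol_filter(spectra, window_length=11, polyorder=2, deriv=1, axis=1)
        X_tr, X_te, y_tr, y_te = train_test_split(X, moisture, random_state=0)

        pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
        rmsep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
        print(f"RMSEP = {rmsep:.3f} % moisture")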

  20. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
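
    A short sketch of the process-capability calculation this study relies on, with made-up D/MU deviations (in percent) checked against the customized ±2% tolerance. The data values are invented; only the Cp/Cpk formulas are standard.

        import numpy as np

        d_mu = np.array([0.4, -0.3, 0.9, 0.1, -0.6, 0.7, 0.2, -0.1, 0.5, 0.0])  # % deviation
        usl, lsl = 2.0, -2.0                     # customized tolerance limits (%)

        mu, sigma = d_mu.mean(), d_mu.std(ddof=1)
        cp  = (usl - lsl) / (6 * sigma)          # process potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # accounts for off-centering
        print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")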

  1. Optimizing liquid effluent monitoring at a large nuclear complex.

    PubMed

    Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M

    2003-12-01

    Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in an annual cost savings of US dollars 223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.

  2. System development over the monitoring for the purpose of early warning of population from the threat of landslides. (Invited)

    NASA Astrophysics Data System (ADS)

    Zakhidova, D. V.; Kadyrhodjaev, A.; Scientific Team Of Hydroengeo Institute On Natural Hazards

    2010-12-01

    Timely warning of the population about possible landslide threats is one of the key requirements for safe and stable national development. A monitoring system for hazardous geological processes includes components such as observation, forecasting, control and management, with forecasting occupying a special place. Given a sufficiently long series of observations it becomes possible to reveal regularities in the phenomena and, on that basis, to proceed with forecasting. We reviewed many forecasting approaches used in different countries. This analysis led to the conclusion that landslide forecasting must be approached in a systemic way, taking into account the interacting components of the natural environment. The study of landslide processes shows that they lie within the engineering-geological branch of science and also interact with tectonics, geomorphology, hydrogeology, hydrology, climate change, technogenesis, and so on. Given this need for a systems approach and the achievements of modern science and technology, the most expedient approach to landslide forecasting is a probabilistic-statistical method with combined use of geological and satellite data, with imagery processed through geoinformation systems. A probabilistic-statistical approach, reflecting the characteristics of the interacting natural system, makes it possible to account for the multi-factor processes of landslide activation. Among the many factors influencing landslide activation, some cannot be expressed numerically: their parameters are descriptive and qualitative rather than quantitative, and ignoring them is not reasonable. The proposed approach has the further advantage of accommodating both numerical and non-numerical parameters. The combination of multidisciplinarity, a systemic character, multi-factor accounting, probabilistic and statistical calculation methods, joint use of geological and satellite data, and modern technology for processing and analysing information is brought together in the authors' approach to delineating areas of possible landslide activation. The proposed method could form part of a monitoring system for early warning of landslide activation. The authors therefore propose to initiate the project “System development over the monitoring for the purpose of early warning of population from the threat of landslides”. Project implementation is expected to deliver: 1. A system of geo-indicators for early warning of rapidly developing landslide processes. 2. A unified, interconnected system of remote, surface and underground observations of these geo-indicators. 3. A system for notifying the population of an impending threat by means of alerts, light signals, and mobilisation of municipalities and the relevant ministries. Economic, technical and social benefits are anticipated from the project.

  3. The development of performance-monitoring function in the posterior medial frontal cortex

    PubMed Central

    Fitzgerald, Kate Dimond; Perkins, Suzanne C.; Angstadt, Mike; Johnson, Timothy; Stern, Emily R.; Welsh, Robert C.; Taylor, Stephan F.

    2009-01-01

    Background Despite its critical role in performance-monitoring, the development of posterior medial prefrontal cortex (pMFC) in goal-directed behaviors remains poorly understood. Performance monitoring depends on distinct, but related functions that may differentially activate the pMFC, such as monitoring response conflict and detecting errors. Developmental differences in conflict- and error-related activations, coupled with age-related changes in behavioral performance, may confound attempts to map the maturation of pMFC functions. To characterize the development of pMFC-based performance monitoring functions, we segregated interference and error-processing, while statistically controlling for performance. Methods Twenty-one adults and 23 youth performed an event-related version of the Multi-Source Interference Task during functional magnetic resonance imaging (fMRI). Interference and error contrast estimates derived from the pMFC were regressed on age using linear models, while covarying for performance. Results Interference- and error-processing were associated with robust activation of the pMFC in both youth and adults. Among youth, interference- and error-related activation of the pMFC increased with age, independent of performance. Greater accuracy was associated with greater pMFC activity during error commission in both groups. Discussion Increasing pMFC response to interference and errors occurs with age, likely contributing to the improvement of performance monitoring capacity during development. PMID:19913101

  4. 75 FR 27221 - Fisheries of the Northeastern United States; Atlantic Bluefish Fishery; 2010 Atlantic Bluefish...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... the Council's Bluefish Monitoring Committee (Monitoring Committee) and Scientific and Statistical... to, commercial and recreational catch/landing statistics, current estimates of fishing mortality... Marine Recreational Fisheries Statistics Survey (MRFSS) data through Wave 2 were available for 2009, and...

  5. Effective solutions for monitoring the electrostatic separation of metal and plastic granular waste from electric and electronic equipment.

    PubMed

    Senouci, Khouira; Medles, Karim; Dascalescu, Lucian

    2013-02-01

    The variability of the quantity and purity of the recovered materials is a serious drawback for the application of electrostatic separation technologies to the recycling of granular wastes. In a series of previous articles we have pointed out how capability and classic control chart concepts could be employed for better mastering the outcome of such processes. In the present work, the multiple exponentially weighted moving average (MEWMA) control chart is introduced and shown to be more effective than the Hotelling T2 chart for monitoring slow varying changes in the electrostatic separation of granular mixtures originating from electric and electronic equipment waste. The operation of the industrial process was simulated by using a laboratory roll-type electrostatic separator and granular samples resulting from shredded electric cable wastes. The 25 tests carried out during the observation phase enabled the calculation of the upper and lower control limits for the two control charts considered in the present study. The 11 additional tests that simulated the monitoring phase pointed out that the MEWMA chart is more effective than Hotelling's T2 chart in detecting slow varying changes in the outcome of a process. As the reverse is true in the case of abrupt alterations of monitored process performances, simultaneous usage of the two control charts is strongly recommended. While this study focused on a specific electrostatic separation process, using the MEWMA chart together with the well known Hotelling's T2 chart should be applicable to the statistical control of other complex processes in the field of waste processing.
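
    A minimal MEWMA sketch, illustrating the statistic behind the chart discussed above rather than the authors' implementation: a two-variable process (think recovered-fraction purity measures) with a slow drift injected after sample 25. The in-control mean, covariance and control limit h are assumptions chosen for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        target = np.array([95.0, 4.0])                 # in-control means
        sigma = np.array([[1.0, -0.3], [-0.3, 0.5]])   # in-control covariance
        x = rng.multivariate_normal(target, sigma, 40)
        x[25:] += np.array([0.4, 0.3])                 # slow drift after sample 25

        lam, h = 0.1, 8.79   # h is an illustrative limit; it depends on p, lambda and the target ARL
        z = np.zeros(2)
        for i, xi in enumerate(x, start=1):
            z = lam * (xi - target) + (1 - lam) * z
            var_factor = lam * (1 - (1 - lam) ** (2 * i)) / (2 - lam)
            t2 = z @ np.linalg.inv(var_factor * sigma) @ z
            if t2 > h:
                print(f"sample {i}: MEWMA signal, T2 = {t2:.2f}")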

  6. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This can be achieved by building a process control system (PCS) with the following characteristics. A facility to monitor the performance of the process by obtaining and analyzing the data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using the pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.

  7. Problems of psychological monitoring in astronaut training.

    PubMed

    Morgun, V V

    1997-10-01

    Monitoring of goal-oriented psychological changes in a person during professional training is necessary. The development level of an astronaut's psychological traits is checked by means of psychological testing, with the aim of evaluating each professionally important psychological quality individually as well as overall. The list of psychological features needed for the evaluation has been determined, and empirically selected weighting factors based on a wide statistical sample are introduced. Accumulated psychological test results can predict an astronaut's ability to solve complicated problems in a flight mission, which can help to correct the training process and reveal weaknesses.

  8. Procedural considerations for CPV outdoor power ratings per IEC 62670

    NASA Astrophysics Data System (ADS)

    Muller, Matthew; Kurtz, Sarah; Rodriguez, Jose

    2013-09-01

    The IEC Working Group 7 (WG7) is in the process of developing a draft procedure for an outdoor concentrating photovoltaic (CPV) module power rating at Concentrator Standard Operating Conditions (CSOC). WG7 recently achieved some consensus that using component reference cells to monitor/limit spectral variation is the preferred path for the outdoor power rating. To build on this consensus, the community must quantify these spectral limits and select a procedure for calculating and reporting a power rating. This work focuses on statistically comparing several procedures the community is considering in context with monitoring/limiting spectral variation.

  9. Measurement error in time-series analysis: a simulation study comparing modelled and monitored data.

    PubMed

    Butland, Barbara K; Armstrong, Ben; Atkinson, Richard W; Wilkinson, Paul; Heal, Mathew R; Doherty, Ruth M; Vieno, Massimo

    2013-11-13

    Assessing health effects from background exposure to air pollution is often hampered by the sparseness of pollution monitoring networks. However, regional atmospheric chemistry-transport models (CTMs) can provide pollution data with national coverage at fine geographical and temporal resolution. We used statistical simulation to compare the impact on epidemiological time-series analysis of additive measurement error in sparse monitor data as opposed to geographically and temporally complete model data. Statistical simulations were based on a theoretical area of 4 regions each consisting of twenty-five 5 km × 5 km grid-squares. In the context of a 3-year Poisson regression time-series analysis of the association between mortality and a single pollutant, we compared the error impact of using daily grid-specific model data as opposed to daily regional average monitor data. We investigated how this comparison was affected if we changed the number of grids per region containing a monitor. To inform simulations, estimates (e.g. of pollutant means) were obtained from observed monitor data for 2003-2006 for national network sites across the UK and corresponding model data that were generated by the EMEP-WRF CTM. Average within-site correlations between observed monitor and model data were 0.73 and 0.76 for rural and urban daily maximum 8-hour ozone respectively, and 0.67 and 0.61 for rural and urban loge(daily 1-hour maximum NO2). When regional averages were based on 5 or 10 monitors per region, health effect estimates exhibited little bias. However, with only 1 monitor per region, the regression coefficient in our time-series analysis was attenuated by an estimated 6% for urban background ozone, 13% for rural ozone, 29% for urban background loge(NO2) and 38% for rural loge(NO2). For grid-specific model data the corresponding figures were 19%, 22%, 54% and 44% respectively, i.e. similar for rural loge(NO2) but more marked for urban loge(NO2). Even if correlations between model and monitor data appear reasonably strong, additive classical measurement error in model data may lead to appreciable bias in health effect estimates. As process-based air pollution models become more widely used in epidemiological time-series analysis, assessments of error impact that include statistical simulation may be useful.
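
    A simulation sketch in the spirit of the comparison above (all effect sizes and error variances are assumptions): additive classical measurement error in the exposure series attenuates the coefficient of a Poisson time-series regression relative to a fit on the true exposure.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n_days = 3 * 365
        true_pollutant = rng.gamma(shape=4, scale=10, size=n_days)   # true daily exposure
        beta = 0.004                                                 # assumed log-rate per unit
        deaths = rng.poisson(np.exp(np.log(20) + beta * true_pollutant))

        def fitted_beta(exposure):
            X = sm.add_constant(exposure)
            return sm.GLM(deaths, X, family=sm.families.Poisson()).fit().params[1]

        measured = true_pollutant + rng.normal(0, 15, n_days)        # classical error
        print(f"beta with true exposure:     {fitted_beta(true_pollutant):.4f}")
        print(f"beta with error-prone proxy: {fitted_beta(measured):.4f}  (attenuated)")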

  10. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    The aim was to implement an online statistical analysis function in the information system for air pollution and health impact monitoring and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented on top of the database software using SQL and visualisation tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online with interactive connection to the database; and generates export sheets that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements the statistical analysis function online and can provide real-time analysis results to its users.
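
    A hypothetical sketch of the kind of online summary query such a system might generate; the table and column names are invented for illustration and are not from the paper.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE pm25_daily (city TEXT, day TEXT, pm25 REAL,
                                                 respiratory_visits INTEGER)""")
        conn.executemany("INSERT INTO pm25_daily VALUES (?, ?, ?, ?)",
                         [("CityA", "2017-01-01", 85.0, 120),
                          ("CityA", "2017-01-02", 60.0, 95),
                          ("CityB", "2017-01-01", 40.0, 70)])

        # A basic statistical summary table, generated online per monitoring site
        for row in conn.execute("""SELECT city, COUNT(*), AVG(pm25), MAX(pm25),
                                           AVG(respiratory_visits)
                                    FROM pm25_daily GROUP BY city"""):
            print(row)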

  11. Evicase: an evidence-based case structuring approach for personalized healthcare.

    PubMed

    Carmeli, Boaz; Casali, Paolo; Goldbraich, Anna; Goldsteen, Abigail; Kent, Carmel; Licitra, Lisa; Locatelli, Paolo; Restifo, Nicola; Rinott, Ruty; Sini, Elena; Torresani, Michele; Waks, Zeev

    2012-01-01

    The personalized medicine era stresses a growing need to combine evidence-based medicine with case based reasoning in order to improve the care process. To address this need we suggest a framework to generate multi-tiered statistical structures we call Evicases. Evicase integrates established medical evidence together with patient cases from the bedside. It then uses machine learning algorithms to produce statistical results and aggregators, weighted predictions, and appropriate recommendations. Designed as a stand-alone structure, Evicase can be used for a range of decision support applications including guideline adherence monitoring and personalized prognostic predictions.

  12. Real-Time Mass Spectrometry Monitoring of Oak Wood Toasting: Elucidating Aroma Development Relevant to Oak-aged Wine Quality

    NASA Astrophysics Data System (ADS)

    Farrell, Ross R.; Wellinger, Marco; Gloess, Alexia N.; Nichols, David S.; Breadmore, Michael C.; Shellie, Robert A.; Yeretzian, Chahan

    2015-11-01

    We introduce a real-time method to monitor the evolution of oak aromas during the oak toasting process. French and American oak wood boards were toasted in an oven at three different temperatures, while the process-gas was continuously transferred to the inlet of a proton-transfer-reaction time-of-flight mass spectrometer for online monitoring. Oak wood aroma compounds important for their sensory contribution to oak-aged wine were tentatively identified based on soft ionization and molecular mass. The time-intensity profiles revealed toasting process dynamics illustrating in real-time how different compounds evolve from the oak wood during toasting. Sufficient sensitivity was achieved to observe spikes in volatile concentrations related to cracking phenomena on the oak wood surface. The polysaccharide-derived compounds exhibited similar profiles; whilst for lignin-derived compounds eugenol formation differed from that of vanillin and guaiacol at lower toasting temperatures. Significant generation of oak lactone from precursors was evident at 225 °C. Statistical processing of the real-time aroma data showed similarities and differences between individual oak boards and oak wood sourced from the different origins. This study enriches our understanding of the oak toasting process and demonstrates a new analytical approach for research on wood volatiles.

  13. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently substantially increasing. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect e.g. known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event and (2) analyzing the limits of the individual approaches with respect to the magnitude, spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected as well as anomalies which are not detectable with univariate methods.
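
    As an illustration of one representative SPC-style detector for multivariate streams (not necessarily the algorithms benchmarked in the study): a Hotelling T2 statistic computed against an in-control baseline, with an upper control limit from the chi-square approximation. The three "variables" and the injected anomaly are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        baseline = rng.normal(size=(500, 3))              # in-control multivariate observations
        mean = baseline.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

        stream = rng.normal(size=(100, 3))
        stream[60:65] += 3.0                              # short multivariate anomaly

        ucl = stats.chi2.ppf(0.999, df=3)                 # control limit
        for t, x in enumerate(stream):
            d = x - mean
            t2 = d @ cov_inv @ d
            if t2 > ucl:
                print(f"t={t}: out of control (T2 = {t2:.1f} > UCL = {ucl:.1f})")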

  14. An Assessment of Statistical Process Control-Based Approaches for Charting Student Evaluation Scores

    ERIC Educational Resources Information Center

    Ding, Xin; Wardell, Don; Verma, Rohit

    2006-01-01

    We compare three control charts for monitoring data from student evaluations of teaching (SET) with the goal of improving student satisfaction with teaching performance. The two charts that we propose are a modified "p" chart and a z-score chart. We show that these charts overcome some of the shortcomings of the more traditional charts…

  15. Progress since the World Summit for Children: A Statistical Review.

    ERIC Educational Resources Information Center

    United Nations Children's Fund, New York, NY.

    One of the strengths of the 1990 World Summit for Children was its emphasis on goals to drive development and shape actions, and on the need to monitor progress, thereby transforming the way the world collected and processed data on children and women and creating a vital base and baseline for progress. In 2000, an exhaustive end-decade review of…

  16. Statistical process control based chart for information systems security

    NASA Astrophysics Data System (ADS)

    Khan, Mansoor S.; Cui, Lirong

    2015-07-01

    Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We put forward the concept of statistical process control (SPC) for monitoring intrusions into computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA)-type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We compare the proposed scheme with existing EWMA schemes and the p chart, and finally provide some recommendations for future work.
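
    A minimal EWMA monitoring sketch, not the authors' scheme: synthetic intrusion-rate data with a sustained shift, smoothed as z_i = lambda*x_i + (1-lambda)*z_{i-1} and compared with time-varying control limits around the in-control mean. All parameter values are assumptions.

        import numpy as np

        rng = np.random.default_rng(5)
        mu0, sd, lam, L = 50.0, 5.0, 0.2, 3.0   # in-control mean/sd of connections per minute
        x = rng.normal(mu0, sd, 60)
        x[40:] += 8.0                            # sustained shift (e.g. scanning activity)

        z = mu0
        for i, xi in enumerate(x, start=1):
            z = lam * xi + (1 - lam) * z
            width = L * sd * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
            if abs(z - mu0) > width:
                print(f"sample {i}: EWMA signal (z = {z:.1f}, limit = +/-{width:.1f})")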

  17. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    NASA Technical Reports Server (NTRS)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC) including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-91 3NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor degreasing process.
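
    A sketch of how new control limits might be derived from periodic witness-panel data using an individuals/moving-range chart; the bond-strength values are invented, since the report does not give raw numbers here.

        import numpy as np

        tensile = np.array([912, 905, 921, 899, 917, 908])   # hypothetical bond strengths (psi)
        moving_range = np.abs(np.diff(tensile))
        mr_bar, x_bar = moving_range.mean(), tensile.mean()

        d2 = 1.128                                           # control-chart constant for n = 2
        ucl = x_bar + 3 * mr_bar / d2
        lcl = x_bar - 3 * mr_bar / d2
        print(f"centre = {x_bar:.1f}, UCL = {ucl:.1f}, LCL = {lcl:.1f}")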

  18. Evaluation of extreme temperature events in northern Spain based on process control charts

    NASA Astrophysics Data System (ADS)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
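
    A simplified attribute-chart sketch related to the approach above, ignoring the autocorrelation handled by the paper's binomial Markov extended process: a p chart for the annual fraction of extreme-temperature days, applied to synthetic yearly counts.

        import numpy as np

        rng = np.random.default_rng(6)
        n_days = 365
        extreme_days = rng.binomial(n_days, 0.03, size=30)    # extreme days per year (synthetic)
        p = extreme_days / n_days

        p_bar = p.mean()
        sigma_p = np.sqrt(p_bar * (1 - p_bar) / n_days)
        ucl, lcl = p_bar + 3 * sigma_p, max(p_bar - 3 * sigma_p, 0.0)
        for year, frac in enumerate(p, start=1):
            if frac > ucl or frac < lcl:
                print(f"year {year}: fraction {frac:.3f} outside [{lcl:.3f}, {ucl:.3f}]")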

  19. The 'fine structure' of nutrient dynamics in rivers: ten years of study using high-frequency monitoring

    NASA Astrophysics Data System (ADS)

    Jordan, Phil; Melland, Alice; Shore, Mairead; Mellander, Per-Erik; Shortle, Ger; Ryan, David; Crockford, Lucy; Macintosh, Katrina; Campbell, Julie; Arnscheidt, Joerg; Cassidy, Rachel

    2014-05-01

    A complete appraisal of material fluxes in flowing waters is really only possible with high time resolution data synchronous with measurements of discharge. Defined by Kirchner et al. (2004; Hydrological Processes, 18/7) as the high-frequency wave of the future and with regard to disentangling signal noise from process pattern, this challenge has been met in terms of nutrient flux monitoring by automated bankside analysis. In Ireland over a ten-year period, time-series nutrient data collected on a sub-hourly basis in rivers have been used to distinguish fluxes from different catchment sources and pathways and to provide more certain temporal pictures of flux for the comparative definition of catchment nutrient dynamics. In catchments where nutrient fluxes are particularly high and exhibit a mix of extreme diffuse and point source influences, high time resolution data analysis indicates that there are no satisfactory statistical proxies for seasonal or annual flux predictions that use coarse datasets, or at least exposes the limits of statistical approaches to catchment scale and hydrological response. This has profound implications for catchment monitoring programmes that rely on modelled relationships. However, using high resolution monitoring for long term assessments of catchment mitigation measures comes with further challenges. Sustaining continuous wet chemistry analysis at river stations is resource intensive in terms of capital, maintenance and quality assurance. Furthermore, big data capture requires investment in data management systems and analysis. These two institutional challenges are magnified when considering the extended time period required to identify the influences of land-based nutrient control measures on water based systems. Separating the 'climate signal' from the 'source signal' in river nutrient flux data is a major analysis challenge; more so when tackled with anything but higher resolution data. Nevertheless, there is scope to lower costs in bankside analysis through technology development, and the scientific advantages of these data are clear and exciting. When integrating its use with policy appraisal, it must be made clear that the advances in river process understanding from high resolution monitoring data capture come as a package with the ability to make more informed decisions through an investment in better information.

  20. A Statistical Representation of Pyrotechnic Igniter Output

    NASA Astrophysics Data System (ADS)

    Guo, Shuyue; Cooper, Marcia

    2017-06-01

    The output of simplified pyrotechnic igniters for research investigations is statistically characterized by monitoring the post-ignition external flow field with Schlieren imaging. Unique to this work is a detailed quantification of all measurable manufacturing parameters (e.g., bridgewire length, charge cavity dimensions, powder bed density) and associated shock-motion variability in the tested igniters. To demonstrate experimental precision of the recorded Schlieren images and developed image processing methodologies, commercial exploding bridgewires using wires of different parameters were tested. Finally, a statistically-significant population of manufactured igniters were tested within the Schlieren arrangement resulting in a characterization of the nominal output. Comparisons between the variances measured throughout the manufacturing processes and the calculated output variance provide insight into the critical device phenomena that dominate performance. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's NNSA under contract DE-AC04-94AL85000.

  1. Development and Validation of Pathogen Environmental Monitoring Programs for Small Cheese Processing Facilities.

    PubMed

    Beno, Sarah M; Stasiewicz, Matthew J; Andrus, Alexis D; Ralyea, Robert D; Kent, David J; Martin, Nicole H; Wiedmann, Martin; Boor, Kathryn J

    2016-12-01

    Pathogen environmental monitoring programs (EMPs) are essential for food processing facilities of all sizes that produce ready-to-eat food products exposed to the processing environment. We developed, implemented, and evaluated EMPs targeting Listeria spp. and Salmonella in nine small cheese processing facilities, including seven farmstead facilities. Individual EMPs with monthly sample collection protocols were designed specifically for each facility. Salmonella was detected in only one facility, with likely introduction from the adjacent farm indicated by pulsed-field gel electrophoresis data. Listeria spp. were isolated from all nine facilities during routine sampling. The overall Listeria spp. (other than Listeria monocytogenes ) and L. monocytogenes prevalences in the 4,430 environmental samples collected were 6.03 and 1.35%, respectively. Molecular characterization and subtyping data suggested persistence of a given Listeria spp. strain in seven facilities and persistence of L. monocytogenes in four facilities. To assess routine sampling plans, validation sampling for Listeria spp. was performed in seven facilities after at least 6 months of routine sampling. This validation sampling was performed by independent individuals and included collection of 50 to 150 samples per facility, based on statistical sample size calculations. Two of the facilities had a significantly higher frequency of detection of Listeria spp. during the validation sampling than during routine sampling, whereas two other facilities had significantly lower frequencies of detection. This study provides a model for a science- and statistics-based approach to developing and validating pathogen EMPs.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Q; Read, P

    Purpose: Multiple error pathways can lead to delivery errors during the treatment course that cannot be caught with pre-treatment QA. While in vivo solutions are being developed for linacs, no such solution exists for tomotherapy. The purpose of this study is to develop a near real-time system for tomotherapy that can monitor the delivery and dose accumulation process during treatment delivery, which enables the user to assess the impact of delivery variations and/or errors and to interrupt the treatment if necessary. Methods: A program running on a tomotherapy planning station fetches the raw DAS data during treatment. Exit detector data is extracted as well as output, gantry angle, and other machine parameters. For each sample, the MLC open-close state is determined. The delivered plan is compared with the original plan via a Monte Carlo dose engine which transports fluence deviations from a pre-treatment Monte Carlo run. A report containing the difference in fluence, dose and DVH statistics is created in HTML format. This process is repeated until the treatment is completed. Results: Since we only need to compute the dose for the difference in fluence for a few projections each time, dose with 2% statistical uncertainty can be computed in less than 1 second on a 4-core CPU. However, the current bottleneck in this near real-time system is the repeated fetching and processing of the growing DAS data file throughout the delivery. The frame rate drops from 10 Hz at the beginning of treatment to 5 Hz after 3 minutes and to 2 Hz after 10 minutes. Conclusion: A during-treatment delivery monitoring system has been built to monitor tomotherapy treatments. The system improves patient safety by allowing operators to assess delivery variations and errors during treatment delivery and adopt appropriate actions.

  3. Sources and characteristics of acoustic emissions from mechanically stressed geologic granular media — A review

    NASA Astrophysics Data System (ADS)

    Michlmayr, Gernot; Cohen, Denis; Or, Dani

    2012-05-01

    The formation of cracks and emergence of shearing planes and other modes of rapid macroscopic failure in geologic granular media involve numerous grain scale mechanical interactions often generating high frequency (kHz) elastic waves, referred to as acoustic emissions (AE). These acoustic signals have been used primarily for monitoring and characterizing fatigue and progressive failure in engineered systems, with only a few applications concerning geologic granular media reported in the literature. Similar to the monitoring of seismic events preceding an earthquake, AE may offer a means for non-invasive, in-situ, assessment of mechanical precursors associated with imminent landslides or other types of rapid mass movements (debris flows, rock falls, snow avalanches, glacier stick-slip events). Despite diverse applications and potential usefulness, a systematic description of the AE method and its relevance to mechanical processes in Earth sciences is lacking. This review is aimed at providing a sound foundation for linking observed AE with various micro-mechanical failure events in geologic granular materials, not only for monitoring of triggering events preceding mass mobilization, but also as a non-invasive tool in its own right for probing the rich spectrum of mechanical processes at scales ranging from a single grain to a hillslope. We review first studies reporting use of AE for monitoring of failure in various geologic materials, and describe AE generating source mechanisms in mechanically stressed geologic media (e.g., frictional sliding, micro-crackling, particle collisions, rupture of water bridges, etc.) including AE statistical features, such as frequency content and occurrence probabilities. We summarize available AE sensors and measurement principles. The high sampling rates of advanced AE systems enable detection of numerous discrete failure events within a volume and thus provide access to statistical descriptions of progressive collapse of systems with many interacting mechanical elements such as the fiber bundle model (FBM). We highlight intrinsic links between AE characteristics and established statistical models often used in structural engineering and material sciences, and outline potential applications for failure prediction and early-warning using the AE method in combination with the FBM. The biggest challenge to application of the AE method for field applications is strong signal attenuation. We provide an outlook for overcoming such limitations considering emergence of a class of fiber-optic based distributed AE sensors and deployment of acoustic waveguides as part of monitoring networks.

  4. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    PubMed

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T2 statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing the process disturbances. Although OPLS advantages have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability to detect abnormal situations by OPLS and PLS-based BSPC methods. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, it was found that the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-phase, systematic approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in crankshaft manufacture. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define phase. The next phase constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement phases, where quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the CTQ characteristic concerned. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002. The process potential capability index (CP) improved from 1.29 to 2.02 and the process performance capability index (CPK) improved from 0.32 to 1.45.

  6. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

    Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low, however, human material always carries the risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine what type of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator and environmental controls. Evaluation of the predictability of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100 % sensitivity by doubling tissue controls, while maintaining high specificity (77 %).

  7. Detecting temporal change in freshwater fisheries surveys: statistical power and the important linkages between management questions and monitoring objectives

    USGS Publications Warehouse

    Wagner, Tyler; Irwin, Brian J.; Bence, James R.; Hayes, Daniel B.

    2016-01-01

    Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
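
    A simulation sketch of the statistical-power question raised above (all numbers are assumptions): how often a modest linear trend in an annual index is declared significant as the monitoring series lengthens.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        slope, sd, alpha, n_sim = 0.02, 0.25, 0.05, 2000    # trend per year, index SD

        for n_years in (5, 10, 20):
            years = np.arange(n_years)
            hits = 0
            for _ in range(n_sim):
                y = slope * years + rng.normal(0, sd, n_years)
                hits += stats.linregress(years, y).pvalue < alpha
            print(f"{n_years} years of monitoring: power ~ {hits / n_sim:.2f}")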

  8. Data-driven risk identification in phase III clinical trials using central statistical monitoring.

    PubMed

    Timmermans, Catherine; Venet, David; Burzykowski, Tomasz

    2016-02-01

    Our interest lies in quality control for clinical trials, in the context of risk-based monitoring (RBM). We specifically study the use of central statistical monitoring (CSM) to support RBM. Under an RBM paradigm, we claim that CSM has a key role to play in identifying the "risks to the most critical data elements and processes" that will drive targeted oversight. In order to support this claim, we first see how to characterize the risks that may affect clinical trials. We then discuss how CSM can be understood as a tool for providing a set of data-driven key risk indicators (KRIs), which help to organize adaptive targeted monitoring. Several case studies are provided where issues in a clinical trial have been identified thanks to targeted investigation after the identification of a risk using CSM. Using CSM to build data-driven KRIs helps to identify different kinds of issues in clinical trials. This ability is directly linked with the exhaustiveness of the CSM approach and its flexibility in the definition of the risks that are searched for when identifying the KRIs. In practice, a CSM assessment of the clinical database seems essential to ensure data quality. The atypical data patterns found in some centers and variables are seen as KRIs under a RBM approach. Targeted monitoring or data management queries can be used to confirm whether the KRIs point to an actual issue or not.

  9. Monitoring of chicken meat freshness by means of a colorimetric sensor array.

    PubMed

    Salinas, Yolanda; Ros-Lis, José V; Vivancos, José-L; Martínez-Máñez, Ramón; Marcos, M Dolores; Aucejo, Susana; Herranz, Nuria; Lorente, Inmaculada

    2012-08-21

    A new optoelectronic nose to monitor chicken meat ageing has been developed. It is based on 16 pigments prepared by the incorporation of different dyes (pH indicators, Lewis acids, hydrogen-bonding derivatives, selective probes and natural dyes) into inorganic materials (UVM-7, silica and alumina). The colour changes of the sensor array were characteristic of chicken ageing in a modified packaging atmosphere (30% CO2-70% N2). The chromogenic array data were processed with qualitative (PCA) and quantitative (PLS) tools. The PCA statistical analysis showed a high degree of dispersion, with nine dimensions required to explain 95% of variance. Despite this high dimensionality, a tridimensional representation of the three principal components was able to differentiate ageing with 2-day intervals. Moreover, PLS analysis allowed the creation of a model correlating the chromogenic data with chicken meat ageing; the prediction model yields values of 0.9937, 0.0389 and 0.994 for the slope, the intercept and the regression coefficient, respectively, reflecting the close agreement between the predicted and measured values observed. The results suggest the feasibility of this system to help develop optoelectronic noses that monitor food freshness.

  10. Quantification of model uncertainty in aerosol optical thickness retrieval from Ozone Monitoring Instrument (OMI) measurements

    NASA Astrophysics Data System (ADS)

    Määttä, A.; Laine, M.; Tamminen, J.; Veefkind, J. P.

    2013-09-01

    We study uncertainty quantification in remote sensing of aerosols in the atmosphere with top of the atmosphere reflectance measurements from the nadir-viewing Ozone Monitoring Instrument (OMI). Focus is on the uncertainty in aerosol model selection of pre-calculated aerosol models and on the statistical modelling of the model inadequacies. The aim is to apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness (AOT) retrieval by propagating model selection and model error related uncertainties more realistically. We utilise Bayesian model selection and model averaging methods for the model selection problem and use Gaussian processes to model the smooth systematic discrepancies from the modelled to observed reflectance. The systematic model error is learned from an ensemble of operational retrievals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud free, over land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques. The method is demonstrated with four examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Sahara desert dust. The presented statistical methodology is general; it is not restricted to this particular satellite retrieval application.
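
    A toy sketch of the model-discrepancy idea, far simpler than the operational OMAERO setup: a Gaussian process is fitted to the smooth residuals between "modelled" and "observed" reflectance as a function of wavelength. The data and kernel settings are synthetic assumptions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(8)
        wavelength = np.linspace(340, 500, 40)[:, None]            # nm
        residual = 0.01 * np.sin(wavelength.ravel() / 30.0) + rng.normal(0, 0.002, 40)

        kernel = 0.01 ** 2 * RBF(length_scale=50.0) + WhiteKernel(noise_level=1e-5)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(wavelength, residual)

        mean, std = gp.predict(wavelength, return_std=True)        # learned systematic discrepancy
        print(mean[:5], std[:5])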

  11. Statistical estimators for monitoring spotted owls in Oregon and Washington in 1987.

    Treesearch

    Timothy A. Max; Ray A. Souter; Kathleen A. O'Halloran

    1990-01-01

    Spotted owls (Strix occidentalis) were monitored on 11 National Forests in the Pacific Northwest Region of the USDA Forest Service between March and August of 1987. The basic intent of monitoring was to provide estimates of occupancy and reproduction rates for pairs of spotted owls. This paper documents the technical details of the statistical...

  12. Study on visual detection method for wind turbine blade failure

    NASA Astrophysics Data System (ADS)

    Chen, Jianping; Shen, Zhenteng

    2018-02-01

    At present, non-destructive testing methods for wind turbine blades include fibre Bragg grating, acoustic emission and vibration detection, but each has shortcomings and is difficult to apply in engineering practice. In this regard, a three-point slope deviation method, a visual inspection approach based on image processing technology, is proposed for monitoring the running status of wind turbine blades. A better blade image can be obtained through calibration, image stitching, preprocessing and a threshold segmentation algorithm. An early-warning system was designed to monitor blade running condition, and the recognition rate, stability and influencing factors of the method were statistically analysed. The experimental results showed that the method is highly accurate and provides a good monitoring effect.

  13. The utility of Bayesian predictive probabilities for interim monitoring of clinical trials

    PubMed Central

    Connor, Jason T.; Ayers, Gregory D; Alvarez, JoAnn

    2014-01-01

    Background Bayesian predictive probabilities can be used for interim monitoring of clinical trials to estimate the probability of observing a statistically significant treatment effect if the trial were to continue to its predefined maximum sample size. Purpose We explore settings in which Bayesian predictive probabilities are advantageous for interim monitoring compared to Bayesian posterior probabilities, p-values, conditional power, or group sequential methods. Results For interim analyses that address prediction hypotheses, such as futility monitoring and efficacy monitoring with lagged outcomes, only predictive probabilities properly account for the amount of data remaining to be observed in a clinical trial and have the flexibility to incorporate additional information via auxiliary variables. Limitations Computational burdens limit the feasibility of predictive probabilities in many clinical trial settings. The specification of prior distributions brings additional challenges for regulatory approval. Conclusions The use of Bayesian predictive probabilities enables the choice of logical interim stopping rules that closely align with the clinical decision making process. PMID:24872363
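
    A hedged toy example of a Bayesian predictive probability at an interim look, with an invented success rule: given s responses in n patients so far and a Beta(1,1) prior, the probability of finishing the trial with at least k_total responses among N patients is a beta-binomial tail probability for the remaining patients.

        from scipy.stats import betabinom

        n, s = 40, 14            # interim data: 14 responses in 40 patients
        N, k_total = 80, 32      # hypothetical rule: trial succeeds with >= 32 responses out of 80
        a, b = 1 + s, 1 + (n - s)          # posterior Beta parameters
        remaining, needed = N - n, k_total - s

        pred_prob = betabinom.sf(needed - 1, remaining, a, b)   # P(future responses >= needed)
        print(f"predictive probability of success = {pred_prob:.3f}")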

  14. Exploring infrared sensoring for real time welding defects monitoring in GTAW.

    PubMed

    Alfaro, Sadek C A; Franco, Fernand Díaz

    2010-01-01

    This paper presents an evaluation of an infrared sensor for monitoring the welding pool temperature in a Gas Tungsten Arc Welding (GTAW) process. The purpose of the study is to develop a real-time control system. It is known that the arc welding pool temperature is related to the weld penetration depth; therefore, by monitoring the temperature, the arc pool temperature and penetration depth are also monitored. Various experiments were performed; in some of them the current was varied and the temperature changes were registered, in others, defects were induced throughout the path of the weld bead for a fixed current. These simulated defects resulted in abrupt changes in the average temperature values, thus providing an indication of the presence of a defect. The data has been registered with an acquisition card. To identify defects in the samples under infrared emissions, the time series were analyzed using graphical and statistical methods. The selection of this technique demonstrates the potential for infrared emission as a welding monitoring parameter sensor.

  16. Estimating TCP Packet Loss Ratio from Sampled ACK Packets

    NASA Astrophysics Data System (ADS)

    Yamasaki, Yasuhiro; Shimonishi, Hideyuki; Murase, Tutomu

    The advent of various quality-sensitive applications has greatly changed the requirements for IP network management and made the monitoring of individual traffic flows more important. Since the processing costs of per-flow quality monitoring are high, especially in high-speed backbone links, packet sampling techniques have been attracting considerable attention. Existing sampling techniques, such as those used in Sampled NetFlow and sFlow, however, focus on the monitoring of traffic volume, and there has been little discussion of the monitoring of such quality indexes as packet loss ratio. In this paper we propose a method for estimating, from sampled packets, packet loss ratios in individual TCP sessions. It detects packet loss events by monitoring duplicate ACK events raised by each TCP receiver. Because sampling reveals only a portion of the actual packet loss, the actual packet loss ratio is estimated statistically. Simulation results show that the proposed method can estimate the TCP packet loss ratio accurately from a 10% sampling of packets.
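
    The following sketch illustrates the general flavour of scaling sampled observations back to full-traffic estimates; it is a simplified stand-in rather than the estimator proposed in the paper, and the helper name, burst size, and traffic figures are hypothetical.

```python
def estimate_loss_ratio(sampled_dupack_events, sampled_data_packets,
                        sampling_rate, dupacks_per_loss=3):
    """Illustrative scaling of sampled observations back to full-traffic
    estimates (hypothetical helper, not the paper's estimator).

    A loss event generates a burst of duplicate ACKs; the chance that a
    sampler catches at least one of them is 1 - (1 - r)**k for sampling
    rate r and burst size k."""
    detect_prob = 1.0 - (1.0 - sampling_rate) ** dupacks_per_loss
    est_loss_events = sampled_dupack_events / detect_prob
    est_total_packets = sampled_data_packets / sampling_rate
    return est_loss_events / est_total_packets

# e.g. 12 duplicate-ACK events and 9,800 data packets seen at 10% sampling
print(estimate_loss_ratio(12, 9800, 0.10))
```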

  17. Wireless Monitoring of the Height of Condensed Water in Steam Pipes

    NASA Technical Reports Server (NTRS)

    Lee, Hyeong Jae; Bar-Cohen, Yoseph; Lih, Shyh-Shiuh; Badescu, Mircea; Dingizian, Arsham; Takano, Nobuyuki; Blosiu, Julian O.

    2014-01-01

    A wireless health monitoring system has been developed for determining the height of water condensation in steam pipes, with data acquisition performed remotely over a wireless network. The system is designed to operate in the harsh environment encountered in manholes and at pipe temperatures of over 200 °C. The test method is ultrasonic pulse-echo, and the hardware includes a pulser, a receiver and a wireless modem for communication. Data acquisition and signal processing software were developed to determine the water height using adaptive signal processing and data communication that can be controlled while the hardware is installed in a manhole. A statistical decision-making tool is being developed based on the field test data to determine the height of the condensed water under high noise conditions and other environmental factors.
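
    A minimal sketch of the pulse-echo relationship such a system relies on: the water-column height follows from the echo round-trip time and the speed of sound in water. The function name, default sound speed, and example numbers are illustrative assumptions, not values from the deployed system.

```python
def water_height_from_echo(time_of_flight_s, sound_speed_m_s=1480.0):
    """Convert a pulse-echo round-trip time into a water column height.
    The round trip covers the path twice, so divide by two. The sound
    speed in hot condensate differs from the 1480 m/s of room-temperature
    water, so in practice it would be calibrated for temperature."""
    return sound_speed_m_s * time_of_flight_s / 2.0

# A 40 microsecond round trip corresponds to roughly 3 cm of water
print(water_height_from_echo(40e-6))
```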

  18. Processing ultrasound backscatter to monitor high-intensity focused ultrasound (HIFU) therapy

    NASA Astrophysics Data System (ADS)

    Kaczkowski, Peter J.; Anand, Ajay; Bailey, Michael R.

    2005-09-01

    The development of new noninvasive surgical methods such as HIFU for the treatment of cancer and internal bleeding requires simultaneous development of new sensing approaches to guide, monitor, and assess the therapy. Ultrasound imaging using echo amplitude has long been used to map tissue morphology for diagnostic interpretation by the clinician. New quantitative ultrasonic methods that rely on amplitude and phase processing for tissue characterization are being developed for monitoring of ablative therapy. We have been developing the use of full wave ultrasound backscattering for real-time temperature estimation, and to image changes in tissue backscatter spectrum as therapy progresses. Both approaches rely on differential processing of the backscatter signal in time, and precise measurement of phase differences. Noise and artifacts from motion and nonstationary speckle statistics are addressed by constraining inversions for tissue parameters with physical models. We present results of HIFU experiments with static point and scanned HIFU exposures in which temperature rise can be accurately mapped using a new heat transfer equation (HTE) model-constrained inverse approach. We also present results of a recently developed spectral imaging method that elucidates microbubble-mediated nonlinearity not visible as a change in backscatter amplitude. [Work supported by Army MRMC.]

  19. Monitoring Poisson observations using combined applications of Shewhart and EWMA charts

    NASA Astrophysics Data System (ADS)

    Abujiya, Mu'azu Ramat

    2017-11-01

    The Shewhart and exponentially weighted moving average (EWMA) charts for nonconformities are the most widely used procedures for monitoring Poisson observations in modern industries. Individually, the Shewhart and EWMA charts are sensitive only to large and small shifts, respectively. To enhance the detection abilities of the two schemes in monitoring all kinds of shifts in Poisson count data, this study examines the performance of combined applications of the Shewhart and EWMA Poisson control charts. Furthermore, the study proposes modifications based on a well-structured statistical data collection technique, ranked set sampling (RSS), to detect shifts in the mean of a Poisson process more quickly. The relative performance of the proposed Shewhart-EWMA Poisson location charts is evaluated in terms of the average run length (ARL), standard deviation of the run length (SDRL), median run length (MRL), average ratio ARL (ARARL), average extra quadratic loss (AEQL) and performance comparison index (PCI). The new Poisson control charts based on the RSS method are generally superior to most of the existing schemes for monitoring Poisson processes. The use of these combined Shewhart-EWMA Poisson charts is illustrated with an example to demonstrate the practical implementation of the design procedure.
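
    A minimal sketch of a combined Shewhart-EWMA scheme for Poisson counts is shown below (simple random sampling, not the RSS variant studied in the paper); the smoothing constant and control-limit multipliers are illustrative choices rather than the authors' optimized values.

```python
import numpy as np

def shewhart_ewma_poisson(counts, mu0, lam=0.2, L_ewma=2.7, L_shew=3.0):
    """Combined scheme: a Shewhart c-chart catches large shifts while an
    EWMA of the same counts catches small ones. Returns signal times."""
    z = mu0
    signals = []
    for t, c in enumerate(counts, start=1):
        # Shewhart c-chart limit based on the in-control Poisson mean mu0
        shew_limit = mu0 + L_shew * np.sqrt(mu0)
        # EWMA update and its time-varying standard deviation
        z = lam * c + (1.0 - lam) * z
        var_z = mu0 * lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t))
        ewma_limit = mu0 + L_ewma * np.sqrt(var_z)
        if c > shew_limit or z > ewma_limit:
            signals.append(t)
    return signals

rng = np.random.default_rng(1)
data = np.concatenate([rng.poisson(4, 30), rng.poisson(6, 10)])  # shift at t=31
print(shewhart_ewma_poisson(data, mu0=4.0))
```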

  20. Damage detection in bridges through fiber optic structural health monitoring

    NASA Astrophysics Data System (ADS)

    Doornink, J. D.; Phares, B. M.; Wipf, T. J.; Wood, D. L.

    2006-10-01

    A fiber optic structural health monitoring (SHM) system was developed and deployed by the Iowa State University (ISU) Bridge Engineering Center (BEC) to detect gradual or sudden damage in fracture-critical bridges (FCBs). The SHM system is trained with measured performance data, which are collected by fiber optic strain sensors to identify typical bridge behavior when subjected to ambient traffic loads. Structural responses deviating from the trained behavior are considered to be signs of structural damage or degradation and are identified through analytical procedures similar to control chart analyses used in statistical process control (SPC). The demonstration FCB SHM system was installed on the US Highway 30 bridge near Ames, IA, and utilizes 40 fiber Bragg grating (FBG) sensors to continuously monitor the bridge response when subjected to ambient traffic loads. After the data are collected and processed, weekly evaluation reports are developed that summarize the continuous monitoring results. Through use of the evaluation reports, the bridge owner is able to identify and estimate the location and severity of the damage. The information presented herein includes an overview of the SHM components, results from laboratory and field validation testing on the system components, and samples of the reduced and analyzed data.

  1. Auditory Magnetoencephalographic Frequency-Tagged Responses Mirror the Ongoing Segmentation Processes Underlying Statistical Learning.

    PubMed

    Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe

    2017-03-01

    Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses rapidly developed to reach significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.

  2. Covariation of depressive mood and spontaneous physical activity in major depressive disorder: toward continuous monitoring of depressive mood.

    PubMed

    Kim, Jinhyuk; Nakamura, Toru; Kikuchi, Hiroe; Yoshiuchi, Kazuhiro; Sasaki, Tsukasa; Yamamoto, Yoshiharu

    2015-07-01

    The objective evaluation of depressive mood is considered to be useful for the diagnosis and treatment of depressive disorders. Thus, we investigated psychobehavioral correlates, particularly the statistical associations between momentary depressive mood and behavioral dynamics measured objectively, in patients with major depressive disorder (MDD) and healthy subjects. Patients with MDD (n = 14) and healthy subjects (n = 43) wore a watch-type computer device and rated their momentary symptoms using ecological momentary assessment. Spontaneous physical activity in daily life, referred to as locomotor activity, was also continuously measured by an activity monitor built into the device. A multilevel modeling approach was used to model the associations between changes in depressive mood scores and the local statistics of locomotor activity simultaneously measured. We further examined the cross validity of such associations across groups. The statistical model established indicated that worsening of the depressive mood was associated with the increased intermittency of locomotor activity, as characterized by a lower mean and higher skewness. The model was cross validated across groups, suggesting that the same psychobehavioral correlates are shared by both healthy subjects and patients, although the latter had significantly higher mean levels of depressive mood scores. Our findings suggest the presence of robust as well as common associations between momentary depressive mood and behavioral dynamics in healthy individuals and patients with depression, which may lead to the continuous monitoring of the pathogenic processes (from healthy states) and pathological states of MDD.

  3. Hydroacoustic monitoring of a salt cavity: an analysis of precursory events of the collapse

    NASA Astrophysics Data System (ADS)

    Lebert, F.; Bernardie, S.; Mainsant, G.

    2011-09-01

    One of the main features of "post mining" research relates to available methods for monitoring mine-degradation processes that could directly threaten surface infrastructures. In this respect, GISOS, a French scientific interest group, is investigating techniques for monitoring the eventual collapse of underground cavities. One of the methods under investigation was monitoring the stability of a salt cavity by recording microseismic precursor signals that may indicate the onset of rock failure. The data were recorded in a salt mine in Lorraine (France) while monitoring the controlled collapse of 2,000,000 m³ of rock surrounding a cavity at 130 m depth. The monitoring in the 30 Hz to 3 kHz frequency range highlights the occurrence of events with high energy during periods of macroscopic movement, once the layers had ruptured; they appear to be the consequence of the post-rupture rock movements related to the intense deformation of the cavity roof. Moreover, the analysis shows the presence of some interesting precursory signals before the cavity collapsed. They occurred a few hours before the failure phases, when the rocks were being weakened and damaged. They originated from the damaging and breaking process, when micro-cracks appear and then coalesce. From these results we expect that deeper signal analysis and statistical analysis of the complete event time distribution (several million files) will allow us to finalize a complete typology of the signal families and their relations to the evolution steps of the cavity over the five years of monitoring.

  4. New S control chart using skewness correction method for monitoring process dispersion of skewed distributions

    NASA Astrophysics Data System (ADS)

    Atta, Abdu; Yahaya, Sharipah; Zain, Zakiyah; Ahmed, Zalikha

    2017-11-01

    Control charts are established as one of the most powerful tools in Statistical Process Control (SPC) and are widely used in industry. Conventional control charts rely on the normality assumption, which is not always satisfied by industrial data. This paper proposes a new S control chart for monitoring process dispersion using a skewness correction method for skewed distributions, named the SC-S control chart. Its performance in terms of false alarm rate is compared with various existing control charts for monitoring process dispersion, such as the scaled weighted variance S chart (SWV-S), skewness correction R chart (SC-R), weighted variance R chart (WV-R), weighted variance S chart (WV-S), and standard S chart (STD-S). A comparison with the exact S control chart with regard to the probability of out-of-control detection is also carried out. The Weibull and gamma distributions adopted in this study are assessed along with the normal distribution. The simulation study shows that the proposed SC-S control chart provides good in-control performance (Type I error) at almost all skewness levels and sample sizes, n. In terms of the probability of detecting a shift, the proposed SC-S chart is closer to the exact S control chart than the existing charts for skewed distributions, except for the SC-R control chart. In general, the performance of the proposed SC-S control chart is better than that of all the existing control charts for monitoring process dispersion with respect to both Type I error and the probability of detecting a shift.

  5. Monitoring the performances of a real scale municipal solid waste composting and a biodrying facility using respiration activity indices.

    PubMed

    Evangelou, Alexandros; Gerassimidou, Spyridoula; Mavrakis, Nikitas; Komilis, Dimitrios

    2016-05-01

    The objective of the work was to monitor two full-scale commingled municipal solid waste (MSW) mechanical and biological pretreatment (MBT) facilities in Greece, namely a biodrying and a composting facility. Monitoring data from a 1.5-year sampling period are presented, and microbial respiration indices were used to monitor the decomposition process and the stability status of the wastes in both facilities during the process. Results showed that in the composting facility, the organic matter was reduced by 35% after 8 weeks of combined composting/curing. Material exiting the biocells had a moisture content of less than 30% (wb), indicating a moisture limitation during the active composting process. The static respiration indices indicated that some stabilization occurred during the process, but the final material could not be characterized as stable compost. In the biodrying facility, the initial and final moisture contents were 50% and less than 20% wb, respectively, and the biodrying index was equal to 4.1, indicating effective biodrying. Lower heating values at the inlet and outlet were approximately 5.5 and 10 MJ/wet kg, respectively. The organic matter was reduced by 20% during the process, specifically from a range of 63-77% dw (inlet) to a range of 61-70% dw. A significant respiration activity reduction was observed for some of the biodrying samples. A statistically significant correlation among all three respiration activity indices was recorded, with the two oxygen-related activity indices (CRI7 and SRI24) showing the highest correlation.

  6. Statistical quality control for volumetric modulated arc therapy (VMAT) delivery by using the machine's log data

    NASA Astrophysics Data System (ADS)

    Cheong, Kwang-Ho; Lee, Me-Yeon; Kang, Sei-Kwon; Yoon, Jai-Woong; Park, Soah; Hwang, Taejin; Kim, Haeyoung; Kim, Kyoung Ju; Han, Tae Jin; Bae, Hoonsik

    2015-07-01

    The aim of this study is to set up statistical quality control for monitoring volumetric modulated arc therapy (VMAT) delivery error by using the machine's log data. Eclipse and a Clinac iX linac with the RapidArc system (Varian Medical Systems, Palo Alto, USA) are used for delivery of the VMAT plans. During the delivery of the RapidArc fields, the machine determines the accuracy of the delivered monitor units (MUs) and the gantry angle positions, and the standard deviations of the MU (σMU: dosimetric error) and the gantry angle (σGA: geometric error) are displayed on the console monitor after completion of the RapidArc delivery. In the present study, the log data were first analyzed to confirm their validity and usability; then, statistical process control (SPC) was applied to monitor the σMU and the σGA in a timely manner for all RapidArc fields: a total of 195 arc fields for 99 patients. The MU and the GA were determined twice for all fields, that is, first during the patient-specific plan QA and then again during the first treatment. The σMU and σGA time series were quite stable irrespective of the treatment site; however, the σGA depended strongly on the gantry's rotation speed. The σGA of the RapidArc delivery for stereotactic body radiation therapy (SBRT) was smaller than that for typical VMAT; therefore, SPC was applied to SBRT cases and general cases separately. Moreover, the accuracy of the potentiometer of the gantry rotation is important because the σGA can change dramatically depending on its condition. By applying SPC to the σMU and σGA, we could monitor the delivery error efficiently. However, the upper and lower limits of SPC need to be determined carefully with full knowledge of the machine and the log data.
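
    One simple way to set SPC limits for a stream of per-field values such as σMU is an individuals chart based on the average moving range; the sketch below assumes hypothetical baseline values and is not the clinic's actual charting procedure.

```python
import numpy as np

def individuals_chart_limits(baseline):
    """Individuals (X) chart limits from the average moving range,
    a common SPC choice when each RapidArc field yields one value."""
    x = np.asarray(baseline, dtype=float)
    mr = np.abs(np.diff(x))                 # moving ranges of span 2
    sigma_hat = mr.mean() / 1.128           # d2 constant for n = 2
    center = x.mean()
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# Baseline sigma-MU values from earlier QA deliveries (made-up numbers)
baseline_sigma_mu = [0.041, 0.038, 0.045, 0.040, 0.043, 0.039, 0.042]
lcl, cl, ucl = individuals_chart_limits(baseline_sigma_mu)
print(f"LCL={lcl:.4f}  CL={cl:.4f}  UCL={ucl:.4f}")
# A new field's sigma-MU above the UCL would flag a delivery for review
```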

  7. Energy Monitoring and Targeting as diagnosis; Applying work analysis to adapt a statistical change detection strategy using representation aiding

    NASA Astrophysics Data System (ADS)

    Hilliard, Antony

    Energy Monitoring and Targeting (M&T) is a well-established business process that develops information about utility energy consumption in a business or institution. While M&T has persisted as a worthwhile energy conservation support activity, it has not been widely adopted. This dissertation explains M&T challenges in terms of diagnosing and controlling energy consumption, informed by a naturalistic field study of M&T work. A Cognitive Work Analysis of M&T identifies structures that diagnosis can search, information flows unsupported in canonical support tools, and opportunities to extend the most popular tool for M&T: the Cumulative Sum of Residuals (CUSUM) chart. A design application outlines how CUSUM charts were augmented with a more contemporary statistical change detection strategy, Recursive Parameter Estimates, modified to better suit the M&T task using Representation Aiding principles. The design was experimentally evaluated in a controlled M&T synthetic task and was shown to significantly improve diagnosis performance.
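
    For readers unfamiliar with M&T practice, the sketch below shows the conventional CUSUM-of-residuals calculation against a simple linear driver model; the monthly figures are invented and the baseline model is deliberately minimal.

```python
import numpy as np

def cusum_of_residuals(energy, driver):
    """Classic M&T CUSUM: regress consumption on its driver (for example
    heating degree-days), then cumulatively sum the residuals so that a
    sustained drift in performance shows up as a steady slope."""
    driver = np.asarray(driver, dtype=float)
    energy = np.asarray(energy, dtype=float)
    slope, intercept = np.polyfit(driver, energy, 1)   # baseline model
    residuals = energy - (slope * driver + intercept)
    return np.cumsum(residuals)

# Illustrative monthly data: heating degree-days and metered energy (kWh)
hdd =    [320, 280, 190,  90,  30,  10,  12,  25,  80, 180, 260, 310]
energy = [5200, 4700, 3500, 2100, 1300, 1050, 1100, 1250, 2000, 3400, 4600, 5400]
print(np.round(cusum_of_residuals(energy, hdd), 1))
```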

  8. Phase Space Dissimilarity Measures for Structural Health Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bubacz, Jacob A; Chmielewski, Hana T; Pape, Alexander E

    A novel method for structural health monitoring (SHM), known as the Phase Space Dissimilarity Measures (PSDM) approach, is proposed and developed. The patented PSDM approach has already been developed and demonstrated for a variety of equipment and biomedical applications. Here, we investigate SHM of bridges via analysis of time serial accelerometer measurements. This work has four aspects. The first is algorithm scalability, which was found to scale linearly from one processing core to four cores. Second, the same data are analyzed to determine how the use of the PSDM approach affects sensor placement. We found that a relatively low-density placement sufficiently captures the dynamics of the structure. Third, the same data are analyzed by unique combinations of accelerometer axes (vertical, longitudinal, and lateral with respect to the bridge) to determine how the choice of axes affects the analysis. The vertical axis is found to provide satisfactory SHM data. Fourth, statistical methods were investigated to validate the PSDM approach for this application, yielding statistically significant results.

  9. A statistical study on seismo-ionospheric precursors in the total electron content of global ionosphere map associated with M ≥ 6.0 earthquakes in the West Pacific region during 1998-2012

    NASA Astrophysics Data System (ADS)

    Liu, Jann-Yenq; Chen, Koichi; Tsai, Ho-Fang; Hattori, Katsumi; Le, Huijun

    2013-04-01

    This paper reports statistical results on seismo-ionospheric precursors (SIPs) of the total electron content (TEC) in the global ionosphere map (GIM) over the epicenters of earthquakes with magnitude 6.0 and greater in China, Japan, and Taiwan during 1998-2012. To detect SIPs, a quartile-based (i.e., median-based) process is performed. The earthquakes are subdivided into various regions to give a better understanding of SIP characteristics, and are examined with and without preceding magnetic storms to confirm the existence of SIPs. Results show that the SIPs mainly appear as significant TEC increases in Japan and decreases in Taiwan and China, which suggests that the latitudinal effect plays an important role. Meanwhile, for the practical application of monitoring SIPs, the GIM TEC at a fixed point is tested. Results show that multiple monitoring points and/or spatial observations are required to enhance SIP detection.

  10. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods are discussed along with their advantages and working principles. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  11. Method for laser spot welding monitoring

    NASA Astrophysics Data System (ADS)

    Manassero, Giorgio

    1994-09-01

    As more powerful solid state laser sources appear on the market, new applications become technically possible and economically important. For every process a preliminary optimization phase is necessary. The main parameters used for a welding application with a high power Nd-YAG laser are: pulse energy, pulse width, repetition rate and process duration or speed. In this paper an experimental methodology for the development of an electrooptical laser spot welding monitoring system is presented. The electromagnetic emission from the molten pool was observed and measured with appropriate sensors. The statistical method `Parameter Design' was used to obtain an accurate analysis of the process parameters that influence process results. A laser station with a solid state laser coupled to an optical fiber (1 mm in diameter) was utilized for the welding tests. The main material used for the experimental plan was zinc coated steel sheet 0.8 mm thick. This material and the related spot welding technique are extensively used in the automotive industry; therefore, the introduction of laser technology in the production line will improve the quality of the final product. A correlation between sensor signals and `through or not through' welds was assessed. The investigation has furthermore shown the necessity for modern laser production systems to use multisensor heads for process monitoring or control, with more advanced signal elaboration procedures.

  12. Statistical analysis of modal properties of a cable-stayed bridge through long-term structural health monitoring with wireless smart sensor networks

    NASA Astrophysics Data System (ADS)

    Asadollahi, Parisa; Li, Jian

    2016-04-01

    Understanding the dynamic behavior of complex structures such as long-span bridges requires dense deployment of sensors. Traditional wired sensor systems are generally expensive and time-consuming to install due to cabling. With wireless communication and on-board computation capabilities, wireless smart sensor networks have the advantages of being low cost and easy to deploy and maintain, and therefore facilitate dense instrumentation for structural health monitoring. A long-term monitoring project was recently carried out for a cable-stayed bridge in South Korea with a dense array of 113 smart sensors, which constitutes the world's largest wireless smart sensor network for civil structural monitoring. This paper presents a comprehensive statistical analysis of the modal properties, including natural frequencies, damping ratios and mode shapes, of the monitored cable-stayed bridge. The data analyzed in this paper consist of structural vibration signals monitored during a 12-month period under ambient excitation. The correlation between environmental temperature and the modal frequencies is also investigated. The results showed the long-term statistical structural behavior of the bridge, which serves as the basis for Bayesian statistical updating of the numerical model.

  13. Improving satellite-based PM2.5 estimates in China using Gaussian processes modeling in a Bayesian hierarchical setting.

    PubMed

    Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun

    2017-08-01

    Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill the areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD and spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model possesses a CV result (R² = 0.81) that reflects higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
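
    A toy Gaussian-process regression of PM2.5 on coordinates and AOD can be sketched with scikit-learn as below; note that this ignores the Bayesian hierarchical structure and the real predictors used in the study, and the synthetic data exist only to make the snippet runnable.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy stand-ins for station coordinates, AOD, and measured PM2.5
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(80, 3))              # lon, lat, AOD
y = 30 + 40 * X[:, 2] + 5 * np.sin(6 * X[:, 0]) + rng.normal(0, 2, 80)

# Anisotropic RBF kernel over space and AOD, plus a noise term
kernel = 1.0 * RBF(length_scale=[0.5, 0.5, 0.5]) + WhiteKernel(1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
pm_mean, pm_std = gpr.predict(X[:5], return_std=True)
print(np.round(pm_mean, 1), np.round(pm_std, 2))
```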

  14. Quantifying, Visualizing, and Monitoring Lead Optimization.

    PubMed

    Maynard, Andrew T; Roberts, Christopher D

    2016-05-12

    Although lead optimization (LO) is by definition a process, process-centric analysis and visualization of this important phase of pharmaceutical R&D has been lacking. Here we describe a simple statistical framework to quantify and visualize the progression of LO projects so that the vital signs of LO convergence can be monitored. We refer to the resulting visualizations generated by our methodology as the "LO telemetry" of a project. These visualizations can be automated to provide objective, holistic, and instantaneous analysis and communication of LO progression. This enhances the ability of project teams to drive the LO process more effectively, while enabling management to better coordinate and prioritize LO projects. We present the telemetry of five LO projects comprising different biological targets and different project outcomes, including clinical compound selection, termination due to preclinical safety/tox, and termination due to lack of tractability. We demonstrate that LO progression is accurately captured by the telemetry. We also present metrics to quantify LO efficiency and tractability.

  15. On the degelation of networks – Case of the radiochemical degradation of methyl methacrylate – ethylene glycol dimethacrylate copolymers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richaud, Emmanuel; Gilormini, Pierre; Verdu, Jacques

    2016-05-18

    Methyl methacrylate networks were synthesized and subjected to radiochemical degradation. Ageing was monitored by means of sol-gel analysis and glass transition temperature measurements. The networks were shown to undergo an exclusively chain-scission process leading to degelation of the network. The critical conversion degree corresponding to degelation (loss of all elastically active chains) is discussed in terms of a statistical theory.

  16. Satellite temperature monitoring and prediction system

    NASA Technical Reports Server (NTRS)

    Barnett, U. R.; Martsolf, J. D.; Crosby, F. L.

    1980-01-01

    The paper describes the Florida Satellite Freeze Forecast System (SFFS) in its current state. All data collection options have been demonstrated, and data collected over a three-year period have been stored for future analysis. Presently, specific minimum temperature forecasts are issued routinely from November through March. The procedures for issuing these forecasts are discussed. The automated data acquisition and processing system is described, and the physical and statistical models employed are examined.

  17. Improving Learning of Markov Logic Networks using Transfer and Bottom-Up Induction

    DTIC Science & Technology

    2007-05-01

    Doctoral Dissertation Proposal, Supervising Professor: Raymond J. Mooney, The University of Texas at Austin, Department of Computer Sciences, Austin, TX 78712. Only a fragment of the abstract survives extraction; it notes that the learning process is subject to maxima and plateaus, and that it is therefore an important research problem to develop learning algorithms that improve the speed and accuracy of this process.

  18. AgriSense-STARS: Advancing Methods of Agricultural Monitoring for Food Security in Smallholder Regions - the Case for Tanzania

    NASA Astrophysics Data System (ADS)

    Dempewolf, J.; Becker-Reshef, I.; Nakalembe, C. L.; Tumbo, S.; Maurice, S.; Mbilinyi, B.; Ntikha, O.; Hansen, M.; Justice, C. J.; Adusei, B.; Kongo, V.

    2015-12-01

    In-season monitoring of crop conditions provides critical information for agricultural policy and decision making and, most importantly, for food security planning and management. Nationwide agricultural monitoring in countries dominated by smallholder farming systems generally relies on extensive networks of field data collectors. In Tanzania, extension agents make up this network and report on conditions across the country, approaching a "near-census". Data are collected on paper, which is resource- and time-intensive as well as prone to errors. Data quality is ambiguous and there is a general lack of clear and functional feedback loops between farmers, extension agents, analysts and decision makers. Moreover, the data are not spatially explicit, limiting their usefulness for analysis and the quality of policy outcomes. Despite significant advances in remote sensing and information communication technologies (ICT) for monitoring agriculture, the full potential of these new tools is yet to be realized in Tanzania. Their use is constrained by the lack of resources, skills and infrastructure to access and process these data. The use of ICT technologies for data collection, processing and analysis is equally limited. The AgriSense-STARS project is developing and testing a system for national-scale in-season monitoring of smallholder agriculture using a combination of three main tools: 1) GLAM-East Africa, an automated MODIS satellite image processing system; 2) field data collection using GeoODK and unmanned aerial vehicles (UAVs); and 3) the Tanzania Crop Monitor, a collaborative online portal for data management and reporting. These tools are developed and applied in Tanzania through the National Food Security Division of the Ministry of Agriculture, Food Security and Cooperatives (MAFC) within a statistically representative sampling framework (area frame) that ensures data quality, representability and resource efficiency.

  19. An automated tool for a daily harmful algal bloom monitoring using MODIS imagery downscaled to 250 meters spatial resolution

    NASA Astrophysics Data System (ADS)

    El Alem, A.

    2016-12-01

    Harmful algal blooms (HABs) cause negative impacts to other organisms by producing natural toxins, by mechanical damage to other micro-organisms, or simply by degrading water quality. Contaminated waters could expose billions of people to serious intoxication problems. Traditionally, HAB monitoring is carried out with standard methods limited to a restricted network of sampling points. However, the rapid evolution of HABs makes it difficult to monitor their variation in time and space, threatening public safety. Daily monitoring is therefore the best way to control and mitigate their harmful effect on the population, particularly for sources feeding cities. Recently, an approach for estimating chlorophyll-a (Chl-a) concentration, as a proxy for HAB presence, in inland waters was developed based on MODIS imagery downscaled to 250 meters spatial resolution. Statistical evaluation of the developed approach highlighted the accuracy of the Chl-a estimates, with an R² = 0.98, a relative RMSE of 15%, a relative BIAS of -2%, and a relative NASH of 0.95. The temporal resolution of the MODIS sensor thus allows daily monitoring of HAB spatial distribution for inland waters of more than 2.25 km² in surface area. Groupe-Hemisphere, a company specialized in environmental and sustainable planning in Quebec, has shown great interest in the developed approach. Given the complexity of the preprocessing (geometric and atmospheric corrections as well as downscaling of the spatial resolution) and processing (Chl-a estimation) of the images, a standalone application was developed under the MATLAB GUI environment. The application automates all preprocessing and processing steps. The outputs produced by the application for end users, many of whom may be decision makers or policy makers in the public and private sectors, allow near-real-time monitoring of water quality for more efficient management.

  20. An Improved Incremental Learning Approach for KPI Prognosis of Dynamic Fuel Cell System.

    PubMed

    Yin, Shen; Xie, Xiaochen; Lam, James; Cheung, Kie Chung; Gao, Huijun

    2016-12-01

    The key performance indicator (KPI) has an important practical value with respect to the product quality and economic benefits for modern industry. To cope with the KPI prognosis issue under nonlinear conditions, this paper presents an improved incremental learning approach based on available process measurements. The proposed approach takes advantage of the algorithm overlapping of locally weighted projection regression (LWPR) and partial least squares (PLS), implementing the PLS-based prognosis in each locally linear model produced by the incremental learning process of LWPR. The global prognosis results including KPI prediction and process monitoring are obtained from the corresponding normalized weighted means of all the local models. The statistical indicators for prognosis are enhanced as well by the design of novel KPI-related and KPI-unrelated statistics with suitable control limits for non-Gaussian data. For application-oriented purpose, the process measurements from real datasets of a proton exchange membrane fuel cell system are employed to demonstrate the effectiveness of KPI prognosis. The proposed approach is finally extended to a long-term voltage prediction for potential reference of further fuel cell applications.

  1. End-user perspective of low-cost sensors for outdoor air pollution monitoring.

    PubMed

    Rai, Aakash C; Kumar, Prashant; Pilla, Francesco; Skouloudis, Andreas N; Di Sabatino, Silvana; Ratti, Carlo; Yasar, Ansar; Rickerby, David

    2017-12-31

    Low-cost sensor technology can potentially revolutionise the area of air pollution monitoring by providing high-density spatiotemporal pollution data. Such data can be utilised for supplementing traditional pollution monitoring, improving exposure estimates, and raising community awareness about air pollution. However, data quality remains a major concern that hinders the widespread adoption of low-cost sensor technology. Unreliable data may mislead unsuspecting users and potentially lead to alarming consequences such as reporting acceptable air pollutant levels when they are above the limits deemed safe for human health. This article provides scientific guidance to the end-users for effectively deploying low-cost sensors for monitoring air pollution and people's exposure, while ensuring reasonable data quality. We review the performance characteristics of several low-cost particle and gas monitoring sensors and provide recommendations to end-users for making proper sensor selection by summarizing the capabilities and limitations of such sensors. The challenges, best practices, and future outlook for effectively deploying low-cost sensors, and maintaining data quality are also discussed. For data quality assurance, a two-stage sensor calibration process is recommended, which includes laboratory calibration under controlled conditions by the manufacturer supplemented with routine calibration checks performed by the end-user under final deployment conditions. For large sensor networks where routine calibration checks are impractical, statistical techniques for data quality assurance should be utilised. Further advancements and adoption of sophisticated mathematical and statistical techniques for sensor calibration, fault detection, and data quality assurance can indeed help to realise the promised benefits of a low-cost air pollution sensor network. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Detecting seasonal and cyclical trends in agricultural runoff water quality-hypothesis tests and block bootstrap power analysis.

    PubMed

    Uddameri, Venkatesh; Singaraju, Sreeram; Hernandez, E Annette

    2018-02-21

    Seasonal and cyclic trends in nutrient concentrations at four agricultural drainage ditches were assessed using a dataset generated from a multivariate, multiscale, multiyear water quality monitoring effort in the agriculturally dominant Lower Rio Grande Valley (LRGV) River Watershed in South Texas. An innovative bootstrap sampling-based power analysis procedure was developed to evaluate the ability of Mann-Whitney and Noether tests to discern trends and to guide future monitoring efforts. The Mann-Whitney U test was able to detect significant changes between summer and winter nutrient concentrations at sites with lower depths and unimpeded flows. Pollutant dilution, non-agricultural loadings, and in-channel flow structures (weirs) masked the effects of seasonality. The detection of cyclical trends using the Noether test was highest in the presence of vegetation mainly for total phosphorus and oxidized nitrogen (nitrite + nitrate) compared to dissolved phosphorus and reduced nitrogen (total Kjeldahl nitrogen-TKN). Prospective power analysis indicated that while increased monitoring can lead to higher statistical power, the effect size (i.e., the total number of trend sequences within a time-series) had a greater influence on the Noether test. Both Mann-Whitney and Noether tests provide complementary information on seasonal and cyclic behavior of pollutant concentrations and are affected by different processes. The results from these statistical tests when evaluated in the context of flow, vegetation, and in-channel hydraulic alterations can help guide future data collection and monitoring efforts. The study highlights the need for long-term monitoring of agricultural drainage ditches to properly discern seasonal and cyclical trends.
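
    The bootstrap power idea can be sketched as follows: resample each season in contiguous blocks to respect short-range dependence, apply the Mann-Whitney U test to each resample, and report the rejection rate. This is a generic illustration with synthetic lognormal data, not the study's exact block bootstrap design.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def block_bootstrap_power(summer, winter, block=4, n_boot=500, alpha=0.05):
    """Rough power estimate for the Mann-Whitney U test: resample each
    season in contiguous blocks (preserving short-range autocorrelation)
    and count how often the test rejects at level alpha."""
    rng = np.random.default_rng(42)
    def resample(x):
        starts = rng.integers(0, len(x) - block + 1, size=len(x) // block)
        return np.concatenate([x[s:s + block] for s in starts])
    rejections = 0
    for _ in range(n_boot):
        _, p = mannwhitneyu(resample(summer), resample(winter),
                            alternative="two-sided")
        rejections += (p < alpha)
    return rejections / n_boot

rng = np.random.default_rng(0)
summer = rng.lognormal(mean=0.2, sigma=0.4, size=48)   # e.g. TP, mg/L
winter = rng.lognormal(mean=0.0, sigma=0.4, size=48)
print(block_bootstrap_power(summer, winter))
```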

  3. A practical approach to tramway track condition monitoring: vertical track defects detection and identification using time-frequency processing technique

    NASA Astrophysics Data System (ADS)

    Bocz, Péter; Vinkó, Ákos; Posgay, Zoltán

    2018-03-01

    This paper presents an automatic method for detecting vertical track irregularities in tramway operation using acceleration measurements on trams. For monitoring of tramway tracks, an unconventional measurement setup is developed, which records the data of 3-axis wireless accelerometers mounted on the wheel discs. The accelerations are processed to obtain the vertical track irregularities and to determine whether the track needs to be repaired. The automatic detection algorithm is based on time-frequency distribution analysis and determines the defect locations. Admissible limits (thresholds) are given for detecting moderate and severe defects using statistical analysis. The method was validated on busy tram lines in Budapest and accurately detected severe defects with a hit rate of 100% and no false alarms. The methodology is also sensitive to moderate and small rail surface defects at low operational speeds.

  4. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    PubMed

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation on heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering process of parent materials and subsequent pedo-genesis due to the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment process of agricultural soils in worldwide regions.
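
    Source-apportionment studies of this kind typically inspect principal-component loadings to see which metals co-vary; a minimal sketch with hypothetical soil concentrations is shown below (the values are invented and carry no relation to the Shanghai data).

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical concentrations (mg/kg) at a handful of soil sampling sites
soils = pd.DataFrame({
    "Cu": [28, 35, 31, 60, 55], "Zn": [80, 85, 78, 90, 88],
    "Ni": [25, 30, 27, 48, 45], "Pb": [20, 26, 22, 55, 50],
    "Cr": [60, 62, 58, 65, 63], "Cd": [0.1, 0.2, 0.1, 0.6, 0.5]})

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(soils))
loadings = pd.DataFrame(pca.components_.T, index=soils.columns,
                        columns=["PC1", "PC2"])
print(loadings.round(2))   # metals loading together suggest a shared source
```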

  5. Treatment of automotive industry oily wastewater by electrocoagulation: statistical optimization of the operational parameters.

    PubMed

    GilPavas, Edison; Molina-Tirado, Kevin; Gómez-García, Miguel Angel

    2009-01-01

    An electrocoagulation process was used for the treatment of oily wastewater generated by an automotive industry in Medellín (Colombia). An electrochemical cell consisting of four parallel electrodes (Fe and Al) in a bipolar configuration was implemented. A multifactorial experimental design was used to evaluate the influence of several parameters, including the type and arrangement of electrodes, pH, and current density. Oil and grease removal was defined as the response variable for the statistical analysis. Additionally, the BOD(5), COD, and TOC were monitored during the treatment process. According to the results, at the optimum parameter values (current density = 4.3 mA/cm(2), distance between electrodes = 1.5 cm, Fe as anode, and pH = 12) it was possible to reach approximately 95% oil and grease removal, with COD removal and mineralization of 87.4% and 70.6%, respectively. A final biodegradability (BOD(5)/COD) of 0.54 was reached.

  6. Using assemblage data in ecological indicators: A comparison and evaluation of commonly available statistical tools

    USGS Publications Warehouse

    Smith, Joseph M.; Mather, Martha E.

    2012-01-01

    Ecological indicators are science-based tools used to assess how human activities have impacted environmental resources. For monitoring and environmental assessment, existing species assemblage data can be used to make these comparisons through time or across sites. An impediment to using assemblage data, however, is that these data are complex and need to be simplified in an ecologically meaningful way. Because multivariate statistics are mathematical relationships, statistical groupings may not make ecological sense and will not have utility as indicators. Our goal was to define a process to select defensible and ecologically interpretable statistical simplifications of assemblage data in which researchers and managers can have confidence. For this, we chose a suite of statistical methods, compared the groupings that resulted from these analyses, identified convergence among groupings, then we interpreted the groupings using species and ecological guilds. When we tested this approach using a statewide stream fish dataset, not all statistical methods worked equally well. For our dataset, logistic regression (Log), detrended correspondence analysis (DCA), cluster analysis (CL), and non-metric multidimensional scaling (NMDS) provided consistent, simplified output. Specifically, the Log, DCA, CL-1, and NMDS-1 groupings were ≥60% similar to each other, overlapped with the fluvial-specialist ecological guild, and contained a common subset of species. Groupings based on number of species (e.g., Log, DCA, CL and NMDS) outperformed groupings based on abundance [e.g., principal components analysis (PCA) and Poisson regression]. Although the specific methods that worked on our test dataset have generality, here we are advocating a process (e.g., identifying convergent groupings with redundant species composition that are ecologically interpretable) rather than the automatic use of any single statistical tool. We summarize this process in step-by-step guidance for the future use of these commonly available ecological and statistical methods in preparing assemblage data for use in ecological indicators.

  7. An empirical, integrated forest biomass monitoring system

    NASA Astrophysics Data System (ADS)

    Kennedy, Robert E.; Ohmann, Janet; Gregory, Matt; Roberts, Heather; Yang, Zhiqiang; Bell, David M.; Kane, Van; Hughes, M. Joseph; Cohen, Warren B.; Powell, Scott; Neeti, Neeti; Larrue, Tara; Hooper, Sam; Kane, Jonathan; Miller, David L.; Perkins, James; Braaten, Justin; Seidl, Rupert

    2018-02-01

    The fate of live forest biomass is largely controlled by growth and disturbance processes, both natural and anthropogenic. Thus, biomass monitoring strategies must characterize both the biomass of the forests at a given point in time and the dynamic processes that change it. Here, we describe and test an empirical monitoring system designed to meet those needs. Our system uses a mix of field data, statistical modeling, remotely-sensed time-series imagery, and small-footprint lidar data to build and evaluate maps of forest biomass. It ascribes biomass change to specific change agents, and attempts to capture the impact of uncertainty in methodology. We find that: • A common image framework for biomass estimation and for change detection allows for consistent comparison of both state and change processes controlling biomass dynamics. • Regional estimates of total biomass agree well with those from plot data alone. • The system tracks biomass densities up to 450-500 Mg ha-1 with little bias, but begins underestimating true biomass as densities increase further. • Scale considerations are important. Estimates at the 30 m grain size are noisy, but agreement at broad scales is good. Further investigation to determine the appropriate scales is underway. • Uncertainty from methodological choices is evident, but much smaller than uncertainty based on choice of allometric equation used to estimate biomass from tree data. • In this forest-dominated study area, growth and loss processes largely balance in most years, with loss processes dominated by human removal through harvest. In years with substantial fire activity, however, overall biomass loss greatly outpaces growth. Taken together, our methods represent a unique combination of elements foundational to an operational landscape-scale forest biomass monitoring program.

  8. Developing and validating a new national remote health advice syndromic surveillance system in England.

    PubMed

    Harcourt, S E; Morbey, R A; Loveridge, P; Carrilho, L; Baynham, D; Povey, E; Fox, P; Rutter, J; Moores, P; Tiffen, J; Bellerby, S; McIntosh, P; Large, S; McMenamin, J; Reynolds, A; Ibbotson, S; Smith, G E; Elliot, A J

    2017-03-01

    Public Health England (PHE) coordinates a suite of real-time national syndromic surveillance systems monitoring general practice, emergency department and remote health advice data. We describe the development and informal evaluation of a new syndromic surveillance system using NHS 111 remote health advice data. NHS 111 syndromic indicators were monitored daily at national and local level. Statistical models were applied to daily data to identify significant exceedances; statistical baselines were developed for each syndrome and area using a multi-level hierarchical mixed effects model. Between November 2013 and October 2014, there were on average 19 095 NHS 111 calls each weekday and 43 084 each weekend day in the PHE dataset. There was a predominance of females using the service (57%); highest percentage of calls received was in the age group 1-4 years (14%). This system was used to monitor respiratory and gastrointestinal infections over the winter of 2013-14, the potential public health impact of severe flooding across parts of southern England and poor air quality episodes across England in April 2014. This new system complements and supplements the existing PHE syndromic surveillance systems and is now integrated into the routine daily processes that form this national syndromic surveillance service. © Crown copyright 2016.

  9. Health research and systems' governance are at risk: should the right to data protection override health?

    PubMed

    Di Iorio, C T; Carinci, F; Oderkirk, J

    2014-07-01

    The European Union (EU) Data Protection Regulation will have profound implications for public health, health services research and statistics in Europe. The EU Commission's Proposal was a breakthrough in balancing privacy rights and rights to health and healthcare. The European Parliament, however, has proposed extensive amendments. This paper reviews the amendments proposed by the European Parliament Committee on Civil Liberties, Justice and Home Affairs and their implications for health research and statistics. The amendments eliminate most innovations brought by the Proposal. Notably, derogation to the general prohibition of processing sensitive data shall be allowed for public interests such as the management of healthcare services, but not health research, monitoring, surveillance and governance. The processing of personal health data for historical, statistical or scientific purposes shall be allowed only with the consent of the data subject or if the processing serves an exceptionally high public interest, cannot be performed otherwise and is legally authorised. Research, be it academic, government, corporate or market research, falls under the same rule. The proposed amendments will make difficult or render impossible research and statistics involving the linkage and analysis of the wealth of data from clinical, administrative, insurance and survey sources, which have contributed to improving health outcomes and health systems performance and governance; and may illegitimise efforts that have been made in some European countries to enable privacy-respectful data use for research and statistical purposes. If the amendments stand as written, the right to privacy is likely to override the right to health and healthcare in Europe.

  10. Statistical monitoring of the hand, foot and mouth disease in China.

    PubMed

    Zhang, Jingnan; Kang, Yicheng; Yang, Yang; Qiu, Peihua

    2015-09-01

    In a period starting around 2007, the Hand, Foot, and Mouth Disease (HFMD) became widespread in China, and Chinese public health was seriously threatened. To prevent the outbreak of infectious diseases like HFMD, effective disease surveillance systems would be especially helpful to give signals of disease outbreaks as early as possible. Statistical process control (SPC) charts provide a major statistical tool in industrial quality control for detecting product defectives in a timely manner. In recent years, SPC charts have been used for disease surveillance. However, disease surveillance data often have much more complicated structures, compared to the data collected from industrial production lines. Major challenges, including lack of in-control data, complex seasonal effects, and spatio-temporal correlations, make the surveillance data difficult to handle. In this article, we propose a three-step procedure for analyzing disease surveillance data, and our procedure is demonstrated using the HFMD data collected during 2008-2009 in China. Our method uses nonparametric longitudinal data and time series analysis methods to eliminate the possible impact of seasonality and temporal correlation before the disease incidence data are sequentially monitored by an SPC chart. At both national and provincial levels, our proposed method can effectively detect the increasing trend of the disease incidence rate before the disease becomes widespread. © 2015, The International Biometric Society.
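
    A drastically simplified version of the three-step idea, remove the seasonal pattern and then chart the residuals, is sketched below with synthetic weekly counts; the paper's nonparametric longitudinal and time-series modelling is replaced here by a plain seasonal mean profile, so this is only a toy illustration.

```python
import numpy as np
import pandas as pd

def deseasonalized_ewma(counts, period=52, lam=0.3, L=3.0):
    """Remove a seasonal mean profile, then run a one-sided EWMA chart
    on the de-seasonalized residuals and return alarm time indices."""
    s = pd.Series(counts, dtype=float)
    seasonal = s.groupby(np.arange(len(s)) % period).transform("mean")
    resid = s - seasonal
    sigma = resid.std(ddof=1)
    z, alarms = 0.0, []
    for t, r in enumerate(resid):
        z = lam * r + (1 - lam) * z
        limit = L * sigma * np.sqrt(lam / (2 - lam))   # steady-state limit
        if z > limit:
            alarms.append(t)
    return alarms

rng = np.random.default_rng(3)
weeks = np.arange(156)
baseline = 200 + 80 * np.sin(2 * np.pi * weeks / 52)
cases = rng.poisson(baseline)
cases[140:150] += 120          # injected outbreak in the final year
print(deseasonalized_ewma(cases))
```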

  11. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown from our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
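
    The sigma metric itself is a one-line calculation relating allowable total error, bias, and imprecision; the sketch below uses the conventional formula with invented example values.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Conventional laboratory sigma metric: how many analytical standard
    deviations fit between the observed bias and the allowable total error."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# e.g. allowable total error 10%, bias 1.5%, imprecision (CV) 2%
print(sigma_metric(10.0, 1.5, 2.0))   # 4.25 sigma
```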

  12. Soft Sensors: Chemoinformatic Model for Efficient Control and Operation in Chemical Plants.

    PubMed

    Funatsu, Kimito

    2016-12-01

    A soft sensor is a statistical model that serves as an essential tool for controlling pharmaceutical, chemical and other industrial plants. I introduce soft sensors, their roles, applications and problems, and research examples such as adaptive soft sensors, database monitoring and efficient process control. The use of soft sensors enables chemical industrial plants to be operated more effectively and stably. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Frequency Domain Analysis of Sensor Data for Event Classification in Real-Time Robot Assisted Deburring

    PubMed Central

    Pappachan, Bobby K; Caesarendra, Wahyu; Tjahjowidodo, Tegoeh; Wijaya, Tomi

    2017-01-01

    Process monitoring using indirect methods relies on the usage of sensors. Using sensors to acquire vital process related information also presents itself with the problem of big data management and analysis. Due to uncertainty in the frequency of events occurring, a higher sampling rate is often used in real-time monitoring applications to increase the chances of capturing and understanding all possible events related to the process. Advanced signal processing methods are used to further decipher meaningful information from the acquired data. In this research work, power spectrum density (PSD) of sensor data acquired at sampling rates between 40–51.2 kHz was calculated and the corelation between PSD and completed number of cycles/passes is presented. Here, the progress in number of cycles/passes is the event this research work intends to classify and the algorithm used to compute PSD is Welch’s estimate method. A comparison between Welch’s estimate method and statistical methods is also discussed. A clear co-relation was observed using Welch’s estimate to classify the number of cycles/passes. The paper also succeeds in classifying vibration signal generated by the spindle from the vibration signal acquired during finishing process. PMID:28556809
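
    A minimal sketch of the Welch PSD step described above, assuming a 51.2 kHz sampling rate and a synthetic signal in place of the real sensor data; scipy.signal.welch computes the averaged-periodogram estimate.

        import numpy as np
        from scipy.signal import welch

        fs = 51200                                   # assumed sampling rate [Hz]
        t = np.arange(0, 1.0, 1 / fs)
        rng = np.random.default_rng(1)
        x = np.sin(2 * np.pi * 1200 * t) + 0.5 * rng.standard_normal(t.size)

        f, pxx = welch(x, fs=fs, nperseg=4096)       # Welch averaged periodogram
        band = (f >= 1000) & (f <= 1500)
        print("power in the 1-1.5 kHz band:", np.trapz(pxx[band], f[band]))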

  14. Assessment of water quality monitoring for the optimal sensor placement in lake Yahuarcocha using pattern recognition techniques and geographical information systems.

    PubMed

    Jácome, Gabriel; Valarezo, Carla; Yoo, Changkyoo

    2018-03-30

    Pollution and the eutrophication process are increasing in lake Yahuarcocha and constant water quality monitoring is essential for a better understanding of the patterns occurring in this ecosystem. In this study, key sensor locations were determined using spatial and temporal analyses combined with geographical information systems (GIS) to assess the influence of weather features, anthropogenic activities, and other non-point pollution sources. A water quality monitoring network was established to obtain data on 14 physicochemical and microbiological parameters at each of seven sample sites over a period of 13 months. A spatial and temporal statistical approach using pattern recognition techniques, such as cluster analysis (CA) and discriminant analysis (DA), was employed to classify and identify the most important water quality parameters in the lake. The original monitoring network was reduced to four optimal sensor locations based on a fuzzy overlay of the interpolations of concentration variations of the most important parameters.

  15. Using eddy currents for noninvasive in vivo pH monitoring for bone tissue engineering.

    PubMed

    Beck-Broichsitter, Benedicta E; Daschner, Frank; Christofzik, David W; Knöchel, Reinhard; Wiltfang, Jörg; Becker, Stephan T

    2015-03-01

    The metabolic processes that regulate bone healing and bone induction in tissue engineering models are not fully understood. Eddy current excitation is widely used in technical approaches and in the food industry. The aim of this study was to establish eddy current excitation for monitoring metabolic processes during heterotopic osteoinduction in vivo. Hydroxyapatite scaffolds were implanted into the musculus latissimus dorsi of six rats. Bone morphogenetic protein 2 (BMP-2) was applied 1 and 2 weeks after implantation. Weekly eddy current excitation measurements were performed. Additionally, invasive pH measurements were obtained from the scaffolds using fiber optic detection devices. Correlations between the eddy current measurements and the metabolic values were calculated. The eddy current measurements and pH values decreased significantly in the first 2 weeks of the study, followed by a steady increase and stabilization at higher levels towards the end of the study. The measurement curves and statistical evaluations indicated a significant correlation between the resonance frequency values of the eddy current excitation measurements and the observed pH levels (p = 0.0041). This innovative technique was capable of noninvasively monitoring metabolic processes in living tissues according to pH values, showing a direct correlation between eddy current excitation and pH in an in vivo tissue engineering model.

  16. The effect of process parameters on audible acoustic emissions from high-shear granulation.

    PubMed

    Hansuld, Erin M; Briens, Lauren; Sayani, Amyn; McCann, Joe A B

    2013-02-01

    Product quality in high-shear granulation is easily compromised by minor changes in raw material properties or process conditions. It is desirable to develop a process analytical technology (PAT) that can monitor the process in real time and provide feedback for quality control. In this work, the application of audible acoustic emissions (AAEs) as a PAT tool was investigated. A condenser microphone was placed at the top of the air exhaust on a PMA-10 high-shear granulator to collect AAEs for a design of experiments (DOE) varying impeller speed, total binder volume and spray rate. The results showed that the 10 Hz total power spectral densities (TPSDs) between 20 and 250 Hz were significantly affected by the changes in process conditions. Impeller speed and spray rate were shown to have statistically significant effects on granulation wetting, and impeller speed and total binder volume were significant in terms of process end-point. The DOE results were confirmed by a multivariate PLS model of the TPSDs. The scores plot showed separation based on impeller speed in the first component and spray rate in the second component. The findings support the use of AAEs to monitor changes in process conditions in real time and achieve consistent product quality.

  17. Geo-Hydro Statistical Characterization of Preferential Flow and Transport Processes in Karst Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Anaya, A. A.; Padilla, I. Y.; Macchiavelli, R. E.

    2011-12-01

    Karst groundwater systems are highly productive and provide an important fresh water resource for human development and ecological integrity. Their high productivity is often associated with conduit flow and high matrix permeability. The same characteristics that make these aquifers productive also make them highly vulnerable to contamination and a likely pathway for contaminant exposure. Of particular interest are chlorinated organic contaminants and phthalates derived from industrial solvents and plastic by-products. These chemicals have been identified as potential precursors of pre-term birth, a leading cause of neonatal complications with a significant health and societal cost. The general objectives of this work are to: (1) develop fundamental knowledge and determine the processes controlling the release, mobility, persistence, and possible pathways of contaminants in karst groundwater systems, and (2) characterize transport processes in conduit- and diffusion-dominated flow under base flow and storm flow conditions. The work presented herein focuses on the development of geo-hydro statistical tools to characterize flow and transport processes under different flow regimes. Multidimensional, laboratory-scale Geo-Hydrobed models were developed and tested for this purpose. The models consist of stainless-steel tanks containing karstified limestone blocks collected from the karst aquifer formation of northern Puerto Rico. The models include a network of sampling wells to monitor flow, pressure, and solute concentrations temporally and spatially. Experimental work entailed making a series of point injections in wells while monitoring the hydraulic response in other wells. Statistical mixed models were applied to spatial probabilities of hydraulic response and weighted injected volume data, and were used to determine the best spatial correlation structure to represent paths of preferential flow in the limestone units under different groundwater flow regimes. Preliminary testing of the karstified models shows that the system can be used to represent the variable transport regime characterized by conduit and diffuse flow in karst systems. Initial hydraulic characterization indicates a highly heterogeneous system resulting in large preferential flow components. Future work involves characterization of the dual-porosity system using conservative tracers, fate and transport experiments using phthalates and chlorinated solvents, geo-temporal statistical modeling, and the testing of "green" remediation technologies in karst groundwater. This work is supported by the U.S. Department of Energy, Savannah River (Grant Award No. DE-FG09-07SR22571), and the National Institute of Environmental Health Sciences (NIEHS, Grant Award No. P42ES017198).

  18. Monitoring Items in Real Time to Enhance CAT Security

    ERIC Educational Resources Information Center

    Zhang, Jinming; Li, Jie

    2016-01-01

    An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…

  19. ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM (EMAP): WESTERN STREAMS AND RIVERS STATISTICAL SUMMARY

    EPA Science Inventory

    This statistical summary reports data from the Environmental Monitoring and Assessment Program (EMAP) Western Pilot (EMAP-W). EMAP-W was a sample survey (or probability survey, often simply called 'random') of streams and rivers in 12 states of the western U.S. (Arizona, Californ...

  20. Many roads to synchrony: natural time scales and their algorithms.

    PubMed

    James, Ryan G; Mahoney, John R; Ellison, Christopher J; Crutchfield, James P

    2014-04-01

    We consider two important time scales, the Markov and cryptic orders, that monitor how an observer synchronizes to a finitary stochastic process. We show how to compute these orders exactly and that they are most efficiently calculated from the ε-machine, a process's minimal unifilar model. Surprisingly, though the Markov order is a basic concept from stochastic process theory, it is not a probabilistic property of a process. Rather, it is a topological property and, moreover, it is not computable from any finite-state model other than the ε-machine. Via an exhaustive survey, we close by demonstrating that infinite Markov and infinite cryptic orders are a dominant feature in the space of finite-memory processes. We draw out the roles played in statistical mechanical spin systems by these two complementary length scales.

  1. Three years of operational experience from Schauinsland CTBT monitoring station.

    PubMed

    Zähringer, M; Bieringer, J; Schlosser, C

    2008-04-01

    Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at the Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system had a high level of data availability of 90% within the reporting period. A daily screening process rendered 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events and were associated with a plausible source. The remaining 64 cases can consistently be explained by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows instabilities of the calibration during the test phase and a good agreement since certification of the system.

  2. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.

  3. Computerized EEG analysis for studying the effect of drugs on the central nervous system.

    PubMed

    Rosadini, G; Cavazza, B; Rodriguez, G; Sannita, W G; Siccardi, A

    1977-11-01

    Samples of our experience in quantitative pharmaco-EEG are reviewed to discuss and define its applicability and limits. Simple processing systems, such as the computation of Hjorth's descriptors, are useful for on-line monitoring of drug-induced EEG modifications which are also evident on visual analysis. Power spectral analysis is suitable to identify and quantify EEG effects not evident on visual inspection. It demonstrated how the EEG effects of compounds in a long-acting formulation vary according to the sampling time and the explored cerebral area. EEG modifications not detected by power spectral analysis can be defined by comparing statistically (F test) the spectral values of the EEG from a single lead at the different samples (longitudinal comparison), or the spectral values from different leads at any sample (intrahemispheric comparison). The presently available procedures of quantitative pharmaco-EEG are effective when applied to the study of multilead EEG recordings in a statistically significant sample of the population. They do not seem reliable in the monitoring or directing of neuropsychiatric therapies in single patients, due to individual variability of drug effects.

  4. What predicts successful literacy acquisition in a second language?

    PubMed Central

    Frost, Ram; Siegelman, Noam; Narkiss, Alona; Afek, Liron

    2013-01-01

    We examined whether success (or failure) in assimilating the structure of a second language could be predicted by general statistical learning abilities that are non-linguistic in nature. We employed a visual statistical learning (VSL) task, monitoring our participants’ implicit learning of the transitional probabilities of visual shapes. A pretest revealed that performance in the VSL task is not correlated with abilities related to a general G factor or working memory. We found that native speakers of English who picked up the implicit statistical structure embedded in the continuous stream of shapes, on average, better assimilated the Semitic structure of Hebrew words. Our findings thus suggest that languages and their writing systems are characterized by idiosyncratic correlations of form and meaning, and these are picked up in the process of literacy acquisition, as they are picked up in any other type of learning, for the purpose of making sense of the environment. PMID:23698615

  5. A better way to evaluate remote monitoring programs in chronic disease care: receiver operating characteristic analysis.

    PubMed

    Brown Connolly, Nancy E

    2014-12-01

    This foundational study applies receiver operating characteristic (ROC) analysis to evaluate the utility and predictive value of a disease management (DM) model that uses RM devices for chronic obstructive pulmonary disease (COPD). The literature identifies a need for a more rigorous method to validate and quantify evidence-based value for remote monitoring (RM) systems being used to monitor persons with a chronic disease. ROC analysis is an engineering approach widely applied in medical testing, but one that has not been evaluated for its utility in RM. Classifiers (peripheral oxygen saturation [SpO2], blood pressure [BP], and pulse), optimum threshold, and predictive accuracy are evaluated based on patient outcomes. Parametric and nonparametric methods were used. Event-based patient outcomes included inpatient hospitalization, accident and emergency, and home health visits. Statistical analysis tools included Microsoft (Redmond, WA) Excel(®) and MedCalc(®) (MedCalc Software, Ostend, Belgium) version 12 © 1993-2013 to generate ROC curves and statistics. Persons with COPD were monitored for a minimum of 183 days, with at least one inpatient hospitalization within the 12 months prior to monitoring. Retrospective, de-identified patient data from a United Kingdom National Health Service COPD program were used. Datasets included biometric readings, alerts, and resource utilization. SpO2 was identified as a predictive classifier, with an optimal average threshold setting of 85-86%. BP and pulse were failed classifiers, and areas of design were identified that may improve utility and predictive capacity. A cost avoidance methodology was developed. Results can be applied to health services planning decisions. Methods can be applied to system design and evaluation based on patient outcomes. This study validated the use of ROC in RM program evaluation.
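
    A hedged sketch of the ROC step on made-up data: SpO2 readings are used as the classifier score for a binary event outcome, and the curve, AUC and a Youden-optimal cut-off are computed with scikit-learn. The cohort sizes and distributions are illustrative, not the study data.

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(2)
        events = np.r_[np.ones(40), np.zeros(160)]                 # 1 = adverse event
        spo2 = np.r_[rng.normal(84, 4, 40), rng.normal(93, 3, 160)]

        # Lower SpO2 means higher risk, so use -SpO2 as the score for a standard ROC.
        fpr, tpr, thresholds = roc_curve(events, -spo2)
        auc = roc_auc_score(events, -spo2)
        best = np.argmax(tpr - fpr)                                # Youden's J
        print(f"AUC = {auc:.2f}, suggested SpO2 cut-off near {-thresholds[best]:.0f}%")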

  6. Data Collection Procedures and Descriptive Statistics for the Grade One Achievement Monitoring Tests (Baseline, S-1, S-2, and S-3), Coordinated Study No. 1. Working Paper 316. Report from the Project on Studies in Mathematics.

    ERIC Educational Resources Information Center

    Buchanan, Anne E.; Romberg, Thomas A.

    As part of a 3-year study of arithmetic problem-solving skills in young children, pretests were administered to 180 middle class first grade students. Following each of three instructional units, another achievement test was administered. The three first grade units corresponded to the Developing Mathematical Processes curriculum and involved…

  7. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    PubMed

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark using SPC after stabilizing the process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
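
    The following is a rough sketch of steps two to four for one surgeon's operative times, assuming a log transform for normalization and an individuals chart with moving-range-based limits; the times are simulated and the transform choice is an assumption, not necessarily the one used in the study.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        times = rng.lognormal(mean=4.0, sigma=0.25, size=60)   # operative times [min]
        x = np.log(times)                                      # normalization step

        mr = np.abs(np.diff(x))                                # moving ranges
        sigma_hat = mr.mean() / 1.128                          # d2 constant for n = 2
        ucl, lcl = x.mean() + 3 * sigma_hat, x.mean() - 3 * sigma_hat
        out_of_control = np.flatnonzero((x > ucl) | (x < lcl))

        alpha = 2 * (1 - norm.cdf(3))                          # per-point false alarm rate
        print("nominal ARL0 ~", round(1 / alpha), "points; flagged cases:", out_of_control)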

  8. Monitoring the Level of Students' GPAs over Time

    ERIC Educational Resources Information Center

    Bakir, Saad T.; McNeal, Bob

    2010-01-01

    A nonparametric (or distribution-free) statistical quality control chart is used to monitor the cumulative grade point averages (GPAs) of students over time. The chart is designed to detect any statistically significant positive or negative shifts in student GPAs from a desired target level. This nonparametric control chart is based on the…

  9. Multicenter Study on Incubation Conditions for Environmental Monitoring and Aseptic Process Simulation.

    PubMed

    Guinet, Roland; Berthoumieu, Nicole; Dutot, Philippe; Triquet, Julien; Ratajczak, Medhi; Thibaudon, Michel; Bechaud, Philippe; Arliaud, Christophe; Miclet, Edith; Giordano, Florine; Larcon, Marjorie; Arthaud, Catherine

    Environmental monitoring and aseptic process simulations represent an integral part of the microbiological quality control system of sterile pharmaceutical products manufacturing operations. However, guidance documents and manufacturers' practices differ regarding recommendations for incubation time and incubation temperature, and, consequently, the environmental monitoring and aseptic process simulation incubation strategy should be supported by validation data. To avoid any bias coming from in vitro studies or from single-site manufacturing in situ studies, we performed a collaborative study at four manufacturing sites with four samples at each location. The environmental monitoring study was performed with tryptic soy agar settle plates and contact plates, and the aseptic process simulation study was performed with tryptic soy broth and thioglycolate broth. The highest recovery rate was obtained with settle plates (97.7%) followed by contact plates (65.4%) and was less than 20% for liquid media (tryptic soy broth 19% and thioglycolate broth 17%). Gram-positive cocci and non-spore-forming Gram-positive rods were largely predominant with more than 95% of growth and recovered best at 32.5 °C. The highest recovery of molds was obtained at 22.5 °C alone or as the first incubation temperature. Strict anaerobes were not recovered. At the end of the five days of incubation no statistically significant difference was obtained between the four conditions. Based on these data a single incubation temperature at 32.5 °C could be recommended for these four manufacturing sites for both environmental monitoring and aseptic process simulation, and a second plate could be used, periodically incubated at 22.5 °C. Similar studies should be considered for all manufacturing facilities in order to determine the optimal incubation temperature regime for both viable environmental monitoring and aseptic process simulation. Microbiological environmental monitoring and aseptic process simulation confirm that pharmaceutical cleanrooms are in an appropriate hygienic condition for manufacturing of sterile drug products. Guidance documents from different health authorities or expert groups differ regarding recommendation of the applied incubation time and incubation temperature, leading to variable manufacturers' practices. Some recent publications have demonstrated that laboratory studies are not relevant to determine the best incubation regime and that in situ manufacturing site studies should be used. To avoid any possible bias coming from laboratory studies or single-site in situ studies, we conducted a multicenter study at four manufacturing sites with a significant amount of real environmental monitoring samples collected directly from the environment in pharmaceutical production during manufacturing operations with four solid and liquid nutrient media. These samples were then incubated under four different conditions suggested in the guidance documents. We believe that the results of our multicenter study, confirming other recent single-site in situ studies, could be the basis of the strategy to determine the best incubation regime for both viable environmental monitoring and aseptic process simulation in any manufacturing facility. © PDA, Inc. 2017.

  10. Monitoring and Evaluation of Cultivated Land Irrigation Guarantee Capability with Remote Sensing

    NASA Astrophysics Data System (ADS)

    Zhang, C., Sr.; Huang, J.; Li, L.; Wang, H.; Zhu, D.

    2015-12-01

    Cultivated land quality grade monitoring and evaluation is an important way to improve land production capability and ensure national food safety. Irrigation guarantee capability is one of the important aspects of cultivated land quality monitoring and evaluation. In current cultivated land quality monitoring based on field surveys, determining the irrigation rate requires a large investment of human resources and a long investigation process. This study chose the Beijing-Tianjin-Hebei area as the study region, taking 1 km × 1 km grid cells of cultivated land with a winter wheat-summer maize double cropping system as the study objects. A new irrigation capacity evaluation index based on the ratio of the annual irrigation requirement retrieved from MODIS data and the actual quantity of irrigation was proposed. With several years of monitoring results, the irrigation guarantee capability of the study area was evaluated comprehensively. The trend of the irrigation guarantee capability index (IGCI) was generally consistent with the agricultural drought disaster area reported in the rural statistical yearbooks of the Beijing-Tianjin-Hebei area. The average IGCI value, the probability of an irrigation-guaranteed year, and the weighted average controlled by the irrigation demand index were used and compared in this paper. The experimental results indicate that the classification obtained with the present method was close to that from the irrigation probability in the 2012 gradation of agricultural land quality, with 73% of units overlapping. The IGCI monitoring and evaluation method proposed in this paper has potential for cultivated land quality monitoring and evaluation in China. Key words: remote sensing, evapotranspiration, MODIS, cultivated land quality, irrigation guarantee capability

  11. Multivariate statistical process control in product quality review assessment - A case study.

    PubMed

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against the quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart. It neglects to take into account the interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, where 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA), for the data being either autoscaled or robust scaled, showed four and seven batches, respectively, outside the Hotelling T2 95% ellipse. Also, an improvement of the capability of the process is observed without the most extreme batches. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
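
    A minimal sketch of a PCA-based multivariate chart of the kind described above: autoscale the per-batch assay results, fit PCA, and flag batches whose Hotelling T2 exceeds the 95% limit. The simulated data and the choice of two components are assumptions for illustration.

        import numpy as np
        from scipy.stats import f as f_dist
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        X = rng.normal(100, 2, size=(164, 6))          # assay results, % label claim
        Xs = StandardScaler().fit_transform(X)         # autoscaling

        k = 2                                          # retained components (assumed)
        pca = PCA(n_components=k).fit(Xs)
        scores = pca.transform(Xs)
        t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)

        n = X.shape[0]
        limit = k * (n - 1) * (n + 1) / (n * (n - k)) * f_dist.ppf(0.95, k, n - k)
        print("batches outside the 95% T2 limit:", np.flatnonzero(t2 > limit))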

  12. Processing of meteorological data with ultrasonic thermoanemometers

    NASA Astrophysics Data System (ADS)

    Telminov, A. E.; Bogushevich, A. Ya.; Korolkov, V. A.; Botygin, I. A.

    2017-11-01

    The article describes a software system intended to support scientific research on the atmosphere by processing data gathered by multi-level ultrasonic complexes for automated monitoring of meteorological and turbulent parameters in the ground layer of the atmosphere. The system can process files containing data sets of instantaneous values of temperature, the three orthogonal components of wind speed, humidity and pressure. The processing task is executed in multiple stages. During the first stage, the system executes the researcher's query for meteorological parameters. At the second stage, the system computes a series of standard statistical properties of the meteorological fields, such as means, variance, standard deviation, skewness, excess kurtosis, correlation, etc. The third stage prepares for computing the parameters of atmospheric turbulence. The computation results are displayed to the user and stored on the hard drive.

  13. Clustering and Flow Conservation Monitoring Tool for Software Defined Networks.

    PubMed

    Puente Fernández, Jesús Antonio; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-04-03

    Prediction systems face challenges on two fronts: the relation between video quality and observed session features, and the dynamic changes in video quality. Software Defined Networks (SDN) is a new concept of network architecture that provides the separation of the control plane (controller) and the data plane (switches) in network devices. Due to the existence of the southbound interface, it is possible to deploy monitoring tools to obtain the network status and retrieve a collection of statistics. Therefore, achieving the most accurate statistics depends on a strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure traffic flow in SDN networks. The algorithm is based on grouping network switches into clusters according to their number of ports, so that different monitoring techniques can be applied to each cluster. This grouping avoids monitoring queries to network switches with common characteristics and thereby omits redundant information. In this way, the present proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments and the comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, maintaining similar accuracy while decreasing the number of queries to the switches.

  14. Baby-Crying Acceptance

    NASA Astrophysics Data System (ADS)

    Martins, Tiago; de Magalhães, Sérgio Tenreiro

    The baby's crying is his most important means of communication. Crying monitoring performed by the devices developed so far does not ensure the complete safety of the child. It is necessary to complement these technological resources with means of communicating the results to the person responsible, which would involve digital processing of the information available in the crying. The survey carried out made it possible to understand the level of adoption, in the continental territory of Portugal, of a technology able to perform such digital processing. The Technology Acceptance Model (TAM) was used as the theoretical framework. The statistical analysis showed that there is a good probability of acceptance of such a system.

  15. Statistical techniques for sampling and monitoring natural resources

    Treesearch

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  16. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. The aim of this work is to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
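
    As an example of the kind of easily derived control rules mentioned above, the sketch below flags a tracked QC metric when a point falls beyond 3 standard deviations or when eight consecutive points fall on the same side of the mean; the rule set and the baseline values are illustrative, not the author's specific recommendations.

        import numpy as np

        def control_flags(x, mean, sd, run_length=8):
            x = np.asarray(x, dtype=float)
            beyond_3sd = np.abs(x - mean) > 3 * sd            # rule 1
            side = np.sign(x - mean)
            run_rule = np.zeros(len(x), dtype=bool)           # rule 2
            for i in range(run_length - 1, len(x)):
                window = side[i - run_length + 1:i + 1]
                run_rule[i] = bool(np.all(window == window[0]) and window[0] != 0)
            return beyond_3sd, run_rule

        baseline_mean, baseline_sd = 1.00, 0.05   # from a system characterization phase
        qc = np.r_[np.random.default_rng(5).normal(1.0, 0.05, 30), 1.25]
        r1, r2 = control_flags(qc, baseline_mean, baseline_sd)
        print("rule-1 violations at runs:", np.flatnonzero(r1))
        print("rule-2 violations at runs:", np.flatnonzero(r2))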

  17. Local sensitivity analysis for inverse problems solved by singular value decomposition

    USGS Publications Warehouse

    Hill, M.C.; Nolan, B.T.

    2010-01-01

    Local sensitivity analysis provides computationally frugal ways to evaluate models commonly used for resource management, risk assessment, and so on. This includes diagnosing inverse model convergence problems caused by parameter insensitivity and(or) parameter interdependence (correlation), understanding what aspects of the model and data contribute to measures of uncertainty, and identifying new data likely to reduce model uncertainty. Here, we consider sensitivity statistics relevant to models in which the process-model parameters are transformed using singular value decomposition (SVD) to create SVD parameters for model calibration. The statistics considered include the PEST identifiability statistic, and the combined use of the process-model parameter statistics composite scaled sensitivities and parameter correlation coefficients (CSS and PCC). The statistics are complementary in that the identifiability statistic integrates the effects of parameter sensitivity and interdependence, while CSS and PCC provide individual measures of sensitivity and interdependence. PCC quantifies correlations between pairs or larger sets of parameters; when a set of parameters is intercorrelated, the absolute value of PCC is close to 1.00 for all pairs in the set. The number of singular vectors to include in the calculation of the identifiability statistic is somewhat subjective and influences the statistic. To demonstrate the statistics, we use the USDA's Root Zone Water Quality Model to simulate nitrogen fate and transport in the unsaturated zone of the Merced River Basin, CA. There are 16 log-transformed process-model parameters, including water content at field capacity (WFC) and bulk density (BD) for each of five soil layers. Calibration data consisted of 1,670 observations comprising soil moisture, soil water tension, aqueous nitrate and bromide concentrations, soil nitrate concentration, and organic matter content. All 16 of the SVD parameters could be estimated by regression based on the range of singular values. Identifiability statistic results varied based on the number of SVD parameters included. Identifiability statistics calculated for four SVD parameters indicate the same three most important process-model parameters as CSS/PCC (WFC1, WFC2, and BD2), but the order differed. Additionally, the identifiability statistic showed that BD1 was almost as dominant as WFC1. The CSS/PCC analysis showed that this results from its high correlation with WFC1 (-0.94), and not from its individual sensitivity. Such distinctions, combined with analysis of how high correlations and(or) sensitivities result from the constructed model, can produce important insights into, for example, the use of sensitivity analysis to design monitoring networks. In conclusion, the statistics considered identified similar important parameters. They differ in that (1) CSS/PCC can be more awkward because sensitivity and interdependence are considered separately and (2) identifiability requires consideration of how many SVD parameters to include. A continuing challenge is to understand how these computationally efficient methods compare with computationally demanding global methods like Markov-chain Monte Carlo, given common nonlinear processes and the often even more nonlinear models.

  18. Nanopipettes as Monitoring Probes for the Single Living Cell: State of the Art and Future Directions in Molecular Biology.

    PubMed

    Bulbul, Gonca; Chaves, Gepoliano; Olivier, Joseph; Ozel, Rifat Emrah; Pourmand, Nader

    2018-06-06

    Examining the behavior of a single cell within its natural environment is valuable for understanding both the biological processes that control the function of cells and how injury or disease lead to pathological change of their function. Single-cell analysis can reveal information regarding the causes of genetic changes, and it can contribute to studies on the molecular basis of cell transformation and proliferation. By contrast, whole tissue biopsies can only yield information on a statistical average of several processes occurring in a population of different cells. Electrowetting within a nanopipette provides a nanobiopsy platform for the extraction of cellular material from single living cells. Additionally, functionalized nanopipette sensing probes can differentiate analytes based on their size, shape or charge density, making the technology uniquely suited to sensing changes in single-cell dynamics. In this review, we highlight the potential of nanopipette technology as a non-destructive analytical tool to monitor single living cells, with particular attention to integration into applications in molecular biology.

  19. Kinetic study of olive oil degradation monitored by fourier transform infrared spectrometry. Application to oil characterization.

    PubMed

    Román Falcó, Iván P; Grané Teruel, Nuria; Prats Moya, Soledad; Martín Carratalá, M Luisa

    2012-11-28

    A new approach for the determination of kinetic parameters of the cis/trans isomerization during the oxidation process of 24 virgin olive oils belonging to 8 different varieties is presented. The accelerated degradation process at 100 °C was monitored by recording Fourier transform infrared spectra. The parameters obtained confirm pseudo-first-order kinetics for the degradation of cis and the appearance of trans double bonds. The kinetic approach affords the induction time and the rate coefficient; these parameters are related to the fatty acid profile of the fresh olive oils. The data obtained were used to compare the oil stability of the samples with the help of multivariate statistical techniques. The fatty acid profile allowed a classification of the samples into five groups, one of them constituted by the cultivars with higher stability. Meanwhile, the kinetic parameters showed greater ability for the characterization of olive oils, allowing a classification into seven groups.
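
    A hedged sketch of how the two kinetic parameters could be extracted from a decaying FTIR band: fit a pseudo-first-order model with an induction time, A(t) = A0*exp(-k*max(t - t_ind, 0)), to the band intensity. The model form and the synthetic data are assumptions for illustration, not the authors' exact fitting procedure.

        import numpy as np
        from scipy.optimize import curve_fit

        def band_decay(t, a0, k, t_ind):
            # Flat until the induction time, then pseudo-first-order decay.
            return a0 * np.exp(-k * np.clip(t - t_ind, 0.0, None))

        t = np.linspace(0, 48, 25)                            # hours at 100 °C
        rng = np.random.default_rng(6)
        obs = band_decay(t, 1.0, 0.12, 6.0) + rng.normal(0, 0.01, t.size)

        (a0, k, t_ind), _ = curve_fit(band_decay, t, obs, p0=[1.0, 0.1, 5.0])
        print(f"rate coefficient k = {k:.3f} 1/h, induction time = {t_ind:.1f} h")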

  20. Spatially distributed fiber sensor with dual processed outputs

    NASA Astrophysics Data System (ADS)

    Xu, X.; Spillman, William B., Jr.; Claus, Richard O.; Meissner, K. E.; Chen, K.

    2005-05-01

    Given the rapid aging of the world's population, improvements in technology for automation of patient care and documentation are badly needed. We have previously demonstrated a 'smart bed' that can non-intrusively monitor a patient in bed and determine a patient's respiration, heart rate and movement without intrusive or restrictive medical measurements. This is an application of spatially distributed integrating fiber optic sensors. The basic concept is that any patient movement that also moves an optical fiber within a specified area will produce a change in the optical signal. Two modal modulation approaches were considered, a statistical mode (STM) sensor and a high order mode excitation (HOME) sensor. The present design includes an STM sensor combined with a HOME sensor, using both modal modulation approaches. A special lens system allows only the high order modes of the optical fiber to be excited and coupled into the sensor. For handling output from the dual STM-HOME sensor, computer processing methods are discussed that offer comprehensive perturbation analysis for more reliable patient monitoring.

  1. Statistical Study to Investigate Women’s Preference in the Phraseology of Lifetime and Age Specific Risk of Developing Breast Cancer

    DTIC Science & Technology

    2001-10-25

    anxiety, hypochondriasis, low self-esteem and a hypervigilant style of information processing, referred to as monitoring [7], have been associated with...collected through self-completed, postal questionnaires and responses were received from 137 out of 175 distributed questionnaires. Respondents had a...referral guidelines for family histories of breast cancer are [2]: • 1st degree relative (i.e. mother, sister) younger than 40; • 2nd degree paternal

  2. Equipment Health Monitoring with Non-Parametric Statistics for Online Early Detection and Scoring of Degradation

    DTIC Science & Technology

    2014-10-02

    defined by Eqs. (3)–(4) (Greenwell & Finch, 2004) (Kar & Mohanty, 2006). The p value provides the metric for novelty scoring: p = Q_KS(z) = 2 Σ_{j=1}^{∞} (−1)^{j−1} exp(−2 j² z²) ...provides early detection of degradation and ability to score its significance in order to inform maintenance planning and consequently reduce disruption ...actionable information, signals are typically processed from raw measurements into a reduced-dimension novelty summary value that may be more easily
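
    In the same spirit as the snippet above, a minimal non-parametric novelty score can be obtained by comparing a recent window of a monitored feature against a healthy baseline with a two-sample Kolmogorov-Smirnov test and using the p value as the score; the data and window sizes below are illustrative.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(7)
        baseline = rng.normal(0.0, 1.0, 2000)        # healthy-condition feature values
        recent = rng.normal(0.6, 1.2, 200)           # drifted recent window

        stat, p_value = ks_2samp(baseline, recent)
        print(f"KS distance = {stat:.3f}, novelty p value = {p_value:.2e}")
        # Small p values indicate the recent window no longer matches the baseline.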

  3. Operation of a pond-cooler: the case of Berezovskaya GRES-1

    NASA Astrophysics Data System (ADS)

    Morozova, O. G.; Kamoza, T. L.; Koyupchenko, I. N.; Savelyev, A. S.; Pen, R. Z.; Veselkova, N. S.; Kudryavtsev, M. D.

    2017-08-01

    Pond-coolers at heat and nuclear power stations are natural-technological systems, so their monitoring program should cover the effect of the SRPS (state regional power station) on the pond ecosystem, including the thermal discharge of cooling water. The objectives of this study were the development and implementation of a monitoring program for the cooling pond of Berezovskaya SRPS-1 based on chemical and biological water quality indicators, and the identification of patterns in the thermal and hydrochemical regime over the period of the plant's operation (from 1996 to 2015). The quality of the cooling water of the pond-cooler BGRES-1 was studied under full-scale conditions by selecting and analyzing water samples from the pond in accordance with the principles of complexity and systematic observation, and with sampling times matched to the characteristic hydrological phases. Processing of the resulting array of monitoring data by methods of mathematical statistics makes it possible to identify the main factors affecting the water quality of the pond. The water quality data obtained through monitoring and mathematical processing over a long time interval provide the scientific basis for forecasting the ecological state of the pond, which is necessary to ensure economically efficient energy production and safe water use. Recommendations proposed by the authors, some of which have already been partially implemented, aim to prevent the development of eutrophication processes in the pond-cooler: the construction of a dam that cuts off the main peat massif and the cleaning of the banks of the rivers forming the cooling pond.

  4. What is the cost per millimetre? Challenging traditional GNSS equipment for precise geosciences and engineering applications

    NASA Astrophysics Data System (ADS)

    Hogg, William; Boreham, Nicholas; Benedetti, Elisa; Roberts, William

    2017-04-01

    Surveyors and civil and geotechnical engineers are the typical users of professional-grade GNSS receivers capable of achieving sub-centimetre positioning accuracies and navigation accuracies of 1-2 cm. When choosing the equipment for their needs, they are often faced with a dilemma, with each additional frequency, constellation and feature coming at a cost, resulting in professional GNSS equipment being regarded as high-priced specialist equipment. Indeed, many users have discounted GNSS on the grounds that it is too expensive and too operationally complex to warrant purchase. Having identified this situation, Nottingham Scientific Ltd (NSL) set about the development of equipment that would break down this barrier, making high-accuracy GNSS affordable to new users and applications and more cost-effective to existing users. NSL created "STICK", a single-frequency, multi-constellation, IMU-integrated GNSS sensor for precise movement detection of the natural and built environments and infrastructures, at approximately 1/20th of the price of a professional-grade GNSS system. STICK has been developed within the context of three European Space Agency (ESA) Integrated Applications Programme Demonstration projects that use space assets to monitor land stability and the status of different types of infrastructure, each with its own operational challenges. However, through the careful selection of components, the implementation of certain operational constraints and the use of advanced statistical data processing, sub-centimetre positioning can be achieved for monitoring purposes. This paper describes STICK, the applications for which it has been developed, and the environments within which it is operating. We then explore the performance by directly comparing STICK to geodetic GNSS receivers set up in an operational test bed environment. This test bed allows the receivers/antennas to be subjected to a three-dimensional displacement of the order of 1 cm a day. The processing techniques used by the STICK monitoring service are described, including the GNSS data processing, the integration of the IMU and the statistical analyses used to detect, quantify and report movement. By considering operational cost in terms of power, installation difficulty, remote communication and processing complexity, along with device price, we summarize the final cost to the user. Comparisons with other GNSS solutions show whether cost truly scales with accuracy and precision. Benedetti E., L. Brack, W. Roberts, Performance Validation of Low-Cost GNSS Sensors for Land Monitoring and Hazard Mitigation, Paper presented at ION GNSS+ 2016 Session F4: Land-Based Applications 2, ION GNSS+ 2016 Proceedings (In Press); Roberts W., E. Benedetti, M. Hutchinson, G. Phipps and A. Keal, An Expendable GNSS Sensor for the Continuous Monitoring and Risk Profiling of Land and Infrastructure, Presentation at ION GNSS+ 2015 Session A5: Applications Using Consumer GNSS

  5. Using statistical process control methods to trace small changes in perinatal mortality after a training program in a low-resource setting.

    PubMed

    Mduma, Estomih R; Ersdal, Hege; Kvaloy, Jan Terje; Svensen, Erling; Mdoe, Paschal; Perlman, Jeffrey; Kidanto, Hussein Lessio; Soreide, Eldar

    2018-05-01

    To trace and document smaller changes in perinatal survival over time. Prospective observational study, with retrospective analysis. Labor ward and operating theater at Haydom Lutheran Hospital in rural north-central Tanzania. All women giving birth and birth attendants. Helping Babies Breathe (HBB) simulation training on newborn care and resuscitation and some other efforts to improve perinatal outcome. Perinatal survival, including fresh stillbirths and early (24-h) newborn survival. The variable life-adjusted plot and cumulative sum chart revealed a steady improvement in survival over time, after the baseline period. There were some variations throughout the study period, and some of these could be linked to different interventions and events. To our knowledge, this is the first time statistical process control methods have been used to document changes in perinatal mortality over time in a rural Sub-Saharan hospital, showing a steady increase in survival. These methods can be utilized to continuously monitor and describe changes in patient outcomes.
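
    A minimal sketch of the two charts named above for a binary outcome (1 = perinatal death): a variable life-adjusted display (cumulative expected-minus-observed deaths) and a one-sided Bernoulli CUSUM tuned to detect a fall in mortality. The baseline and improved rates, and the simulated births, are illustrative rather than the study figures.

        import numpy as np

        def vlad(outcomes, p0):
            # Cumulative expected-minus-observed deaths ("lives saved" vs baseline).
            return np.cumsum(p0 - np.asarray(outcomes, dtype=float))

        def cusum_improvement(outcomes, p0, p1):
            # One-sided CUSUM accumulating evidence of a shift from p0 to p1 < p0.
            w_death = np.log(p1 / p0)
            w_survival = np.log((1 - p1) / (1 - p0))
            s, path = 0.0, []
            for death in outcomes:
                s = max(0.0, s + (w_death if death else w_survival))
                path.append(s)
            return np.array(path)

        rng = np.random.default_rng(8)
        births = np.r_[rng.binomial(1, 0.040, 2000),      # before training
                       rng.binomial(1, 0.025, 2000)]      # after training
        print("final VLAD value:", round(vlad(births, p0=0.040)[-1], 1))
        print("maximum CUSUM statistic:",
              round(cusum_improvement(births, 0.040, 0.025).max(), 2))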

  6. HyperCard Monitor System.

    ERIC Educational Resources Information Center

    Harris, Julian; Maurer, Hermann

    An investigation into high-level event monitoring within the scope of a well-known multimedia application, HyperCard--a program on the Macintosh computer--is carried out. A monitoring system is defined as a system which automatically monitors usage of some activity and gathers statistics based on what it has observed. Monitor systems can give the…

  7. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    PubMed

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performances for two commercially available MLC models have been assessed and the results support monthly test frequency for widely accepted ± 1 mm specifications. Higher QC test frequency is however required to maintain tighter specification and in-control behavior.
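
    As a toy illustration of the per-leaf flagging step, the snippet below compares simulated leaf position errors with the two specification windows quoted above (±0.5 mm and ±1 mm) and lists the leaves each specification would flag; real use would start from EPID-derived leaf positions rather than simulated errors.

        import numpy as np

        rng = np.random.default_rng(9)
        leaf_errors = rng.normal(0.0, 0.2, size=120)       # position error per leaf [mm]
        leaf_errors[[17, 63]] = [0.8, -1.3]                # two drifting leaves, injected

        print("leaves outside ±0.5 mm:", np.flatnonzero(np.abs(leaf_errors) > 0.5))
        print("leaves outside ±1.0 mm:", np.flatnonzero(np.abs(leaf_errors) > 1.0))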

  8. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    García-Díaz, J. Carlos

    2009-11-01

    Fault detection and diagnosis is an important problem in process engineering. Process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing, and the increasingly stringent quality requirements in the automotive industry have also demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationships among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures and the bath level. The entire data set, consisting of 48 galvanized steel coils, was divided into two sets. The first, training data set consisted of 25 conforming coils and the second data set of 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical. In most applications, the dependent variable is binary. The results show that the logistic generalized linear models provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
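
    A brief sketch of the modelling approach described above, assuming simulated values for the five monitored variables: a logistic regression (binomial GLM) is fitted to classify conforming versus nonconforming coils and to output a probability of nonconformity.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(10)
        n_ok, n_bad = 25, 23
        X = np.vstack([rng.normal(0.0, 1.0, (n_ok, 5)),    # strip speed, 4 bath temps,
                       rng.normal(0.7, 1.2, (n_bad, 5))])  # bath level (simulated)
        y = np.r_[np.zeros(n_ok), np.ones(n_bad)]          # 1 = nonconforming coil

        model = make_pipeline(StandardScaler(), LogisticRegression())
        model.fit(X, y)
        print("P(nonconforming) for the first five coils:",
              model.predict_proba(X[:5])[:, 1].round(2))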

  9. Successful water quality monitoring: The right combination of intent, measurement, interpretation, and a cooperating ecosystem

    USGS Publications Warehouse

    Soballe, D.M.

    1998-01-01

    Water quality monitoring is invaluable to ensure compliance with regulations, detect trends or patterns, and advance ecological understanding. However, monitoring typically measures only a few characteristics in a small fraction of a large and complex system, and thus the information contained in monitoring data depends upon which features of the ecosystem are actually captured by the measurements. Difficulties arise when these data contain something other than intended, but this can be minimized if the purpose of the sampling is clear, and the sampling design, measurements, and data interpretations are all compatible with this purpose. The monitoring program and data interpretation must also be properly matched to the structure and functioning of the system. Obtaining this match is sometimes an iterative process that demands a close link between research and monitoring. This paper focuses on water quality monitoring that is intended to track trends in aquatic resources and advance ecological understanding. It includes examples from three monitoring programs and a simulation exercise that illustrate problems that arise when the information content of monitoring data differs from expectation. The examples show (1) how inconsistencies among, or lack of information about, the basic elements of a monitoring program (intent, design, measurement, interpretation, and the monitored system) can produce a systematic difference (bias) between monitoring measurements and sampling intent or interpretation, and (2) that bias is not just a statistical consideration, but an insidious problem that can undermine the scientific integrity of a monitoring program. Some general suggestions are provided and hopefully these examples will help those engaged in water quality monitoring to enhance and protect the value of their monitoring investment.

  10. Software for Storage and Management of Microclimatic Data for Preventive Conservation of Cultural Heritage

    PubMed Central

    Fernández-Navajas, Ángel; Merello, Paloma; Beltrán, Pedro; García-Diego, Fernando-Juan

    2013-01-01

    Cultural Heritage preventive conservation requires the monitoring of the parameters involved in the process of deterioration of artworks. Thus, both long-term monitoring of the environmental parameters and further analysis of the recorded data are necessary. Long-term monitoring at frequencies higher than 1 data point/day generates large volumes of data that are difficult to store, manage, and analyze. This paper presents software that uses a free, open-source database engine to manage and interact with huge amounts of data from environmental monitoring of cultural heritage sites. It is simple to operate and offers multiple capabilities, such as detection of anomalous data, queries, graph plotting, and mean trajectories. It is also possible to export the data to a spreadsheet for analyses with more advanced statistical methods (principal component analysis, ANOVA, linear regression, etc.). This paper also deals with a practical application developed for the Renaissance frescoes of the Cathedral of Valencia. The results suggest infiltration of rainwater in the vault and weekly relative humidity changes related to the religious service schedules. PMID:23447005

  11. Monitoring amphibians in Great Smoky Mountains National Park

    USGS Publications Warehouse

    Dodd, C. Kenneth

    2003-01-01

    This report provides an overview of the Park’s amphibians, the factors affecting their distribution, a review of important areas of biodiversity, and a summary of amphibian life history in the Southern Appalachians. In addition, survey techniques are described as well as examples of how the techniques are set up, a critique of what the results tell the observer, and a discussion of the limitations of the techniques and the data. The report reviews considerations for site selection, outlines steps for biosecurity and for processing diseased or dying animals, and provides resource managers with a decision tree on how to monitor the Park’s amphibians based on different levels of available resources. It concludes with an extensive list of references for inventorying and monitoring amphibians. USGS and Great Smoky Mountains National Park biologists need to establish cooperative efforts and training to ensure that congressionally mandated amphibian surveys are performed in a statistically rigorous and biologically meaningful manner, and that amphibian populations on Federal lands are monitored to ensure their long-term survival. The research detailed in this report will aid these cooperative efforts.

  12. Cleanrooms and tissue banking: how happy I could be with either GMP or GTP?

    PubMed

    Klykens, J; Pirnay, J-P; Verbeken, G; Giet, O; Baudoux, E; Jashari, R; Vanderkelen, A; Ectors, N

    2013-12-01

    The regulatory framework of tissue banking introduces a number of requirements for monitoring cleanrooms for processing tissue or cell grafts. Although a number of requirements were clearly defined, some requirements are open to interpretation. This study aims to contribute to the interpretation of GMP or GTP guidelines for tissue banking. Based on the experience of the participating centers, the results of the monitoring program were evaluated to determine the feasibility of using a cleanroom in tissue banking and of the monitoring program itself. The microbial efficacy of a laminar airflow cabinet and of an incubator in a cleanroom environment was also evaluated. This study indicated that a monitoring program of a cleanroom at rest in combination with (final) product testing is a feasible approach. Although no statistical significance (0.90 < p < 0.95) was found, there is a strong indication that a Grade D environment is not the ideal background environment for a Grade A environment obtained through a laminar airflow cabinet. The microbial contamination of an incubator in a cleanroom is limited but requires closed containers for tissue and cell products.

  13. Assessing groundwater vulnerability to agrichemical contamination in the Midwest US

    USGS Publications Warehouse

    Burkart, M.R.; Kolpin, D.W.; James, D.E.

    1999-01-01

    Agrichemicals (herbicides and nitrate) are significant sources of diffuse pollution to groundwater. Indirect methods are needed to assess the potential for groundwater contamination by diffuse sources because groundwater monitoring is too costly to adequately define the geographic extent of contamination at a regional or national scale. This paper presents examples of the application of statistical, overlay and index, and process-based modeling methods for groundwater vulnerability assessments to a variety of data from the Midwest U.S. The principles for vulnerability assessment include both intrinsic (pedologic, climatologic, and hydrogeologic factors) and specific (contaminant and other anthropogenic factors) vulnerability of a location. Statistical methods use the frequency of contaminant occurrence, contaminant concentration, or contamination probability as a response variable. Statistical assessments are useful for defining the relations among explanatory and response variables whether they define intrinsic or specific vulnerability. Multivariate statistical analyses are useful for ranking variables critical to estimating water quality responses of interest. Overlay and index methods involve intersecting maps of intrinsic and specific vulnerability properties and indexing the variables by applying appropriate weights. Deterministic models use process-based equations to simulate contaminant transport and are distinguished from the other methods in their potential to predict contaminant transport in both space and time. An example of a one-dimensional leaching model linked to a geographic information system (GIS) to define a regional metamodel for contamination in the Midwest is included.
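
    The overlay-and-index approach mentioned above amounts to intersecting co-registered layers and combining them with analyst-assigned weights. The sketch below is a generic illustration under assumed layer names and weights; it is not any particular published index.

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (4, 4)                       # a small hypothetical raster grid

# Each layer rescaled to a 0-1 vulnerability score (intrinsic and specific factors)
soil_leaching = rng.random(shape)    # intrinsic: pedologic
recharge      = rng.random(shape)    # intrinsic: climatologic
aquifer_depth = rng.random(shape)    # intrinsic: hydrogeologic
chem_loading  = rng.random(shape)    # specific: agrichemical use

# Analyst-assigned weights; the weighting scheme is an assumption for illustration
weights = {"soil": 0.3, "recharge": 0.2, "depth": 0.2, "loading": 0.3}

index = (weights["soil"] * soil_leaching
         + weights["recharge"] * recharge
         + weights["depth"] * aquifer_depth
         + weights["loading"] * chem_loading)
print(np.round(index, 2))            # higher values flag relatively more vulnerable cells
```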

  14. The economics and statistics of passive telematics monitoring as a source of traffic data [a study on the I-90 and I-87 corridors] : final report on phase 1

    DOT National Transportation Integrated Search

    2007-07-01

    A study was undertaken to determine the statistics and economics associated with the use of the passive monitoring of commercial vehicle telematics as a source of highway traffic data. Relationships were established with a group of private sector com...

  15. Redefining "Learning" in Statistical Learning: What Does an Online Measure Reveal About the Assimilation of Visual Regularities?

    PubMed

    Siegelman, Noam; Bogaerts, Louisa; Kronenfeld, Ofer; Frost, Ram

    2017-10-07

    From a theoretical perspective, most discussions of statistical learning (SL) have focused on the possible "statistical" properties that are the object of learning. Much less attention has been given to defining what "learning" is in the context of "statistical learning." One major difficulty is that SL research has been monitoring participants' performance in laboratory settings with a strikingly narrow set of tasks, where learning is typically assessed offline, through a set of two-alternative-forced-choice questions, which follow a brief visual or auditory familiarization stream. Is that all there is to characterizing SL abilities? Here we adopt a novel perspective for investigating the processing of regularities in the visual modality. By tracking online performance in a self-paced SL paradigm, we focus on the trajectory of learning. In a set of three experiments we show that this paradigm provides a reliable and valid signature of SL performance, and it offers important insights for understanding how statistical regularities are perceived and assimilated in the visual modality. This demonstrates the promise of integrating different operational measures into our theory of SL. © 2017 Cognitive Science Society, Inc.

  16. Streamflow monitoring and statistics for development of water rights claims for Wild and Scenic Rivers, Owyhee Canyonlands Wilderness, Idaho, 2012

    USGS Publications Warehouse

    Wood, Molly S.; Fosness, Ryan L.

    2013-01-01

    The U.S. Geological Survey, in cooperation with the Bureau of Land Management (BLM), collected streamflow data in 2012 and estimated streamflow statistics for stream segments designated "Wild," "Scenic," or "Recreational" under the National Wild and Scenic Rivers System in the Owyhee Canyonlands Wilderness in southwestern Idaho. The streamflow statistics were used by BLM to develop and file a draft federal reserved water right claim in autumn 2012 to protect federally designated "outstanding remarkable values" in the stream segments. BLM determined that the daily mean streamflows that are equaled or exceeded 20 and 80 percent of the time during bimonthly periods (two periods per month), together with the bankfull streamflow, are important streamflow thresholds for maintaining outstanding remarkable values. Prior to this study, streamflow statistics estimated using available datasets and tools for the Owyhee Canyonlands Wilderness were inaccurate for use in the water rights claim. Streamflow measurements were made at varying intervals during February–September 2012 at 14 monitoring sites; 2 of the monitoring sites were equipped with telemetered streamgaging equipment. Synthetic streamflow records were created for 11 of the 14 monitoring sites using a partial-record method or a drainage-area-ratio method. Streamflow records were obtained directly from an operating, long-term streamgage at one monitoring site, and from discontinued streamgages at two monitoring sites. For 10 sites analyzed using the partial-record method, discrete measurements were related to daily mean streamflow at a nearby, telemetered "index" streamgage. Resulting regression equations were used to estimate daily mean and annual peak streamflow at the monitoring sites during the full period of record for the index sites. A synthetic streamflow record for Sheep Creek was developed using a drainage-area-ratio method, because measured streamflows did not relate well to any index site to allow use of the partial-record method. The synthetic and actual daily mean streamflow records were used to estimate daily mean streamflow that was exceeded 80, 50, and 20 percent of the time (80-, 50-, and 20-percent exceedances) for bimonthly and annual periods. Bankfull streamflow statistics were calculated by fitting the synthetic and actual annual peak streamflow records to a log Pearson Type III distribution using Bulletin 17B guidelines in the U.S. Geological Survey PeakFQ program. The coefficients of determination (R2) for the regressions between the monitoring and index sites ranged from 0.74 for Wickahoney Creek to 0.98 for the West Fork Bruneau River and Deep Creek. Confidence in the computed streamflow statistics is highest for the East Fork Owyhee River and the West Fork Bruneau River on the basis of regression statistics, visual fit of the related data, and the range and number of streamflow measurements. Streamflow statistics for sites with the greatest uncertainty included Big Jacks, Little Jacks, Cottonwood, Wickahoney, and Sheep Creeks. The uncertainty in computed streamflow statistics was due to a number of factors, including the distance of index sites relative to monitoring sites, relatively low streamflow conditions that occurred during the study, and the limited number and range of streamflow measurements. However, the computed streamflow statistics are considered the best possible estimates given available datasets in the remote study area.
Streamflow measurements over a wider range of hydrologic and climatic conditions would improve the relations between streamflow characteristics at monitoring and index sites. Additionally, field surveys are needed to verify if the streamflows selected for the water rights claims are sufficient for maintaining outstanding remarkable values in the Wild and Scenic rivers included in the study.
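
    The 80-, 50-, and 20-percent exceedance statistics referenced above are simply percentiles of the daily mean streamflow record. The sketch below computes them for a synthetic record; the flow values and period definition are assumptions for illustration, not data from the study.

```python
import numpy as np

def exceedance_flow(daily_flow, pct_exceeded):
    """Flow equaled or exceeded `pct_exceeded` percent of the time.

    The p-percent exceedance flow is the (100 - p)th percentile of the
    daily mean streamflow record.
    """
    return np.percentile(np.asarray(daily_flow, dtype=float), 100 - pct_exceeded)

# Hypothetical synthetic daily mean flows (cfs) for one bimonthly period over 20 years
rng = np.random.default_rng(7)
flows = rng.lognormal(mean=3.0, sigma=0.6, size=15 * 20)

for p in (80, 50, 20):
    print(f"{p}-percent exceedance flow: {exceedance_flow(flows, p):.1f} cfs")
```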

  17. Establishing Statistical Equivalence of Data from Different Sampling Approaches for Assessment of Bacterial Phenotypic Antimicrobial Resistance

    PubMed Central

    2018-01-01

    ABSTRACT To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types. Also, different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from urine, fecal, and blood human samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the fluoroquinolone (ciprofloxacin) use to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. Data users (e.g., microbiologists and epidemiologists) may then interpret practical relevance of the difference. IMPORTANCE Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and determination of the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether that present difference is practically relevant. PMID:29475868
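
    Statistical equivalence of two MIC distributions is commonly assessed on the log2 (dilution-step) scale with two one-sided tests (TOST). The sketch below is a generic TOST on synthetic log2(MIC) values with an assumed one-dilution-step margin; it does not reproduce the authors' exact procedure or data.

```python
import numpy as np
from scipy import stats

def tost_equivalence(x, y, margin):
    """Two one-sided t-tests (TOST) for equivalence of mean log2(MIC) values.

    The groups are declared equivalent (at the 5% level) if both one-sided
    p-values fall below 0.05 for the chosen equivalence margin.
    """
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / len(x) + np.var(y, ddof=1) / len(y))
    df = len(x) + len(y) - 2                              # simple pooled-df approximation
    p_lower = stats.t.sf((diff + margin) / se, df)        # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)       # H0: diff >= +margin
    return diff, max(p_lower, p_upper)

# Hypothetical log2(MIC) values for isolates from two sample types
rng = np.random.default_rng(11)
mic_a = rng.normal(2.0, 1.0, 40)
mic_b = rng.normal(2.2, 1.0, 40)
diff, p = tost_equivalence(mic_a, mic_b, margin=1.0)      # margin of one dilution step
print(f"mean difference = {diff:.2f} log2 steps, TOST p = {p:.3f}")
```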

  18. Establishing Statistical Equivalence of Data from Different Sampling Approaches for Assessment of Bacterial Phenotypic Antimicrobial Resistance.

    PubMed

    Shakeri, Heman; Volkova, Victoriya; Wen, Xuesong; Deters, Andrea; Cull, Charley; Drouillard, James; Müller, Christian; Moradijamei, Behnaz; Jaberi-Douraki, Majid

    2018-05-01

    To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types. Also, different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from urine, fecal, and blood human samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the fluoroquinolone (ciprofloxacin) use to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. Data users (e.g., microbiologists and epidemiologists) may then interpret practical relevance of the difference. IMPORTANCE Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and determination of the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether that present difference is practically relevant. Copyright © 2018 Shakeri et al.

  19. On the use of attractor dimension as a feature in structural health monitoring

    USGS Publications Warehouse

    Nichols, J.M.; Virgin, L.N.; Todd, M.D.; Nichols, J.D.

    2003-01-01

    Recent works in the vibration-based structural health monitoring community have emphasised the use of correlation dimension as a discriminating statistic in separating a damaged from an undamaged response. This paper explores the utility of attractor dimension as a 'feature' and offers some comparisons between different metrics reflecting dimension. The focus is on evaluating the performance of two different measures of dimension as damage indicators in a structural health monitoring context. Results indicate that the correlation dimension is probably a poor choice of statistic for the purpose of signal discrimination. Other measures of dimension may be used for the same purposes with a higher degree of statistical reliability. The question of competing methodologies is placed in a hypothesis testing framework and answered with experimental data taken from a cantilevered beam.
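
    Correlation dimension, the statistic questioned above, is usually estimated from the Grassberger-Procaccia correlation sum. The sketch below computes it on a synthetic attractor (a noisy circle); the embedding, radii, and data are assumptions for illustration, not the paper's beam experiment.

```python
import numpy as np

def correlation_sum(points, r):
    """Fraction of point pairs closer than radius r (Grassberger-Procaccia)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return np.mean(d[iu] < r)

def correlation_dimension(points, radii):
    """Slope of log C(r) versus log r, a crude estimate of attractor dimension."""
    c = np.array([correlation_sum(points, r) for r in radii])
    mask = c > 0
    slope, _ = np.polyfit(np.log(radii[mask]), np.log(c[mask]), 1)
    return slope

# Hypothetical delay-embedded structural response (here simply a noisy circle)
rng = np.random.default_rng(3)
t = rng.uniform(0, 2 * np.pi, 800)
pts = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(0, 0.01, (800, 2))
radii = np.logspace(-1.5, -0.3, 10)
print("estimated correlation dimension:", round(correlation_dimension(pts, radii), 2))
```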

  20. Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa

    EPA Science Inventory

    Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...

  1. Additive hazards regression and partial likelihood estimation for ecological monitoring data across space.

    PubMed

    Lin, Feng-Chang; Zhu, Jun

    2012-01-01

    We develop continuous-time models for the analysis of environmental or ecological monitoring data such that subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take on flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with a homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that the statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a plantation in Wisconsin.

  2. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics-that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
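
    The distinction above between frequency statistics and context-based statistics can be made concrete with a first-order Markov generator: under frequency statistics each symbol is drawn independently from a fixed distribution, whereas under context-based statistics the next symbol depends on the current one. The transition matrix and symbols below are illustrative assumptions, not the study's stimuli.

```python
import numpy as np

rng = np.random.default_rng(0)
symbols = ["A", "B", "C", "D"]

# Context-based statistics: the next symbol's probability depends on the current symbol
transition = np.array([
    [0.1, 0.7, 0.1, 0.1],
    [0.1, 0.1, 0.7, 0.1],
    [0.1, 0.1, 0.1, 0.7],
    [0.7, 0.1, 0.1, 0.1],
])

def generate_sequence(length, start=0):
    seq, state = [], start
    for _ in range(length):
        state = rng.choice(len(symbols), p=transition[state])
        seq.append(symbols[state])
    return seq

print("".join(generate_sequence(40)))
```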

  3. Clustering and Flow Conservation Monitoring Tool for Software Defined Networks

    PubMed Central

    Puente Fernández, Jesús Antonio

    2018-01-01

    Prediction systems face challenges on two fronts: the relation between video quality and observed session features, and the dynamic changes in video quality. Software Defined Networking (SDN) is a network architecture concept that separates the control plane (controller) from the data plane (switches) in network devices. Due to the existence of the southbound interface, it is possible to deploy monitoring tools to obtain the network status and retrieve collected statistics. Therefore, obtaining the most accurate statistics depends on the strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure the traffic flow in SDN networks. The algorithm groups network switches into clusters according to their number of ports so that different monitoring techniques can be applied. The grouping avoids monitoring queries to switches with common characteristics and thereby omits redundant information. In this way, the present proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments and comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, which maintains similar measured values while decreasing the number of queries to the switches. PMID:29614049
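
    The core idea of grouping switches by port count so that one monitoring strategy serves a whole cluster can be sketched in a few lines. This is a simplified illustration with hypothetical switch names; it omits the paper's query-scheduling details.

```python
from collections import defaultdict

# Hypothetical inventory of SDN switches and their number of active ports
switches = {"s1": 4, "s2": 4, "s3": 8, "s4": 8, "s5": 8, "s6": 24}

# Group switches with a common port count so each cluster can share one
# monitoring strategy instead of every switch being queried individually.
clusters = defaultdict(list)
for switch, n_ports in switches.items():
    clusters[n_ports].append(switch)

for n_ports, members in sorted(clusters.items()):
    representative = members[0]      # poll one member, reuse its statistics for the rest
    print(f"{n_ports}-port cluster: {members} -> query {representative} only")
```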

  4. The statistical evaluation and comparison of ADMS-Urban model for the prediction of nitrogen dioxide with air quality monitoring network.

    PubMed

    Dėdelė, Audrius; Miškinytė, Auksė

    2015-09-01

    In many countries, road traffic is one of the main sources of air pollution associated with adverse effects on human health and the environment. Nitrogen dioxide (NO2) is considered to be a measure of traffic-related air pollution, with concentrations tending to be higher near highways, along busy roads, and in city centers, and exceedances are mainly observed at measurement stations located close to traffic. In order to assess the air quality in the city and the impact of air pollution on public health, air quality models are used. However, before a model can be used for these purposes, it is important to evaluate the accuracy of dispersion modelling, one of the most widely used methods. Monitoring and dispersion modelling are two components of an air quality monitoring system (AQMS), and their statistical comparison was made in this research. The evaluation of the Atmospheric Dispersion Modelling System (ADMS-Urban) was made by comparing monthly modelled NO2 concentrations with the data from continuous air quality monitoring stations in Kaunas city. The statistical measures of model performance were calculated for annual and monthly concentrations of NO2 for each monitoring station site. The spatial analysis was made using geographic information systems (GIS). The calculation of statistical parameters indicated a good ADMS-Urban model performance for the prediction of NO2. The results of this study showed that the agreement between modelled values and observations was better for traffic monitoring stations compared to the background and residential stations.
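
    Statistical measures of model performance of the kind referred to above typically include mean bias, RMSE, correlation, and the fraction of predictions within a factor of two of observations. The sketch below computes this assumed set on hypothetical monthly NO2 values; the metric choice and data are not taken from the paper.

```python
import numpy as np

def performance_stats(observed, modelled):
    """Common air-quality model evaluation statistics (assumed set for illustration)."""
    o, m = np.asarray(observed, float), np.asarray(modelled, float)
    bias = np.mean(m - o)                               # mean bias
    rmse = np.sqrt(np.mean((m - o) ** 2))               # root mean square error
    r = np.corrcoef(o, m)[0, 1]                         # Pearson correlation
    fac2 = np.mean((m / o > 0.5) & (m / o < 2.0))       # fraction within a factor of 2
    return {"bias": bias, "rmse": rmse, "r": r, "fac2": fac2}

# Hypothetical monthly NO2 concentrations (ug/m3) at one monitoring station
obs = np.array([28, 31, 25, 22, 19, 18, 17, 20, 24, 27, 30, 33])
mod = np.array([25, 29, 27, 20, 17, 19, 15, 22, 26, 25, 28, 35])
print({k: round(v, 2) for k, v in performance_stats(obs, mod).items()})
```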

  5. Cost considerations for long-term ecological monitoring

    USGS Publications Warehouse

    Caughlan, L.; Oakley, K.L.

    2001-01-01

    For an ecological monitoring program to be successful over the long term, the perceived benefits of the information must justify the cost. Financial limitations will always restrict the scope of a monitoring program, hence the program's focus must be carefully prioritized. Clearly identifying the costs and benefits of a program will assist in this prioritization process, but this is easier said than done. Frequently, the true costs of monitoring are not recognized and are, therefore, underestimated. Benefits are rarely evaluated, because they are difficult to quantify. The intent of this review is to assist the designers and managers of long-term ecological monitoring programs by providing a general framework for building and operating a cost-effective program. Previous considerations of monitoring costs have focused on sampling design optimization. We present cost considerations of monitoring in a broader context. We explore monitoring costs, including both budgetary costs--what dollars are spent on--and economic costs, which include opportunity costs. Often, the largest portion of a monitoring program budget is spent on data collection, and other critical aspects of the program, such as scientific oversight, training, data management, quality assurance, and reporting, are neglected. Recognizing and budgeting for all program costs is therefore a key factor in a program's longevity. The close relationship between statistical issues and cost is discussed, highlighting the importance of sampling design, replication and power, and comparing the costs of alternative designs through pilot studies and simulation modeling. A monitoring program development process that includes explicit checkpoints for considering costs is presented. The first checkpoints occur during the setting of objectives and during sampling design optimization. The last checkpoint occurs once the basic shape of the program is known, and the costs and benefits, or alternatively the cost-effectiveness, of each program element can be evaluated. Moving into the implementation phase without careful evaluation of costs and benefits is risky because if costs are later found to exceed benefits, the program will fail. The costs of development, which can be quite high, will have been largely wasted. Realistic expectations of costs and benefits will help ensure that monitoring programs survive the early, turbulent stages of development and the challenges posed by fluctuating budgets during implementation.

  6. Health status monitoring for ICU patients based on locally weighted principal component analysis.

    PubMed

    Ding, Yangyang; Ma, Xin; Wang, Youqing

    2018-03-01

    Intelligent status monitoring for critically ill patients can help medical staff quickly discover and assess changes in disease and then devise an appropriate treatment strategy. However, the general-purpose monitoring models now widely used have difficulty adapting to changes in intensive care unit (ICU) patients' status because of their fixed pattern, and a more robust, efficient, and fast monitoring model should be developed for the individual. A data-driven learning approach combining locally weighted projection regression (LWPR) and principal component analysis (PCA) is first proposed and applied to monitor the nonlinear process of patients' health status in the ICU. LWPR is used to approximate the complex nonlinear process with local linear models, to which PCA can then be applied for status monitoring, and finally a global weighted statistic is obtained for detecting possible abnormalities. Moreover, some improved versions are developed, such as LWPR-MPCA and LWPR-JPCA, which also have superior performance. Eighteen subjects were selected from the Physiobank's Multi-parameter Intelligent Monitoring for Intensive Care II (MIMIC II) database, and two vital signs of each subject were chosen for online monitoring. The proposed method was compared with several existing methods including traditional PCA, partial least squares (PLS), just-in-time learning combined with modified PCA (L-PCA), and kernel PCA (KPCA). The experimental results demonstrated that the mean fault detection rate (FDR) of PCA can be improved by 41.7% after adding LWPR. The mean FDR of LWPR-MPCA was increased by 8.3%, compared with the latest reported method, L-PCA. Meanwhile, LWPR spent less training time than the others, especially KPCA. LWPR is introduced into ICU patient monitoring for the first time and achieves the best monitoring performance, including adaptability to changes in patient status and sensitivity for abnormality detection, as well as fast learning speed and low computational complexity. The algorithm is an excellent approach to establishing a personalized model for patients, a key direction for modern medicine, as well as to improving global monitoring performance. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
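
    A conventional PCA monitor of the kind LWPR extends locally can be sketched as below: fit PCA on in-control data and compute Hotelling's T2 in the retained subspace plus the squared prediction error (SPE/Q) in the residual space. The data and dimensions are assumptions; the paper's locally weighted variant and its global weighted statistic are not implemented here.

```python
import numpy as np

def fit_pca_monitor(X_train, n_components=2):
    """Fit a plain PCA monitoring model (T2 and SPE statistics) on in-control data."""
    mu, sigma = X_train.mean(0), X_train.std(0)
    Z = (X_train - mu) / sigma
    eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(eigval)[::-1]
    P = eigvec[:, order[:n_components]]          # retained loadings
    lam = eigval[order[:n_components]]           # retained eigenvalues
    return {"mu": mu, "sigma": sigma, "P": P, "lam": lam}

def monitor(model, x):
    z = (x - model["mu"]) / model["sigma"]
    t = z @ model["P"]                           # scores in the PCA subspace
    t2 = np.sum(t ** 2 / model["lam"])           # Hotelling T2
    residual = z - model["P"] @ t
    spe = residual @ residual                    # squared prediction error (Q statistic)
    return t2, spe

rng = np.random.default_rng(5)
X_train = rng.normal(size=(200, 4))              # hypothetical in-control vital signs
model = fit_pca_monitor(X_train)
print("normal sample:  ", monitor(model, rng.normal(size=4)))
print("abnormal sample:", monitor(model, rng.normal(size=4) + np.array([0, 4, 0, 0])))
```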

  7. STATISTICAL PERSPECTIVE ON THE DESIGN AND ANALYSIS ON NATURAL RESOURCE MONITORING PROGRAMS

    EPA Science Inventory

    Natural resource monitoring includes a wide variation in the type of natural resource monitored as well as in the objectives for the monitoring. Rather than address the entire breadth, the focus will be restricted to programs whose focus is to produce state, regional, or nationa...

  8. Information processing for aerospace structural health monitoring

    NASA Astrophysics Data System (ADS)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of current structural integrity and the prognostics necessary for planning and managing the future health of the structure in a cost effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.

  9. Fronthaul evolution: From CPRI to Ethernet

    NASA Astrophysics Data System (ADS)

    Gomes, Nathan J.; Chanclou, Philippe; Turnbull, Peter; Magee, Anthony; Jungnickel, Volker

    2015-12-01

    It is proposed that using Ethernet in the fronthaul, between base station baseband unit (BBU) pools and remote radio heads (RRHs), can bring a number of advantages, from use of lower-cost equipment, shared use of infrastructure with fixed access networks, to obtaining statistical multiplexing and optimised performance through probe-based monitoring and software-defined networking. However, a number of challenges exist: ultra-high-bit-rate requirements from the transport of increased bandwidth radio streams for multiple antennas in future mobile networks, and low latency and jitter to meet delay requirements and the demands of joint processing. A new fronthaul functional division is proposed which can alleviate the most demanding bit-rate requirements by transport of baseband signals instead of sampled radio waveforms, and enable statistical multiplexing gains. Delay and synchronisation issues remain to be solved.

  10. The Joint Experiment for Crop Assessment and Monitoring (JECAM): Synthetic Aperture Radar (SAR) Inter-Comparison Experiment

    NASA Astrophysics Data System (ADS)

    Dingle Robertson, L.; Hosseini, M.; Davidson, A. M.; McNairn, H.

    2017-12-01

    The Joint Experiment for Crop Assessment and Monitoring (JECAM) is the research and development branch of GEOGLAM (Group on Earth Observations Global Agricultural Monitoring), a G20 initiative to improve the global monitoring of agriculture through the use of Earth Observation (EO) data and remote sensing. JECAM partners represent a diverse network of researchers collaborating towards a set of best practices and recommendations for global agricultural analysis using EO data, with well monitored test sites covering a wide range of agriculture types, cropping systems and climate regimes. Synthetic Aperture Radar (SAR) for crop inventory and condition monitoring offers many advantages particularly the ability to collect data under cloudy conditions. The JECAM SAR Inter-Comparison Experiment is a multi-year, multi-partner project that aims to compare global methods for (1) operational SAR & optical; multi-frequency SAR; and compact polarimetry methods for crop monitoring and inventory, and (2) the retrieval of Leaf Area Index (LAI) and biomass estimations using models such as the Water Cloud Model (WCM) employing single frequency SAR; multi-frequency SAR; and compact polarimetry. The results from these activities will be discussed along with an examination of the requirements of a global experiment including best-date determination for SAR data acquisition, pre-processing techniques, in situ data sharing, model development and statistical inter-comparison of the results.
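
    The Water Cloud Model named above expresses total backscatter as a canopy term plus a soil term attenuated by the canopy. The sketch below implements the common first-order form; the A and B coefficients, vegetation descriptor, and incidence angle are placeholder values that would need calibration per crop and SAR frequency.

```python
import numpy as np

def water_cloud_backscatter(V, sigma0_soil, theta_deg, A, B):
    """Water Cloud Model: canopy backscatter plus two-way-attenuated soil backscatter.

    V is a vegetation descriptor (e.g., LAI or biomass); A and B are empirical
    coefficients. All numeric values used below are placeholders for illustration.
    """
    theta = np.deg2rad(theta_deg)
    tau2 = np.exp(-2.0 * B * V / np.cos(theta))          # two-way canopy attenuation
    sigma_canopy = A * V * np.cos(theta) * (1.0 - tau2)
    return sigma_canopy + tau2 * sigma0_soil

lai = np.linspace(0.5, 5.0, 10)
sigma = water_cloud_backscatter(V=lai, sigma0_soil=0.05, theta_deg=35, A=0.02, B=0.1)
print(np.round(sigma, 4))    # linear units; use 10*log10 to convert to dB if needed
```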

  11. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    EPA Science Inventory

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  12. Statistical analysis of environmental monitoring data: does a worst case time for monitoring clean rooms exist?

    PubMed

    Cundell, A M; Bean, R; Massimore, L; Maier, C

    1998-01-01

    To determine the relationship between the sampling time of the environmental monitoring, i.e., viable counts, in aseptic filling areas and the microbial count and frequency of alerts for air, surface and personnel microbial monitoring, statistical analyses were conducted on 1) the frequency of alerts versus the time of day for routine environmental sampling conducted in calendar year 1994, and 2) environmental monitoring data collected at 30-minute intervals during routine aseptic filling operations over two separate days in four different clean rooms with multiple shifts and equipment set-ups at a parenteral manufacturing facility. Statistical analyses showed that, except for one floor location that had a significantly higher number of counts but no alert- or action-level samplings in the first two hours of operation, there was no relationship between the number of counts and the time of sampling. Further studies over a 30-day period at the floor location showed no relationship between time of sampling and microbial counts. The conclusion reached in the study was that there is no worst case time for environmental monitoring at that facility and that sampling any time during the aseptic filling operation will give a satisfactory measure of the microbial cleanliness in the clean room during the set-up and aseptic filling operation.

  13. Spatial Statistical and Modeling Strategy for Inventorying and Monitoring Ecosystem Resources at Multiple Scales and Resolution Levels

    Treesearch

    Robin M. Reich; C. Aguirre-Bravo; M.S. Williams

    2006-01-01

    A statistical strategy for spatial estimation and modeling of natural and environmental resource variables and indicators is presented. This strategy is part of an inventory and monitoring pilot study that is being carried out in the Mexican states of Jalisco and Colima. Fine spatial resolution estimates of key variables and indicators are outputs that will allow the...

  14. Expert system for testing industrial processes and determining sensor status

    DOEpatents

    Gross, K.C.; Singer, R.M.

    1998-06-02

    A method and system are disclosed for monitoring both an industrial process and a sensor. The method and system include determining a minimum number of sensor pairs needed to test the industrial process as well as the sensor for evaluating the state of operation of both. The technique further includes generating a first and second signal characteristic of an industrial process variable. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the pair of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test. 24 figs.
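
    The processing chain described in the patent abstract, a difference function, a Fourier-mode composite, a residual, and a sequential probability ratio test, can be sketched as follows. The number of retained modes, the faulted-mean hypothesis, and the error rates are illustrative assumptions, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(8)
t = np.arange(2048) / 100.0

# Two hypothetical sensor signals tracking the same process variable
signal_a = np.sin(2 * np.pi * 0.5 * t) + rng.normal(0, 0.05, t.size)
signal_b = (np.sin(2 * np.pi * 0.5 * t) + 0.02 * np.sin(2 * np.pi * 3 * t)
            + rng.normal(0, 0.05, t.size))

# Difference function and a composite built from its dominant Fourier modes
diff = signal_a - signal_b
spectrum = np.fft.rfft(diff)
keep = np.argsort(np.abs(spectrum))[::-1][:5]          # a few largest modes (assumed)
composite_spec = np.zeros_like(spectrum)
composite_spec[keep] = spectrum[keep]
composite = np.fft.irfft(composite_spec, n=diff.size)

# The residual should be close to white noise; test its mean with a simple SPRT
residual = diff - composite
sigma = residual.std()
mu1 = 0.5 * sigma                                      # assumed faulted-mean hypothesis
llr = np.cumsum((mu1 / sigma**2) * (residual - mu1 / 2.0))   # cumulative log-likelihood ratio
alarm_threshold = np.log((1 - 0.01) / 0.01)            # upper SPRT threshold, ~1% error rates
print("alarm" if (llr > alarm_threshold).any() else "no alarm")
```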

  15. Expert system for testing industrial processes and determining sensor status

    DOEpatents

    Gross, Kenneth C.; Singer, Ralph M.

    1998-01-01

    A method and system for monitoring both an industrial process and a sensor. The method and system include determining a minimum number of sensor pairs needed to test the industrial process as well as the sensor for evaluating the state of operation of both. The technique further includes generating a first and second signal characteristic of an industrial process variable. After obtaining two signals associated with one physical variable, a difference function is obtained by determining the arithmetic difference between the pair of signals over time. A frequency domain transformation is made of the difference function to obtain Fourier modes describing a composite function. A residual function is obtained by subtracting the composite function from the difference function and the residual function (free of nonwhite noise) is analyzed by a statistical probability ratio test.

  16. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Harry C.; Fang, Ho T.

    1991-01-01

    The results of a four-year program to improve the strength and reliability of injection-molded silicon nitride are summarized. Statistically designed processing experiments were performed to identify and optimize critical processing parameters and compositions. Process improvements were monitored by strength testing at room and elevated temperatures and by microstructural characterization using optical microscopy, scanning electron microscopy, and scanning transmission electron microscopy. Processing modifications resulted in a 20 percent improvement in strength and a 72 percent improvement in Weibull slope over the baseline material. Additional sintering-aid screening and optimization experiments succeeded in developing a new composition (GN-10) capable of 581.2 MPa at 1399 C. A SiC whisker-toughened composite using this material as a matrix achieved a room temperature toughness of 6.9 MPa m(exp .5) by the chevron-notched bar technique. Exploratory experiments were conducted on injection molding of turbocharger rotors.

  17. Modified SPC for short run test and measurement process in multi-stations

    NASA Astrophysics Data System (ADS)

    Koh, C. K.; Chin, J. F.; Kamaruddin, S.

    2018-03-01

    Due to short production runs and the measurement error inherent in electronic test and measurement (T&M) processes, continuous quality monitoring through real-time statistical process control (SPC) is challenging. Industry practice allows the installation of a guard band, using measurement uncertainty to reduce the width of the acceptance limit as an indirect way to compensate for measurement errors. This paper presents a new SPC model combining a modified guard band with control charts (Z̄ chart and W chart) for short runs in T&M processes at multiple stations. The proposed model standardizes the observed value with the measurement target (T) and rationed measurement uncertainty (U). An S-factor (Sf) is introduced into the control limits to improve the sensitivity in detecting small shifts. The model was embedded in an automated quality control system and verified with an industrial case study.
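
    A short-run chart of this kind standardizes each measurement against its own target so that different products can share one chart, and tightens the acceptance limit by the measurement uncertainty (the guard band). The sketch below follows that idea with assumed formulas for the limit and for the role of the S-factor, since the paper's exact expressions are not given in the abstract.

```python
import numpy as np

def z_chart_point(x, target, uncertainty):
    """Standardize one measurement against its own target and uncertainty,
    so parts from different short runs can be plotted on a single chart."""
    return (x - target) / uncertainty

def guard_banded_limit(tolerance, uncertainty, s_factor=1.0):
    """Acceptance limit (in standardized units) tightened by the measurement
    uncertainty and scaled by an S-factor; the exact role of Sf is assumed here."""
    return s_factor * (tolerance - uncertainty) / uncertainty

# Hypothetical measurements from two different short-run products on one station
runs = [
    {"target": 10.0, "tol": 0.6, "U": 0.1, "x": [10.05, 9.95, 10.2]},
    {"target": 25.0, "tol": 1.0, "U": 0.2, "x": [25.3, 24.8, 25.9]},
]
for run in runs:
    limit = guard_banded_limit(run["tol"], run["U"], s_factor=0.9)
    z = [z_chart_point(v, run["target"], run["U"]) for v in run["x"]]
    flags = [abs(zi) > limit for zi in z]
    print([round(zi, 2) for zi in z], "limit ±", round(limit, 2), flags)
```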

  18. Data-quality measures for stakeholder-implemented watershed-monitoring programs

    USGS Publications Warehouse

    Greve, Adrienne I.

    2002-01-01

    Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.

  19. Entropy-based heavy tailed distribution transformation and visual analytics for monitoring massive network traffic

    NASA Astrophysics Data System (ADS)

    Han, Keesook J.; Hodge, Matthew; Ross, Virginia W.

    2011-06-01

    For monitoring network traffic, there is an enormous cost in collecting, storing, and analyzing network traffic datasets. Data-mining-based network traffic analysis is of growing interest in the cyber security community, but is computationally expensive for finding correlations between attributes in massive network traffic datasets. To lower the cost and reduce computational complexity, it is desirable to perform feasible statistical processing on effective reduced datasets instead of on the original full datasets. Because of the dynamic behavior of network traffic, traffic traces exhibit mixtures of heavy tailed statistical distributions or overdispersion. Heavy tailed network traffic characterization and visualization are important and essential tasks for measuring network performance and quality of service. However, heavy tailed distributions are limited in their ability to characterize real-time network traffic due to the difficulty of parameter estimation. The Entropy-Based Heavy Tailed Distribution Transformation (EHTDT) was developed to convert the heavy tailed distribution into a transformed distribution to find the linear approximation. The EHTDT linearization has the advantage of being amenable to characterizing and aggregating the overdispersion of network traffic in real time. Results of applying the EHTDT for innovative visual analytics to real network traffic data are presented.
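
    The abstract does not give the EHTDT formulas, so the sketch below shows a conventional alternative check for heavy-tailed behavior instead: the upper tail of a power-law-like sample is roughly linear on log-log CCDF axes, and its slope approximates the negative tail index. The synthetic "flow sizes" and fit range are assumptions.

```python
import numpy as np

def ccdf_loglog_slope(samples):
    """Slope of the empirical CCDF on log-log axes over the upper half of the tail."""
    x = np.sort(np.asarray(samples, float))
    ccdf = 1.0 - np.arange(1, x.size + 1) / x.size
    tail = slice(x.size // 2, -1)                 # exclude the last point, where CCDF is 0
    slope, _ = np.polyfit(np.log(x[tail]), np.log(ccdf[tail]), 1)
    return slope

rng = np.random.default_rng(6)
flow_sizes = rng.pareto(1.5, 20000) + 1.0         # synthetic heavy-tailed "flow sizes"
print("log-log CCDF slope (about -tail index):", round(ccdf_loglog_slope(flow_sizes), 2))
```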

  20. Structural health monitoring of localized internal corrosion in high temperature piping for oil industry

    NASA Astrophysics Data System (ADS)

    Eason, Thomas J.; Bond, Leonard J.; Lozev, Mark G.

    2015-03-01

    Crude oil is becoming more corrosive with higher sulfur concentration, chloride concentration, and acidity. The increasing presence of naphthenic acids in oils, combined with various environmental conditions at temperatures between 150°C and 400°C, can lead to different internal degradation morphologies in refineries: uniform, non-uniform, or localized pitting. Improved corrosion measurement technology is needed to better quantify the integrity risk associated with refining crude oils of higher acid concentration. This paper first reports a consolidated review of corrosion inspection technology to establish the foundation for structural health monitoring of localized internal corrosion in high temperature piping. An approach under investigation is to employ flexible ultrasonic thin-film piezoelectric transducer arrays fabricated by the sol-gel manufacturing process for monitoring localized internal corrosion at temperatures up to 400°C. A statistical analysis of sol-gel transducer measurement accuracy using various time of flight thickness calculation algorithms on a flat calibration block is demonstrated.
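
    Time-of-flight thickness algorithms like those evaluated above ultimately rest on the pulse-echo relation thickness = velocity x time / 2. A minimal sketch, assuming a nominal room-temperature longitudinal velocity for carbon steel (the velocity drifts with temperature, which matters for high-temperature monitoring):

```python
# Pulse-echo wall thickness: the ultrasonic pulse crosses the wall twice,
# so thickness = velocity * time_of_flight / 2. The velocity below is an
# assumed nominal value for carbon steel at room temperature.
velocity_m_per_s = 5900.0
time_of_flight_s = 3.4e-6          # hypothetical echo delay
thickness_mm = velocity_m_per_s * time_of_flight_s / 2.0 * 1000.0
print(f"estimated wall thickness: {thickness_mm:.2f} mm")
```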

  1. Spatio-temporal statistical models for river monitoring networks.

    PubMed

    Clement, L; Thas, O; Vanrolleghem, P A; Ottoy, J P

    2006-01-01

    When introducing new wastewater treatment plants (WWTP), investors and policy makers often want to know if there indeed is a beneficial effect of the installation of a WWTP on the river water quality. Such an effect can be established in time as well as in space. Since both temporal and spatial components affect the output of a monitoring network, their dependence structure has to be modelled. River water quality data typically come from a river monitoring network for which the spatial dependence structure is unidirectional. Thus the traditional spatio-temporal models are not appropriate, as they cannot take advantage of this directional information. In this paper, a state-space model is presented in which the spatial dependence of the state variable is represented by a directed acyclic graph, and the temporal dependence by a first-order autoregressive process. The state-space model is extended with a linear model for the mean to estimate the effect of the activation of a WWTP on the dissolved oxygen concentration downstream.
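
    The model structure described above, unidirectional spatial dependence plus first-order autoregressive time dependence and a linear mean term for the WWTP effect, can be illustrated with a two-station simulation. All coefficients and the step effect below are assumptions for illustration, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 200
phi = 0.7                    # temporal AR(1) coefficient
rho = 0.5                    # influence of the upstream state (directed edge)
wwtp_effect = 1.0            # assumed step increase in DO after WWTP start-up
upstream = np.zeros(T)
downstream = np.zeros(T)

for t in range(1, T):
    # Unidirectional spatial dependence: downstream depends on upstream, not vice versa
    upstream[t] = phi * upstream[t - 1] + rng.normal(0, 0.3)
    downstream[t] = phi * downstream[t - 1] + rho * upstream[t - 1] + rng.normal(0, 0.3)

# Linear model for the mean: WWTP activated halfway through the record
do_downstream = 7.0 + downstream + wwtp_effect * (np.arange(T) >= T // 2)
print("mean DO before:", do_downstream[:T // 2].mean().round(2),
      "after:", do_downstream[T // 2:].mean().round(2))
```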

  2. Evaluation of Using Caged Clams to Monitor Contaminated Groundwater Exposure in the Near-Shore Environment of the Hanford Site 300 Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Kyle B.; Poston, Ted M.; Tiller, Brett L.

    2008-01-31

    The Asiatic clam (Corbicula fluminea) has been identified as an indicator species for locating and monitoring contaminated groundwater in the Columbia River. Pacific Northwest National Laboratory conducted a field study to explore the use of caged Asiatic clams to monitor contaminated groundwater upwelling in the 300 Area near-shore environment and assess seasonal differences in uranium uptake in relation to seasonal flow regimes of the Columbia River. Additional objectives included examining the potential effects of uranium accumulation on growth, survival, and tissue condition of the clams. This report documents the field conditions and procedures, laboratory procedures, and statistical analyses used in collecting samples and processing the data. Detailed results are presented and illustrated, followed by a discussion comparing uranium concentrations in Asiatic clams collected at the 300 Area and describing the relationship between river discharge, groundwater indicators, and uranium in clams. Growth and survival, histology, and other sources of environmental variation also are discussed.

  3. Relations Between Autonomous Motivation and Leisure-Time Physical Activity Participation: The Mediating Role of Self-Regulation Techniques.

    PubMed

    Nurmi, Johanna; Hagger, Martin S; Haukkala, Ari; Araújo-Soares, Vera; Hankonen, Nelli

    2016-04-01

    This study tested the predictive validity of a multitheory process model in which the effect of autonomous motivation from self-determination theory on physical activity participation is mediated by the adoption of self-regulatory techniques based on control theory. Finnish adolescents (N = 411, aged 17-19) completed a prospective survey including validated measures of the predictors and physical activity, at baseline and after one month (N = 177). A subsample used an accelerometer to objectively measure physical activity and further validate the physical activity self-report assessment tool (n = 44). Autonomous motivation statistically significantly predicted action planning, coping planning, and self-monitoring. Coping planning and self-monitoring mediated the effect of autonomous motivation on physical activity, although self-monitoring was the most prominent. Controlled motivation had no effect on self-regulation techniques or physical activity. Developing interventions that support autonomous motivation for physical activity may foster increased engagement in self-regulation techniques and positively affect physical activity behavior.

  4. A quantitative assessment of patient and nurse outcomes of bedside nursing report implementation.

    PubMed

    Sand-Jecklin, Kari; Sherman, Jay

    2014-10-01

    To quantify the outcomes of a practice change to a blended form of bedside nursing report. The literature identifies several benefits of bedside nursing shift report. However, published studies have not adequately quantified outcomes related to this process change, having either small or unreported sample sizes or not testing for statistical significance. Quasi-experimental pre- and postimplementation design. Seven medical-surgical units in a large university hospital implemented a blend of recorded and bedside nursing report. Outcomes monitored included patient and nursing satisfaction, patient falls, nursing overtime, and medication errors. We found statistically significant improvements postimplementation in four patient survey items specifically impacted by the change to bedside report. Nursing perceptions of report were significantly improved in the areas of patient safety, involvement in care, and nurse accountability postimplementation. However, there was a decline in nurse perception that report took a reasonable amount of time after bedside report implementation; contrary to these perceptions, there was no significant increase in nurse overtime. Patient falls at shift change decreased substantially after the implementation of bedside report. An intervening variable during the study period invalidated the comparison of medication errors pre- and postintervention. There was some indication from both patients and nurses that bedside report was not always consistently implemented. Several positive outcomes were documented in relation to the implementation of a blended bedside shift report, with few drawbacks. Nurse attitudes about report at the final data collection were more positive than at the initial postimplementation data collection. If properly implemented, nursing bedside report can result in improved patient and nursing satisfaction and patient safety outcomes. However, managers should involve staff nurses in the implementation process and continue to monitor consistency in report format as well as satisfaction with the process. © 2014 John Wiley & Sons Ltd.

  5. Computerized clinical decision support systems for chronic disease management: a decision-maker-researcher partnership systematic review.

    PubMed

    Roshanov, Pavel S; Misra, Shikha; Gerstein, Hertzel C; Garg, Amit X; Sebaldt, Rolf J; Mackay, Jean A; Weise-Kelly, Lorraine; Navarro, Tamara; Wilczynski, Nancy L; Haynes, R Brian

    2011-08-03

    The use of computerized clinical decision support systems (CCDSSs) may improve chronic disease management, which requires recurrent visits to multiple health professionals, ongoing disease and treatment monitoring, and patient behavior modification. The objective of this review was to determine if CCDSSs improve the processes of chronic care (such as diagnosis, treatment, and monitoring of disease) and associated patient outcomes (such as effects on biomarkers and clinical exacerbations). We conducted a decision-maker-researcher partnership systematic review. We searched MEDLINE, EMBASE, Ovid's EBM Reviews database, Inspec, and reference lists for potentially eligible articles published up to January 2010. We included randomized controlled trials that compared the use of CCDSSs to usual practice or non-CCDSS controls. Trials were eligible if at least one component of the CCDSS was designed to support chronic disease management. We considered studies 'positive' if they showed a statistically significant improvement in at least 50% of relevant outcomes. Of 55 included trials, 87% (n = 48) measured system impact on the process of care and 52% (n = 25) of those demonstrated statistically significant improvements. Sixty-five percent (36/55) of trials measured impact on, typically, non-major (surrogate) patient outcomes, and 31% (n = 11) of those demonstrated benefits. Factors of interest to decision makers, such as cost, user satisfaction, system interface and feature sets, unique design and deployment characteristics, and effects on user workflow were rarely investigated or reported. A small majority (just over half) of CCDSSs improved care processes in chronic disease management and some improved patient health. Policy makers, healthcare administrators, and practitioners should be aware that the evidence of CCDSS effectiveness is limited, especially with respect to the small number and size of studies measuring patient outcomes.

  6. Computerized clinical decision support systems for chronic disease management: A decision-maker-researcher partnership systematic review

    PubMed Central

    2011-01-01

    Background The use of computerized clinical decision support systems (CCDSSs) may improve chronic disease management, which requires recurrent visits to multiple health professionals, ongoing disease and treatment monitoring, and patient behavior modification. The objective of this review was to determine if CCDSSs improve the processes of chronic care (such as diagnosis, treatment, and monitoring of disease) and associated patient outcomes (such as effects on biomarkers and clinical exacerbations). Methods We conducted a decision-maker-researcher partnership systematic review. We searched MEDLINE, EMBASE, Ovid's EBM Reviews database, Inspec, and reference lists for potentially eligible articles published up to January 2010. We included randomized controlled trials that compared the use of CCDSSs to usual practice or non-CCDSS controls. Trials were eligible if at least one component of the CCDSS was designed to support chronic disease management. We considered studies 'positive' if they showed a statistically significant improvement in at least 50% of relevant outcomes. Results Of 55 included trials, 87% (n = 48) measured system impact on the process of care and 52% (n = 25) of those demonstrated statistically significant improvements. Sixty-five percent (36/55) of trials measured impact on, typically, non-major (surrogate) patient outcomes, and 31% (n = 11) of those demonstrated benefits. Factors of interest to decision makers, such as cost, user satisfaction, system interface and feature sets, unique design and deployment characteristics, and effects on user workflow were rarely investigated or reported. Conclusions A small majority (just over half) of CCDSSs improved care processes in chronic disease management and some improved patient health. Policy makers, healthcare administrators, and practitioners should be aware that the evidence of CCDSS effectiveness is limited, especially with respect to the small number and size of studies measuring patient outcomes. PMID:21824386

  7. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2005

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Estes, Steve; McNutt, Stephen R.

    2006-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988 (Figure 1). The primary objectives of the seismic program are the real-time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents calculated earthquake hypocenters and seismic phase arrival data, and details changes in the seismic monitoring program for the period January 1 through December 31, 2005. The AVO seismograph network was used to monitor the seismic activity at thirty-two volcanoes within Alaska in 2005 (Figure 1). The network was augmented by two new subnetworks to monitor the Semisopochnoi Island volcanoes and Little Sitkin Volcano. Seismicity at these volcanoes was still being studied at the end of 2005 and has not yet been added to the list of permanently monitored volcanoes in the AVO weekly update. Following an extended period of monitoring to determine the background seismicity at Mount Peulik, Ukinrek Maars, and Korovin Volcano, formal monitoring of these volcanoes began in 2005. AVO located 9,012 earthquakes in 2005. Monitoring highlights in 2005 include: (1) seismicity at Mount Spurr remaining above background, starting in February 2004, through the end of the year and into 2006; (2) an increase in seismicity at Augustine Volcano starting in May 2005, and continuing through the end of the year into 2006; (3) volcanic tremor and seismicity related to low-level strombolian activity at Mount Veniaminof in January to March and September; and (4) a seismic swarm at Tanaga Volcano in October and November. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field in 2005; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2005; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2005.

  8. Useful Life Prediction for Payload Carrier Hardware

    NASA Technical Reports Server (NTRS)

    Ben-Arieh, David

    2002-01-01

    The Space Shuttle has been identified for use through 2020. Payload carrier systems will be needed to support missions through the same time frame. To support the future decision-making process with reliable systems, it is necessary to analyze design integrity, identify possible sources of undesirable risk and recognize required upgrades for carrier systems. This project analyzed the information available regarding the carriers and estimated the probability of their becoming obsolete under different scenarios. In addition, this project resulted in a plan for an improved information system that will enhance monitoring and control of the various carriers. The information collected throughout this project is presented in this report as process flow, historical records, and statistical analysis.

  9. Software Analytical Instrument for Assessment of the Process of Casting Slabs

    NASA Astrophysics Data System (ADS)

    Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš

    2010-06-01

    The paper describes the original design, approach, and function of the software for assessing the slab casting process. The program system LITIOS was developed and implemented at EVRAZ Vitkovice Steel Ostrava on the equipment for continuous casting of steel (hereafter ECC). The system works on a data warehouse of casting process parameters and slab quality parameters. It enables an ECC technologist to analyze the course of a cast melt and, using statistical methods, to determine the influence of individual technological parameters on the quality of the final slabs. The system also enables long-term monitoring and optimization of production.

  10. Spatial statistical analysis of tree deaths using airborne digital imagery

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael

    2013-04-01

    High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques. Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. [Tree deaths over the area were detected in our previous work (Wallace et al., 2008).] The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).

  11. Statistical analysis plan for the Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART). A randomized controlled trial

    PubMed Central

    Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi

    2017-01-01

    Background The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective To describe the data management process and statistical analysis plan. Methods The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration ClinicalTrials.gov number, NCT01374022. PMID:28977255

  12. Statistical analysis of long-term monitoring data for persistent organic pollutants in the atmosphere at 20 monitoring stations broadly indicates declining concentrations.

    PubMed

    Kong, Deguo; MacLeod, Matthew; Hung, Hayley; Cousins, Ian T

    2014-11-04

    During recent decades concentrations of persistent organic pollutants (POPs) in the atmosphere have been monitored at multiple stations worldwide. We used three statistical methods to analyze a total of 748 time series of selected POPs in the atmosphere to determine if there are statistically significant reductions in levels of POPs that have had control actions enacted to restrict or eliminate manufacture, use and emissions. Significant decreasing trends were identified in 560 (75%) of the 748 time series collected from the Arctic, North America, and Europe, indicating that the atmospheric concentrations of these POPs are generally decreasing, consistent with the overall effectiveness of emission control actions. Statistically significant trends in synthetic time series could be reliably identified with the improved Mann-Kendall (iMK) test and the digital filtration (DF) technique in time series longer than 5 years. The temporal trends of new (or emerging) POPs in the atmosphere are often unclear because time series are too short. A statistical detrending method based on the iMK test was not able to identify abrupt changes in the rates of decline of atmospheric POP concentrations encoded into synthetic time series.
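
    As a concrete reference point, the following is a minimal sketch of the classical Mann-Kendall trend test (with the standard tie-corrected variance and continuity correction), not the improved iMK variant or the digital filtration technique used in the study; the concentration series is hypothetical.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Classical two-sided Mann-Kendall trend test.

    Returns the S statistic, the normal-approximation Z score and the p-value.
    This is the basic form, not the improved (iMK) variant cited in the study.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    # Variance of S under the null hypothesis, corrected for tied values.
    _, counts = np.unique(x, return_counts=True)
    var_s = (n * (n - 1) * (2 * n + 5)
             - np.sum(counts * (counts - 1) * (2 * counts + 5))) / 18.0
    # Continuity-corrected Z statistic.
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z, 2 * (1 - norm.cdf(abs(z)))

# Hypothetical annual mean concentrations (pg/m3) for one station and compound.
series = [12.1, 10.8, 11.5, 9.7, 9.9, 8.4, 8.8, 7.6, 7.9, 6.5]
print(mann_kendall(series))  # a negative S/Z indicates a decreasing trend
```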

  13. Adaptive online monitoring for ICU patients by combining just-in-time learning and principal component analysis.

    PubMed

    Li, Xuejian; Wang, Youqing

    2016-12-01

    Offline general-type models, developed from past datasets consisting of thousands of patients, are widely used for patient monitoring in intensive care units (ICUs). However, these models may fail to adapt to the changing states of ICU patients. Thus, to be more robust and effective, the monitoring models should be adaptable to individual patients. A novel combination of just-in-time learning (JITL) and principal component analysis (PCA), referred to as learning-type PCA (L-PCA), was proposed for adaptive online monitoring of patients in ICUs. JITL was used to gather the most relevant data samples for adaptive modeling of complex physiological processes. PCA was used to build an online individual-type model and calculate monitoring statistics, and then to judge whether the patient's status is normal or not. The adaptability of L-PCA lies in the usage of individual data and the continuous updating of the training dataset. Twelve subjects were selected from the Physiobank's Multi-parameter Intelligent Monitoring for Intensive Care II (MIMIC II) database, and five vital signs of each subject were chosen. The proposed method was compared with the traditional PCA and fast moving-window PCA (Fast MWPCA). The experimental results demonstrated that the fault detection rates increased by 20% and 47% compared with PCA and Fast MWPCA, respectively. L-PCA is introduced into ICU patient monitoring for the first time and achieves the best monitoring performance in terms of adaptability to changes in patient status and sensitivity for abnormality detection.
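
    For readers unfamiliar with PCA-based monitoring statistics, the following is a minimal sketch of the conventional (non-adaptive) PCA baseline that L-PCA is compared against: a model is fitted on normal-condition data and new samples are scored with Hotelling's T2 and the squared prediction error (SPE). The data and the number of retained components are illustrative, and the JITL sample-selection step is not implemented here.

```python
import numpy as np

def fit_pca_monitor(X_train, n_comp=2):
    """Fit a PCA monitoring model: scaling parameters, loadings, score variances."""
    mu, sigma = X_train.mean(0), X_train.std(0, ddof=1)
    Z = (X_train - mu) / sigma
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                       # loadings of the retained components
    lam = (s[:n_comp] ** 2) / (len(Z) - 1)  # variances of the retained scores
    return {"mu": mu, "sigma": sigma, "P": P, "lam": lam}

def monitor(model, x_new):
    """Return Hotelling T2 and SPE (Q) statistics for one new sample."""
    z = (x_new - model["mu"]) / model["sigma"]
    t = z @ model["P"]                      # scores in the principal subspace
    t2 = np.sum(t ** 2 / model["lam"])      # Hotelling T2
    resid = z - model["P"] @ t              # reconstruction residual
    return t2, resid @ resid                # (T2, SPE)

# Hypothetical use: 500 "normal" samples of 5 vital signs, then one new sample.
rng = np.random.default_rng(0)
X_normal = rng.normal(size=(500, 5))
mdl = fit_pca_monitor(X_normal, n_comp=2)
print(monitor(mdl, rng.normal(size=5)))
```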

  14. Evaluation of the Capillary Blood Glucose Self-monitoring Program

    PubMed Central

    Augusto, Mariana Cristina; Nitsche, Maria José Trevizani; Parada, Cristina Maria Garcia de Lima; Zanetti, Maria Lúcia; Carvalhaes, Maria Antonieta de Barros Leite

    2014-01-01

    OBJECTIVE: to evaluate the structure, process and results of the Capillary Blood Glucose Self-monitoring Program in a Brazilian city. METHOD: epidemiological, cross-sectional study. The methodological framework of Donabedian was used to construct indicators of structure, process and outcome. A random sample of enrolled users (n = 288) and 96 health professionals who worked in the program were studied. Two questionnaires constructed for this study were used, one for professionals and one for users, both containing data for the evaluation of structure, process and outcome. Anthropometric measures and laboratory results were collected by consulting the patients' health records. The analysis involved descriptive statistics. RESULTS: most of the professionals were not qualified to work in the program and were not knowledgeable about the set of criteria for patient registration. None of the patients received complete and correct orientation about the program, and the percentage with the skills to carry out self-care actions autonomously was 10%. As regards the result indicators, 86.4% of the patients and 81.3% of the professionals evaluated the program positively. CONCLUSION: the evaluation indicators revealed that one of the main objectives of the program, self-care skills, has not been achieved. PMID:25493676

  15. Soil Biogeochemical Properties and Erosion Source Prediction Model Summary for the Buffalo Bayou Watershed, Houston, Texas

    NASA Astrophysics Data System (ADS)

    Ahmed, I.

    2015-12-01

    We draw conclusions on the research output and findings from a 4-year multidisciplinary USDA-CBG collaborative program in sustainable integrated monitoring of soil organic carbon (SOC) loss prediction via erosion. The underlying method uses the state-of-the-art stable isotope science of sediment tracing under uncertain hydrologic influences. The research findings are rooted in the (i) application of Bayesian Markov Chain Monte Carlo statistical models to assess the relationship between rainfall-runoff and soil erosion in space and time, (ii) capture of the episodic nature of rainfall events and its role in the spatial distribution of SOC loss from water erosion, (iii) stable isotope composition guided fingerprinting (source and quantity) of eroded soil, and (iv) the creation of an integrated watershed-scale statistical soil loss monitoring model driven by spatial and temporal correlation of flow and stable isotope composition. The research theme was successfully applied to the urbanized Buffalo Bayou Watershed in Houston, Texas. The application brought to light novel conceptual outlines for future research, which will also be discussed in this deliverable to the AGU meeting. These include, but are not limited to: regional rainfall cluster research, physics of muddy river-bank soil and suspended sediment interaction, and friction & mobility that together make up the plasticity of soil aggregates that control erosion processes and landscape changes in a riparian corridor. References: Ahmed, I., Karim, A., Boutton, T.W., and Strom, K.B. (2013a). "Monitoring Soil Organic Carbon Loss from Erosion Using Stable Isotopes." Proc., Soil Carbon Sequestration, International Conference, May 26-29, Reykjavik, Iceland. Ahmed, I., Boutton, T.W., Strom, K.B., Karim, A., and Irvin-Smith, N. (2013b). "Soil carbon distribution and loss monitoring in the urbanized Buffalo Bayou watershed, Houston, Texas." Proc., 4th Annual All Investigators Meeting of the North American Carbon Program, February 4-7, Albuquerque, NM. Fox, J.F. and Papanicolaou, A.N. (2008). An un-mixing model to study watershed erosion processes. Advances in Water Resources, 31, 96-108.

  16. Improving academic leadership and oversight in large industry-sponsored clinical trials: the ARO-CRO model

    PubMed Central

    Goldenberg, Neil A.; Spyropoulos, Alex C.; Halperin, Jonathan L.; Kessler, Craig M.; Schulman, Sam; Turpie, Alexander G. G.; Skene, Allan M.; Cutler, Neal R.

    2011-01-01

    Standards for clinical trial design, execution, and publication have increased in recent years. However, the current structure for interaction among the pharmaceutical sponsor funding a drug or device development program, the contract research organization (CRO) that typically assists in executing the trial, regulatory agencies, and academicians, provides inadequate leadership and oversight of the development process. Conventional academic steering committees are not provided with the independent infrastructure by which to verify statistical analyses and conclusions regarding safety and efficacy. We propose an alternative approach centered on partnerships between CROs and university-based academic research organizations (AROs). In this model, the ARO takes responsibility for processes that address journal requirements and regulatory expectations for independent academic oversight (including oversight of Steering Committee and Data and Safety Monitoring Board activities), whereas the CRO provides infrastructure for efficient trial execution, site monitoring, and data management. The ARO engages academic experts throughout the trial process and minimizes conflicts of interest in individual industry relationships via diversification of sponsors, agents, and therapeutic areas. Although numerous models can be entertained, the ARO-CRO model is uniquely structured to meet the demand for greater assurance of integrity in clinical trials and the needs of each stakeholder in the process. PMID:21068436

  17. Using the Leitz LMS 2000 for monitoring and improvement of an e-beam

    NASA Astrophysics Data System (ADS)

    Blaesing-Bangert, Carola; Roeth, Klaus-Dieter; Ogawa, Yoichi

    1994-11-01

    Kaizen--continuous improvement--is a philosophy practiced in Japan that is also becoming increasingly important in Western companies. To implement this philosophy in the semiconductor industry, a high-performance metrology tool is essential to determine the status of production quality periodically. An important prerequisite for statistical process control is the high stability of the metrology tool over several months or years; the tool-induced shift should be as small as possible. The pattern placement metrology tool Leitz LMS 2000 has been used in a major European mask house for several years now to qualify masks within the tightest specifications and to monitor the MEBES III and its cassettes. The mask shop's internal specification for the long-term repeatability of the pattern placement metrology tool is 19 nm instead of the 42 nm specified by the supplier of the tool. The process capability of the LMS 2000 over 18 months is represented by an average Cpk value of 2.8 for orthogonality, 5.2 for x-scaling, and 3.0 for y-scaling. The process capability of the MEBES III and its cassettes has been improved over the past years. For instance, 100% of the masks produced with a process tolerance of +/- 200 nm are now within this limit.
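
    To illustrate how capability figures such as the Cpk values quoted above are obtained, here is a minimal sketch computing Cp and Cpk from repeated measurements against a symmetric specification; the deviation data and the ±19 nm limits applied to them are placeholders, not measurements from the study.

```python
import numpy as np

def cp_cpk(data, lsl, usl):
    """Process capability indices from measured data and specification limits."""
    mu = np.mean(data)
    sigma = np.std(data, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                    # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)       # capability accounting for centering
    return cp, cpk

# Hypothetical x-scaling deviations (nm) evaluated against a +/-19 nm internal spec.
deviations = np.random.default_rng(1).normal(loc=0.5, scale=1.2, size=60)
print(cp_cpk(deviations, lsl=-19.0, usl=19.0))
```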

  18. Performance analysis of gamma ray spectrometric parameters on digital signal and analog signal processing based MCA systems using NaI(Tl) detector.

    PubMed

    Kukreti, B M; Sharma, G K

    2012-05-01

    Accurate and rapid estimation of ppm-range uranium and thorium in geological and rock samples is most useful for ongoing uranium investigations and the identification of favorable radioactive zones in exploration field areas. In this study, given the existing 5 in. × 4 in. NaI(Tl) detector setup and the prevailing background and time constraints, an enhanced geometrical setup was worked out to improve the minimum detection limits for the primordial radioelements K(40), U(238) and Th(232). This geometrical setup has been integrated with the newly introduced digital signal processing (DSP) based MCA system for the routine spectrometric analysis of low-concentration rock samples. The stability of the DSP-based MCA system and of its predecessor, the NIM-bin-based MCA system, was monitored during long counting hours using the concept of statistical process control. Results monitored over a time span of a few months have been quantified in terms of spectrometer parameters such as Compton stripping constants and channel sensitivities, which are used for evaluating primordial radioelement concentrations (K(40), U(238) and Th(232)) in geological samples. Results indicate stable dMCA performance, with a tendency toward higher relative variance about the mean, particularly for the Compton stripping constants. Copyright © 2012 Elsevier Ltd. All rights reserved.
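
    As an illustration of how stripping constants and channel sensitivities enter the concentration calculation, the sketch below solves a generic three-window system (net count rates = sensitivity matrix × concentrations); the matrix and count rates are placeholders and do not reflect the calibration of the instrument used in the study.

```python
import numpy as np

# Rows: K, U, Th energy windows; columns: sensitivity (counts/s per unit
# concentration) of each window to K (%), eU (ppm) and eTh (ppm).
# All values are placeholders for illustration, not a real calibration.
S = np.array([[3.10, 0.95, 0.55],
              [0.00, 0.70, 0.32],
              [0.00, 0.05, 0.41]])

net_counts = np.array([20.3, 9.5, 4.8])   # background-corrected window count rates (counts/s)

conc_K, conc_U, conc_Th = np.linalg.solve(S, net_counts)
print(f"K = {conc_K:.2f} %, eU = {conc_U:.2f} ppm, eTh = {conc_Th:.2f} ppm")
```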

  19. Efficient species-level monitoring at the landscape scale

    Treesearch

    Barry R. Noon; Larissa L. Bailey; Thomas D. Sisk; Kevin S. McKelvey

    2012-01-01

    Monitoring the population trends of multiple animal species at a landscape scale is prohibitively expensive. However, advances in survey design, statistical methods, and the ability to estimate species presence on the basis of detection-nondetection data have greatly increased the feasibility of species-level monitoring. For example, recent advances in monitoring make...

  20. A method for estimation of bias and variability of continuous gas monitor data: application to carbon monoxide monitor accuracy.

    PubMed

    Shulman, Stanley A; Smith, Jerome P

    2002-01-01

    A method is presented for the evaluation of the bias, variability, and accuracy of gas monitors. This method is based on using the parameters for the fitted response curves of the monitors. Thereby, variability between calibrations, between dates within each calibration period, and between different units can be evaluated at several different standard concentrations. By combining variability information with bias information, accuracy can be assessed. An example using carbon monoxide monitor data is provided. Although the most general statistical software required for these tasks is not available on a spreadsheet, when the same number of dates in a calibration period are evaluated for each monitor unit, the calculations can be done on a spreadsheet. An example of such calculations, together with the formulas needed for their implementation, is provided. In addition, the methods can be extended by use of appropriate statistical models and software to evaluate monitor trends within calibration periods, as well as consider the effects of other variables, such as humidity and temperature, on monitor variability and bias.
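
    The following is a minimal sketch of one common way to combine bias and variability into a single accuracy figure at a given standard concentration (|relative bias| + 1.96 × relative SD, assuming roughly normal errors); the readings are hypothetical, and the exact formulation in the paper, which works from fitted response-curve parameters, may differ in detail.

```python
import numpy as np

def accuracy_at_concentration(readings, true_conc):
    """Combine bias and variability of monitor readings at one standard concentration.

    Accuracy here is |relative bias| + 1.96 * relative SD, i.e. roughly the half-width
    within which ~95% of readings fall under normal errors. This is a common
    convention and may differ from the paper's exact formulation.
    """
    readings = np.asarray(readings, dtype=float)
    rel_bias = (readings.mean() - true_conc) / true_conc
    rel_sd = readings.std(ddof=1) / true_conc
    return rel_bias, rel_sd, abs(rel_bias) + 1.96 * rel_sd

# Hypothetical carbon monoxide monitor readings (ppm) against a 50 ppm standard.
print(accuracy_at_concentration([48.7, 49.5, 51.2, 50.3, 47.9, 49.1], 50.0))
```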

  1. Assessment of disinfection of hospital surfaces using different monitoring methods

    PubMed Central

    Ferreira, Adriano Menis; de Andrade, Denise; Rigotti, Marcelo Alessandro; de Almeida, Margarete Teresa Gottardo; Guerra, Odanir Garcia; dos Santos, Aires Garcia

    2015-01-01

    OBJECTIVE: to assess the efficiency of cleaning/disinfection of surfaces of an Intensive Care Unit. METHOD: descriptive-exploratory study with quantitative approach conducted over the course of four weeks. Visual inspection, bioluminescence adenosine triphosphate and microbiological indicators were used to indicate cleanliness/disinfection. Five surfaces (bed rails, bedside tables, infusion pumps, nurses' counter, and medical prescription table) were assessed before and after the use of rubbing alcohol at 70% (w/v), totaling 160 samples for each method. Non-parametric tests were used considering statistically significant differences at p<0.05. RESULTS: after the cleaning/disinfection process, 87.5, 79.4 and 87.5% of the surfaces were considered clean using the visual inspection, bioluminescence adenosine triphosphate and microbiological analyses, respectively. A statistically significant decrease was observed in the disapproval rates after the cleaning process considering the three assessment methods; the visual inspection was the least reliable. CONCLUSION: the cleaning/disinfection method was efficient in reducing microbial load and organic matter of surfaces, however, these findings require further study to clarify aspects related to the efficiency of friction, its frequency, and whether or not there is association with other inputs to achieve improved results of the cleaning/disinfection process. PMID:26312634

  2. Assessment of disinfection of hospital surfaces using different monitoring methods.

    PubMed

    Ferreira, Adriano Menis; de Andrade, Denise; Rigotti, Marcelo Alessandro; de Almeida, Margarete Teresa Gottardo; Guerra, Odanir Garcia; dos Santos Junior, Aires Garcia

    2015-01-01

    To assess the efficiency of cleaning/disinfection of surfaces of an Intensive Care Unit. Descriptive-exploratory study with quantitative approach conducted over the course of four weeks. Visual inspection, bioluminescence adenosine triphosphate and microbiological indicators were used to indicate cleanliness/disinfection. Five surfaces (bed rails, bedside tables, infusion pumps, nurses' counter, and medical prescription table) were assessed before and after the use of rubbing alcohol at 70% (w/v), totaling 160 samples for each method. Non-parametric tests were used considering statistically significant differences at p<0.05. After the cleaning/disinfection process, 87.5, 79.4 and 87.5% of the surfaces were considered clean using the visual inspection, bioluminescence adenosine triphosphate and microbiological analyses, respectively. A statistically significant decrease was observed in the disapproval rates after the cleaning process considering the three assessment methods; the visual inspection was the least reliable. The cleaning/disinfection method was efficient in reducing microbial load and organic matter of surfaces, however, these findings require further study to clarify aspects related to the efficiency of friction, its frequency, and whether or not there is association with other inputs to achieve improved results of the cleaning/disinfection process.

  3. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (Pp, Ppk, and Ppm) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).

  4. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).
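
    As an illustration of the chart and index families discussed above, the sketch below computes an EWMA control chart and a long-term performance index against the ±4% clinical tolerance; the smoothing constant, control-limit multiplier, and dose-deviation data are illustrative choices, not the authors' settings.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0):
    """EWMA control chart: smoothed values plus time-varying control limits."""
    x = np.asarray(x, dtype=float)
    mu0, sigma = x.mean(), x.std(ddof=1)
    z = np.empty_like(x)
    prev = mu0
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev
        z[i] = prev
    k = np.arange(1, len(x) + 1)
    half_width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
    return z, mu0 - half_width, mu0 + half_width

def ppk(x, lsl, usl):
    """Long-term performance index against the clinical tolerances."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical percent deviations between measured and calculated doses.
dev = np.random.default_rng(2).normal(0.3, 1.1, size=40)
z, lcl, ucl = ewma_chart(dev)
print("out-of-control points:", np.where((z < lcl) | (z > ucl))[0])
print("Ppk vs +/-4% tolerance:", round(ppk(dev, -4.0, 4.0), 2))
```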

  5. A flexibly shaped space-time scan statistic for disease outbreak detection and monitoring.

    PubMed

    Takahashi, Kunihiko; Kulldorff, Martin; Tango, Toshiro; Yih, Katherine

    2008-04-11

    Early detection of disease outbreaks enables public health officials to implement disease control and prevention measures at the earliest possible time. A time periodic geographical disease surveillance system based on a cylindrical space-time scan statistic has been used extensively for disease surveillance along with the SaTScan software. In the purely spatial setting, many different methods have been proposed to detect spatial disease clusters. In particular, some spatial scan statistics are aimed at detecting irregularly shaped clusters which may not be detected by the circular spatial scan statistic. Based on the flexible purely spatial scan statistic, we propose a flexibly shaped space-time scan statistic for early detection of disease outbreaks. The performance of the proposed space-time scan statistic is compared with that of the cylindrical scan statistic using benchmark data. In order to compare their performances, we have developed a space-time power distribution by extending the purely spatial bivariate power distribution. Daily syndromic surveillance data in Massachusetts, USA, are used to illustrate the proposed test statistic. The flexible space-time scan statistic is well suited for detecting and monitoring disease outbreaks in irregularly shaped areas.
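
    The following is a minimal sketch of the Poisson likelihood-ratio evaluation that underlies such scan statistics, applied to a small set of candidate zone × time windows; generating flexibly shaped zones (connected subsets of areas) and the Monte Carlo significance testing are not implemented here, and all data are simulated.

```python
import numpy as np

def poisson_llr(c_in, e_in, c_tot, e_tot):
    """Simplified Poisson log-likelihood ratio for one candidate window.

    Only elevated-rate windows are of interest; SaTScan additionally conditions
    on the total observed count, which this simplified form omits.
    """
    c_out, e_out = c_tot - c_in, e_tot - e_in
    if c_in / e_in <= c_out / e_out:
        return 0.0
    llr = 0.0
    if c_in > 0:
        llr += c_in * np.log(c_in / e_in)
    if c_out > 0:
        llr += c_out * np.log(c_out / e_out)
    return llr

def scan(counts, expected, zones, max_days=7):
    """Scan candidate zones x recent time windows; counts/expected are (areas, days)."""
    c_tot, e_tot = counts.sum(), expected.sum()
    n_days = counts.shape[1]
    best = (0.0, None)
    for zone in zones:                          # zone = tuple of area indices
        for w in range(1, max_days + 1):        # windows ending at the most recent day
            c_in = counts[list(zone), n_days - w:].sum()
            e_in = expected[list(zone), n_days - w:].sum()
            llr = poisson_llr(c_in, e_in, c_tot, e_tot)
            if llr > best[0]:
                best = (llr, (zone, w))
    return best   # significance would be assessed by Monte Carlo replication

# Hypothetical data: 5 areas x 30 days of syndromic counts and baseline expectations.
rng = np.random.default_rng(3)
expected = np.full((5, 30), 2.0)
counts = rng.poisson(expected)
counts[1:3, -3:] += 6                           # injected outbreak in areas 1-2, last 3 days
zones = [(0,), (1,), (2,), (3,), (4,), (1, 2), (0, 1), (2, 3)]
print(scan(counts, expected, zones))
```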

  6. Fundamentals and Catalytic Innovation: The Statistical and Data Management Center of the Antibacterial Resistance Leadership Group

    PubMed Central

    Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T.; Pereira, Carol; Rosenkranz, Susan L.; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu (Jeanne); Wang, Rui; Lok, Judith

    2017-01-01

    Abstract The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. PMID:28350899

  7. Real-time monitoring of the budding index in Saccharomyces cerevisiae batch cultivations with in situ microscopy.

    PubMed

    Marbà-Ardébol, Anna-Maria; Emmerich, Jörn; Muthig, Michael; Neubauer, Peter; Junne, Stefan

    2018-05-15

    The morphology of yeast cells changes during budding, depending on the growth rate and cultivation conditions. A photo-optical microscope was adapted and used to observe such morphological changes of individual cells directly in the cell suspension. In order to obtain statistically representative samples of the population without the influence of sampling, in situ microscopy (ISM) was applied in the different phases of a Saccharomyces cerevisiae batch cultivation. The real-time measurement was performed by coupling a photo-optical probe to an automated image analysis based on a neural network approach. Automatic cell recognition and classification of budding and non-budding cells was conducted successfully. Deviations between automated and manual counting were considerably low. A differentiation of growth activity across all process stages of a batch cultivation in complex media became feasible. An increased homogeneity among the population during the growth phase was well observable. At growth retardation, the portion of smaller cells increased due to a reduced bud formation. The maturation state of the cells was monitored by determining the budding index as the ratio between the number of cells detected with buds and the total number of cells. A linear correlation between the budding index as monitored with ISM and the growth rate was found. It is shown that ISM is a meaningful analytical tool, as the budding index can provide valuable information about the growth activity of a yeast cell, e.g. in seed breeding or during any other cultivation process. The determination of the single-cell size and shape distributions provided information on the morphological heterogeneity among the populations. The ability to track changes in cell morphology directly on line enables new perspectives for monitoring and control, both in process development and on a production scale.
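
    Since the budding index is simply the fraction of detected cells carrying a bud, a tiny sketch of its computation and of the linear association with growth rate might look as follows; the per-time-point classifier counts and growth rates are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

def budding_index(n_budding, n_total):
    """Budding index = number of budding cells / total number of detected cells."""
    return n_budding / n_total

# Hypothetical per-time-point counts from the image classifier and growth rates (1/h).
budding = np.array([112, 160, 188, 95, 40])
total = np.array([400, 420, 430, 410, 405])
growth_rate = np.array([0.18, 0.27, 0.31, 0.14, 0.05])

bi = budding_index(budding, total)
r, p = pearsonr(bi, growth_rate)
print(bi.round(2), f"Pearson r = {r:.2f} (p = {p:.3f})")
```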

  8. Monitoring Urban Land Cover/land Use Change in Algiers City Using Landsat Images (1987-2016)

    NASA Astrophysics Data System (ADS)

    Bouchachi, B.; Zhong, Y.

    2017-09-01

    Monitoring urban land cover/land use change is important because urbanization is one of the main driving forces of environmental change, converting land and reducing cultivated areas; remote sensing offers the ability to address such land resources problems. The purpose of this research is to map urban areas at different times in order to monitor and predict possible urban changes, studying the annual growth of urban land in Algiers City over the last 29 years. To improve the effectiveness of long-term training data in land mapping, we developed an approach with the following steps: 1) pre-processing to improve image characteristics; 2) extraction of training sample candidates based on the developed methods; and 3) derivation and analysis of annual maps of Algiers City from 1987 to 2016 using a supervised Support Vector Machine (SVM) classifier. Our results show that urban development in the Algiers City region extended mostly to the east, west, and south of the central regions. The urban growth rate is consistent with National Office of Statistics data. Future studies are required to understand the impact of rapid urban land expansion on social, economic, and environmental sustainability; they will also help close the gap in available urbanism data, especially given the lack of reliable data and of environmental and urban planning for each municipality in Algiers, and support experimental models to predict future land changes with statistically significant confidence.
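
    A minimal sketch of the supervised SVM classification step (step 3 above) using scikit-learn, assuming labeled training pixels with band values have already been extracted; the band features, class labels, and SVM hyperparameters are illustrative, not those of the study.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical training pixels: rows = pixels, columns = Landsat band reflectances.
rng = np.random.default_rng(4)
X = rng.random((600, 6))
y = rng.integers(0, 3, size=600)   # 0 = urban, 1 = vegetation, 2 = bare soil (illustrative)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM with feature scaling, as commonly used for per-pixel classification.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```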

  9. Institutional racism in public health contracting: Findings of a nationwide survey from New Zealand.

    PubMed

    Came, H; Doole, C; McKenna, B; McCreanor, T

    2018-02-01

    Public institutions within New Zealand have long been accused of mono-culturalism and institutional racism. This study sought to identify inconsistencies and bias by comparing government funded contracting processes for Māori public health providers (n = 60) with those of generic providers (n = 90). Qualitative and quantitative data were collected (November 2014-May 2015), through a nationwide telephone survey of public health providers, achieving a 75% response rate. Descriptive statistical analyses were applied to quantitative responses and an inductive approach was taken to analyse data from open-ended responses in the survey domains of relationships with portfolio contract managers, contracting and funding. The quantitative data showed four sites of statistically significant variation: length of contracts, intensity of monitoring, compliance costs and frequency of auditing. Non-significant data involved access to discretionary funding and cost of living adjustments, the frequency of monitoring, access to Crown (government) funders and representation on advisory groups. The qualitative material showed disparate provider experiences, dependent on individual portfolio managers, with nuanced differences between generic and Māori providers' experiences. This study showed that monitoring government performance through a nationwide survey was an innovative way to identify sites of institutional racism. In a policy context where health equity is a key directive to the health sector, this study suggests there is scope for New Zealand health funders to improve their contracting practices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. On damage detection in wind turbine gearboxes using outlier analysis

    NASA Astrophysics Data System (ADS)

    Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith

    2012-04-01

    The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness and static transmission error, simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered and the Mahalanobis squared-distance is calculated for the novelty detection problem.
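
    A minimal sketch of Mahalanobis-distance novelty detection as described above: the mean and covariance are fitted on features from healthy-condition signals, a threshold is set from the training distances, and new observations above it are flagged. The feature set, the percentile-based threshold, and the simulated fault signature are illustrative choices; Monte Carlo thresholds are also used in practice.

```python
import numpy as np

def fit_novelty_detector(X_train):
    """Mean, inverse covariance and a threshold from healthy-condition feature vectors."""
    mu = X_train.mean(0)
    cov_inv = np.linalg.inv(np.cov(X_train, rowvar=False))
    d2 = np.array([(x - mu) @ cov_inv @ (x - mu) for x in X_train])
    threshold = np.percentile(d2, 99)     # one common convention for the alarm level
    return mu, cov_inv, threshold

def mahalanobis_sq(x, mu, cov_inv):
    """Mahalanobis squared-distance of one observation from the healthy mean."""
    diff = x - mu
    return diff @ cov_inv @ diff

# Hypothetical features (e.g. RMS, kurtosis, band energies) from healthy vibration records.
rng = np.random.default_rng(5)
X_healthy = rng.normal(size=(300, 4))
mu, cov_inv, thr = fit_novelty_detector(X_healthy)

x_new = rng.normal(size=4) + np.array([0.0, 0.0, 3.0, 0.0])   # simulated tooth-fault signature
print(mahalanobis_sq(x_new, mu, cov_inv) > thr)               # True => flagged as novel/damaged
```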

  11. A stochastic approach for model reduction and memory function design in hydrogeophysical inversion

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Kellogg, A.; Terry, N.

    2009-12-01

    Geophysical (e.g., seismic, electromagnetic, radar) techniques and statistical methods are essential for research related to subsurface characterization, including monitoring subsurface flow and transport processes, oil/gas reservoir identification, etc. For deep subsurface characterization such as reservoir petroleum exploration, seismic methods have been widely used. Recently, electromagnetic (EM) methods have drawn great attention in the area of reservoir characterization. However, considering the enormous computational demand corresponding to seismic and EM forward modeling, it is usually a big problem to have too many unknown parameters in the modeling domain. For shallow subsurface applications, the characterization can be very complicated considering the complexity and nonlinearity of flow and transport processes in the unsaturated zone. It is warranted to reduce the dimension of parameter space to a reasonable level. Another common concern is how to make the best use of time-lapse data with spatial-temporal correlations. This is even more critical when we try to monitor subsurface processes using geophysical data collected at different times. The normal practice is to get the inverse images individually. These images are not necessarily continuous or even reasonably related, because of the non-uniqueness of hydrogeophysical inversion. We propose to use a stochastic framework by integrating the minimum-relative-entropy concept, quasi Monte Carlo sampling techniques, and statistical tests. The approach allows efficient and sufficient exploration of all possibilities of model parameters and evaluation of their significance for geophysical responses. The analyses enable us to reduce the parameter space significantly. The approach can be combined with Bayesian updating, allowing us to treat the updated ‘posterior’ pdf as a memory function, which stores all the information up to date about the distributions of soil/field attributes/properties, then consider the memory function as a new prior and generate samples from it for further updating when more geophysical data is available. We applied this approach for deep oil reservoir characterization and for shallow subsurface flow monitoring. The model reduction approach reliably helps reduce the joint seismic/EM/radar inversion computational time to reasonable levels. Continuous inversion images are obtained using time-lapse data with the “memory function” applied in the Bayesian inversion.

  12. Integrated Process Monitoring based on Systems of Sensors for Enhanced Nuclear Safeguards Sensitivity and Robustness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humberto E. Garcia

    This paper illustrates safeguards benefits that process monitoring (PM) can have as a diversion deterrent and as a complementary safeguards measure to nuclear material accountancy (NMA). Whereas the objective of NMA-based methods for inferring the possible existence of proliferation-driven activities is often to statistically evaluate materials unaccounted for (MUF), computed by solving a given mass balance equation for a material balance area (MBA) at every material balance period (MBP), a particular objective of a PM-based approach may be to statistically infer and evaluate anomalies unaccounted for (AUF) that may have occurred within a MBP. Although possibly indicative of proliferation-driven activities, the detection and tracking of anomaly patterns is not trivial because some executed events may be unobservable or observed less reliably than others. The proposed similarity between NMA- and PM-based approaches is important as performance metrics utilized for evaluating NMA-based methods, such as detection probability (DP) and false alarm probability (FAP), can also be applied for assessing PM-based safeguards solutions. To this end, AUF count estimates can be translated into significant quantity (SQ) equivalents that may have been diverted within a given MBP. A diversion alarm is reported if this mass estimate is greater than or equal to the selected value for alarm level (AL), appropriately chosen to optimize DP and FAP based on the particular characteristics of the monitored MBA, the sensors utilized, and the data processing method employed for integrating and analyzing collected measurements. To illustrate the application of the proposed PM approach, a protracted diversion of Pu in a waste stream was selected based on incomplete fuel dissolution in a dissolver unit operation, as this diversion scenario is considered to be problematic for detection using NMA-based methods alone. Results demonstrate benefits of conducting PM under a system-centric strategy that utilizes data collected from a system of sensors and that effectively exploits known characterizations of sensors and facility operations in order to significantly improve anomaly detection, reduce false alarms, and enhance assessment robustness under unreliable partial sensor information.
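
    For concreteness, here is a minimal sketch of the NMA-side bookkeeping that the paragraph contrasts PM against: MUF from a simple mass balance over one MBP, followed by an alarm decision against a chosen alarm level. All quantities, including the alarm level, are placeholders; in practice the alarm level would be tuned to trade off detection probability against false alarm probability.

```python
def muf(beginning_inventory, receipts, shipments, ending_inventory):
    """Materials unaccounted for over one material balance period (kg)."""
    return beginning_inventory + receipts - shipments - ending_inventory

def diversion_alarm(muf_kg, alarm_level_kg):
    """Report an alarm if the MUF estimate meets or exceeds the alarm level."""
    return muf_kg >= alarm_level_kg

# Hypothetical balance for one MBP; the alarm level is an illustrative placeholder
# that would in practice be chosen to balance DP against FAP for the monitored MBA.
m = muf(beginning_inventory=120.4, receipts=35.0, shipments=30.2, ending_inventory=124.5)
print(m, diversion_alarm(m, alarm_level_kg=0.3))
```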

  13. Measuring acoustic habitats

    PubMed Central

    Merchant, Nathan D; Fristrup, Kurt M; Johnson, Mark P; Tyack, Peter L; Witt, Matthew J; Blondel, Philippe; Parks, Susan E

    2015-01-01

    1. Many organisms depend on sound for communication, predator/prey detection and navigation. The acoustic environment can therefore play an important role in ecosystem dynamics and evolution. A growing number of studies are documenting acoustic habitats and their influences on animal development, behaviour, physiology and spatial ecology, which has led to increasing demand for passive acoustic monitoring (PAM) expertise in the life sciences. However, as yet, there has been no synthesis of data processing methods for acoustic habitat monitoring, which presents an unnecessary obstacle to would-be PAM analysts. 2. Here, we review the signal processing techniques needed to produce calibrated measurements of terrestrial and aquatic acoustic habitats. We include a supplemental tutorial and template computer codes in matlab and r, which give detailed guidance on how to produce calibrated spectrograms and statistical analyses of sound levels. Key metrics and terminology for the characterisation of biotic, abiotic and anthropogenic sound are covered, and their application to relevant monitoring scenarios is illustrated through example data sets. To inform study design and hardware selection, we also include an up-to-date overview of terrestrial and aquatic PAM instruments. 3. Monitoring of acoustic habitats at large spatiotemporal scales is becoming possible through recent advances in PAM technology. This will enhance our understanding of the role of sound in the spatial ecology of acoustically sensitive species and inform spatial planning to mitigate the rising influence of anthropogenic noise in these ecosystems. As we demonstrate in this work, progress in these areas will depend upon the application of consistent and appropriate PAM methodologies. PMID:25954500

  14. Measuring acoustic habitats.

    PubMed

    Merchant, Nathan D; Fristrup, Kurt M; Johnson, Mark P; Tyack, Peter L; Witt, Matthew J; Blondel, Philippe; Parks, Susan E

    2015-03-01

    1. Many organisms depend on sound for communication, predator/prey detection and navigation. The acoustic environment can therefore play an important role in ecosystem dynamics and evolution. A growing number of studies are documenting acoustic habitats and their influences on animal development, behaviour, physiology and spatial ecology, which has led to increasing demand for passive acoustic monitoring (PAM) expertise in the life sciences. However, as yet, there has been no synthesis of data processing methods for acoustic habitat monitoring, which presents an unnecessary obstacle to would-be PAM analysts. 2. Here, we review the signal processing techniques needed to produce calibrated measurements of terrestrial and aquatic acoustic habitats. We include a supplemental tutorial and template computer codes in matlab and r, which give detailed guidance on how to produce calibrated spectrograms and statistical analyses of sound levels. Key metrics and terminology for the characterisation of biotic, abiotic and anthropogenic sound are covered, and their application to relevant monitoring scenarios is illustrated through example data sets. To inform study design and hardware selection, we also include an up-to-date overview of terrestrial and aquatic PAM instruments. 3. Monitoring of acoustic habitats at large spatiotemporal scales is becoming possible through recent advances in PAM technology. This will enhance our understanding of the role of sound in the spatial ecology of acoustically sensitive species and inform spatial planning to mitigate the rising influence of anthropogenic noise in these ecosystems. As we demonstrate in this work, progress in these areas will depend upon the application of consistent and appropriate PAM methodologies.
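
    A minimal sketch of producing a calibrated spectrogram in Python (the paper's supplemental codes are in MATLAB and R); the sensitivity and gain values, and the convention of simply subtracting them from the recorded levels, are illustrative placeholders rather than a complete calibration chain.

```python
import numpy as np
from scipy.signal import spectrogram

def calibrated_spectrogram(wav, fs, sensitivity_db, gain_db, nperseg=1024):
    """Power spectral density spectrogram converted to calibrated levels.

    sensitivity_db: transducer sensitivity in dB re 1 V per reference unit.
    gain_db: total recorder gain in dB. Both values are illustrative placeholders.
    """
    f, t, Sxx = spectrogram(wav, fs=fs, nperseg=nperseg, noverlap=nperseg // 2,
                            window="hann", scaling="density")
    levels_db = 10 * np.log10(Sxx + np.finfo(float).tiny)   # recorded PSD in dB
    # Convert recorded voltage levels back to received levels in the reference unit.
    return f, t, levels_db - sensitivity_db - gain_db

# Hypothetical 10 s recording sampled at 48 kHz.
fs = 48_000
wav = np.random.default_rng(6).normal(size=10 * fs)
f, t, L = calibrated_spectrogram(wav, fs, sensitivity_db=-170.0, gain_db=20.0)
print(L.shape)
```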

  15. The application of data mining and cloud computing techniques in data-driven models for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Khazaeli, S.; Ravandi, A. G.; Banerji, S.; Bagchi, A.

    2016-04-01

    Recently, data-driven models for Structural Health Monitoring (SHM) have been of great interest among many researchers. In data-driven models, the sensed data are processed to determine the structural performance and evaluate the damages of an instrumented structure without necessitating the mathematical modeling of the structure. A framework of data-driven models for online assessment of the condition of a structure has been developed here. The developed framework is intended for automated evaluation of the monitoring data and structural performance by the Internet technology and resources. The main challenges in developing such framework include: (a) utilizing the sensor measurements to estimate and localize the induced damage in a structure by means of signal processing and data mining techniques, and (b) optimizing the computing and storage resources with the aid of cloud services. The main focus in this paper is to demonstrate the efficiency of the proposed framework for real-time damage detection of a multi-story shear-building structure in two damage scenarios (change in mass and stiffness) in various locations. Several features are extracted from the sensed data by signal processing techniques and statistical methods. Machine learning algorithms are deployed to select damage-sensitive features as well as classifying the data to trace the anomaly in the response of the structure. Here, the cloud computing resources from Amazon Web Services (AWS) have been used to implement the proposed framework.

  16. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, due to the fact that several sources of variability can affect a running batch and impact on the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to be completed. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of batches already completed, and they are used to enable the real time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction of final product rejections, as well as to a reduction of the product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one side that reducing the variability during this period is crucial, and on the other side that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas the role of end-point product testing can progressively reduce its importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
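    The general multivariate monitoring idea can be sketched as follows, assuming a PCA latent-variable model with Hotelling T2 and residual (SPE) statistics and empirical limits; the data, component count and limits are placeholders rather than the authors' industrial model.

```python
# Hedged sketch of multivariate batch monitoring: fit PCA on historical "good" data, then
# flag observations whose Hotelling T^2 or residual (SPE) statistic exceeds training limits.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
historical = rng.normal(size=(300, 8))          # synthetic stand-in for unfolded batch data

scaler = StandardScaler().fit(historical)
pca = PCA(n_components=3).fit(scaler.transform(historical))

def t2_and_spe(obs):
    """Hotelling T^2 and squared prediction error for one observation vector."""
    z = scaler.transform(obs.reshape(1, -1))
    scores = pca.transform(z)
    t2 = float(np.sum(scores ** 2 / pca.explained_variance_))
    spe = float(np.sum((z - pca.inverse_transform(scores)) ** 2))
    return t2, spe

# Empirical 99th-percentile limits from the historical batches (a common pragmatic choice)
t2_hist, spe_hist = zip(*(t2_and_spe(row) for row in historical))
t2_lim, spe_lim = np.percentile(t2_hist, 99), np.percentile(spe_hist, 99)

new_obs = rng.normal(size=8) + np.array([0, 0, 3, 0, 0, 0, 0, 0])   # simulated upset
t2, spe = t2_and_spe(new_obs)
print(f"T2={t2:.1f} (limit {t2_lim:.1f}), SPE={spe:.2f} (limit {spe_lim:.2f})")
```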

  17. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Létourneau, Daniel, E-mail: daniel.letourneau@rmp.uh.on.ca; McNiven, Andrea; Keller, Harald

    2014-12-15

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess the test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf, with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves, and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units, and recalibration had to be repeated up to three times on one of these units. For both units, the rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess the adequacy of the MLC test frequency. Conclusions: An MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performance for two commercially available MLC models has been assessed, and the results support a monthly test frequency for the widely accepted ±1 mm specifications. A higher QC test frequency is, however, required to maintain tighter specifications and in-control behavior.
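    The per-leaf individuals/moving-range logic with added specification checks can be sketched as below; the leaf-offset values, and the pairing of empirical 3-sigma limits with the ±0.5 and ±1 mm specifications, are illustrative assumptions.

```python
# Minimal sketch of an individuals / moving-range (I/MR) check applied to one leaf's
# measured position offsets (synthetic values, mm). Spec limits follow the abstract.
import numpy as np

offsets = np.array([0.05, -0.02, 0.10, 0.04, -0.08, 0.12, 0.03, -0.01, 0.65, 0.07])  # mm

mr = np.abs(np.diff(offsets))            # moving ranges of consecutive measurements
sigma_hat = mr.mean() / 1.128            # d2 = 1.128 for a moving range of span 2

center = offsets.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat   # classical 3-sigma I-chart limits

for spec in (0.5, 1.0):                  # specification limits quoted in the abstract
    print(f"spec +/-{spec} mm: {(np.abs(offsets) > spec).sum()} out-of-spec points")

out_of_control = (offsets > ucl) | (offsets < lcl)
print(f"I-chart limits [{lcl:.2f}, {ucl:.2f}] mm, out-of-control points: {out_of_control.sum()}")
```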

  18. Coliphages as indicators of enteroviruses.

    PubMed Central

    Stetler, R E

    1984-01-01

    Coliphages were monitored in conjunction with indicator bacteria and enteroviruses in a drinking-water plant modified to reduce trihalomethane production. Coliphages could be detected in the source water by direct inoculation, and sufficient coliphages were detected in enterovirus concentrates to permit following the coliphage levels through different water treatment processes. The recovery efficiency by different filter types ranged from 1 to 53%. Statistical analysis of the data indicated that enterovirus isolates were better correlated with coliphages than with total coliforms, fecal coliforms, fecal streptococci, or standard plate count organisms. Coliphages were not detected in finished water. PMID:6093694

  19. Position estimation of transceivers in communication networks

    DOEpatents

    Kent, Claudia A [Pleasanton, CA; Dowla, Farid [Castro Valley, CA

    2008-06-03

    This invention provides a system and method using wireless communication interfaces coupled with statistical processing of time-of-flight data to locate by position estimation unknown wireless receivers. Such an invention can be applied in sensor network applications, such as environmental monitoring of water in the soil or chemicals in the air where the position of the network nodes is deemed critical. Moreover, the present invention can be arranged to operate in areas where a Global Positioning System (GPS) is not available, such as inside buildings, caves, and tunnels.

  20. CAVIAR: an R package for checking, displaying and processing wood-formation-monitoring data.

    PubMed

    Rathgeber, Cyrille B K; Santenoise, Philippe; Cuny, Henri E

    2018-05-19

    In the last decade, the pervasive question of climate change impacts on forests has revived investigations on intra-annual dynamics of wood formation, involving disciplines such as plant ecology, tree physiology and dendrochronology. This resulted in the creation of many research groups working on this topic worldwide and a rapid increase in the number of studies and publications. Wood-formation-monitoring studies are generally based on a common conceptual model describing xylem cell formation as the succession of four differentiation phases (cell division, cell enlargement, cell wall thickening and mature cells). They generally use the same sampling techniques, sample preparation methods and anatomical criteria to distinguish differentiation zones and to discriminate and count forming xylem cells, resulting in very similar raw data. However, the way these raw data are then processed, producing the elaborated data on which statistical analyses are performed, still remains quite specific to each individual study. Thus, despite very similar raw data, wood-formation-monitoring studies yield results that are still quite difficult to compare. CAVIAR-an R package specifically dedicated to the verification, visualization and manipulation of wood-formation-monitoring data-can help to improve this situation. Initially, CAVIAR was built to provide efficient algorithms to compute critical dates of wood formation phenology for conifers growing in temperate and cold environments. Recently, we developed it further to check, display and process wood-formation-monitoring data. Thanks to new and upgraded functions, raw data can now be consistently verified, standardized and modelled (using logistic regressions and Gompertz functions), in order to describe wood phenology and the intra-annual dynamics of tree-ring formation. We believe that CAVIAR will help strengthen the science of wood formation dynamics by effectively contributing to the standardization of its concepts and methods, thereby making possible the comparison of data and results from different studies.

  1. The application of statistically designed experiments to resistance spot welding

    NASA Technical Reports Server (NTRS)

    Hafley, Robert A.; Hales, Stephen J.

    1991-01-01

    State-of-the-art Resistance Spot Welding (RSW) equipment has the potential to permit realtime monitoring of operations through advances in computerized process control. In order to realize adaptive feedback capabilities, it is necessary to establish correlations among process variables, welder outputs, and weldment properties. The initial step toward achieving this goal must involve assessment of the effect of specific process inputs and the interactions among these variables on spot weld characteristics. This investigation evaluated these effects through the application of a statistically designed experiment to the RSW process. A half-factorial, Taguchi L sub 16 design was used to understand and refine a RSW schedule developed for welding dissimilar aluminum-lithium alloys of different thickness. The baseline schedule had been established previously by traditional trial and error methods based on engineering judgment and one-factor-at-a-time studies. A hierarchy of inputs with respect to each other was established, and the significance of these inputs with respect to experimental noise was determined. Useful insight was gained into the effect of interactions among process variables, particularly with respect to weldment defects. The effects of equipment related changes associated with disassembly and recalibration were also identified. In spite of an apparent decrease in equipment performance, a significant improvement in the maximum strength for defect-free welds compared to the baseline schedule was achieved.

  2. Assessing a learning process with functional ANOVA estimators of EEG power spectral densities.

    PubMed

    Gutiérrez, David; Ramírez-Moreno, Mauricio A

    2016-04-01

    We propose to assess the process of learning a task using electroencephalographic (EEG) measurements. In particular, we quantify changes in brain activity associated to the progression of the learning experience through the functional analysis-of-variances (FANOVA) estimators of the EEG power spectral density (PSD). Such functional estimators provide a sense of the effect of training in the EEG dynamics. For that purpose, we implemented an experiment to monitor the process of learning to type using the Colemak keyboard layout during a twelve-lessons training. Hence, our aim is to identify statistically significant changes in PSD of various EEG rhythms at different stages and difficulty levels of the learning process. Those changes are taken into account only when a probabilistic measure of the cognitive state ensures the high engagement of the volunteer to the training. Based on this, a series of statistical tests are performed in order to determine the personalized frequencies and sensors at which changes in PSD occur, then the FANOVA estimates are computed and analyzed. Our experimental results showed a significant decrease in the power of [Formula: see text] and [Formula: see text] rhythms for ten volunteers during the learning process, and such decrease happens regardless of the difficulty of the lesson. These results are in agreement with previous reports of changes in PSD being associated to feature binding and memory encoding.

  3. Indicators for evaluating European population health: a Delphi selection process.

    PubMed

    Freitas, Ângela; Santana, Paula; Oliveira, Mónica D; Almendra, Ricardo; Bana E Costa, João C; Bana E Costa, Carlos A

    2018-04-27

    Indicators are essential instruments for monitoring and evaluating population health. The selection of a multidimensional set of indicators should not only reflect the scientific evidence on health outcomes and health determinants, but also the views of health experts and stakeholders. The aim of this study is to describe the Delphi selection process designed to promote agreement on indicators considered relevant to evaluate population health at the European regional level. Indicators were selected in a Delphi survey conducted using a web-platform designed to implement and monitor participatory processes. It involved a panel of 51 experts and 30 stakeholders from different areas of knowledge and geographies. In three consecutive rounds the panel indicated their level of agreement or disagreement with indicator's relevance for evaluating population health in Europe. Inferential statistics were applied to draw conclusions on observed level of agreement (Scott's Pi interrater reliability coefficient) and opinion change (McNemar Chi-square test). Multivariate analysis of variance was conducted to check if the field of expertise influenced the panellist responses (Wilk's Lambda test). The panel participated extensively in the study (overall response rate: 80%). Eighty indicators reached group agreement for selection in the areas of: economic and social environment (12); demographic change (5); lifestyle and health behaviours (8); physical environment (6); built environment (12); healthcare services (11) and health outcomes (26). Higher convergence of group opinion towards agreement on the relevance of indicators was seen for lifestyle and health behaviours, healthcare services, and health outcomes. The panellists' field of expertise influenced responses: statistically significant differences were found for economic and social environment (p < 0.05 in round 1 and 2), physical environment (p < 0.01 in round 1) and health outcomes (p < 0.01 in round 3). The high levels of participation observed in this study, by involving experts and stakeholders and ascertaining their views, underpinned the added value of using a transparent Web-Delphi process to promote agreement on what indicators are relevant to appraise population health.

  4. Big-Data RHEED analysis for understanding epitaxial film growth processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P

    Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and the wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of RHEED image sequences. This approach is illustrated for growth of LaxCa1-xMnO3 films grown on etched (001) SrTiO3 substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors and the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to using forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
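    A toy sketch of the PCA plus k-means step on an image sequence (synthetic frames standing in for RHEED data); frame size, component count and cluster count are arbitrary choices for illustration.

```python
# Illustrative sketch (synthetic data): flatten each frame to a vector, reduce with PCA,
# and cluster the frames with k-means to separate two growth regimes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
n_frames, h, w = 400, 32, 32
frames = rng.normal(size=(n_frames, h, w))
frames[200:] += 0.5 * np.sin(np.linspace(0, 20, w))     # regime change halfway through

X = frames.reshape(n_frames, -1)                        # one row per frame
scores = PCA(n_components=5).fit_transform(X)           # low-dimensional trajectory
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

change_frame = int(np.argmax(labels != labels[0]))      # first frame assigned to a new cluster
print(f"regime change detected near frame {change_frame}")
```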

  5. Assessing NIR & MIR Spectral Analysis as a Method for Soil C Estimation Across a Network of Sampling Sites

    NASA Astrophysics Data System (ADS)

    Spencer, S.; Ogle, S.; Borch, T.; Rock, B.

    2008-12-01

    Monitoring soil C stocks is critical to assess the impact of future climate and land use change on carbon sinks and sources in agricultural lands. A benchmark network for soil carbon monitoring of stock changes is being designed for US agricultural lands, with 3000-5000 sites anticipated and re-sampling on a 5- to 10-year basis. Approximately 1000 sites would be sampled per year, producing around 15,000 soil samples to be processed for total, organic, and inorganic carbon, as well as bulk density and nitrogen. Laboratory processing of soil samples is cost and time intensive, therefore we are testing the efficacy of using near-infrared (NIR) and mid-infrared (MIR) spectral methods for estimating soil carbon. As part of an initial implementation of national soil carbon monitoring, we collected over 1800 soil samples from 45 cropland sites in the mid-continental region of the U.S. Samples were processed using standard laboratory methods to determine the variables above. Carbon and nitrogen were determined by dry combustion and inorganic carbon was estimated with an acid-pressure test. 600 samples are being scanned using a bench-top NIR reflectance spectrometer (30 g of 2 mm oven-dried soil and 30 g of 8 mm air-dried soil) and 500 samples using a MIR Fourier-Transform Infrared Spectrometer (FTIR) with a DRIFT reflectance accessory (0.2 g oven-dried ground soil). Lab-measured carbon will be compared to spectrally-estimated carbon contents using a Partial Least Squares (PLS) multivariate statistical approach. PLS attempts to develop a soil C predictive model that can then be used to estimate C in soil samples that are not lab-processed. The spectral analysis of whole or partially processed soil samples can potentially save both funding resources and sample-processing time. This is particularly relevant for the implementation of a national monitoring network for soil carbon. This poster will discuss our methods, initial results and the potential for using NIR and MIR spectral approaches to either replace or augment traditional lab-based carbon analyses of soils.
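    A hedged sketch of the planned PLS calibration step, using synthetic spectra and carbon values; band count, component count and the cross-validation setup are assumptions, not the study's protocol.

```python
# Sketch of a PLS calibration relating spectra to lab-measured carbon (synthetic data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
n_samples, n_bands = 200, 300
carbon = rng.uniform(0.5, 4.0, n_samples)                          # % soil C (synthetic)
bands = np.linspace(0, 1, n_bands)
spectra = (np.outer(carbon, np.exp(-((bands - 0.4) ** 2) / 0.01))   # C-related absorption feature
           + rng.normal(0, 0.05, (n_samples, n_bands)))             # instrument noise

pls = PLSRegression(n_components=8)
predicted = cross_val_predict(pls, spectra, carbon, cv=10).ravel()

rmse = float(np.sqrt(np.mean((predicted - carbon) ** 2)))
r2 = float(np.corrcoef(predicted, carbon)[0, 1] ** 2)
print(f"cross-validated RMSE = {rmse:.2f} %C, R^2 = {r2:.2f}")
```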

  6. 25 years of HBM in the Czech Republic.

    PubMed

    Černá, Milena; Puklová, Vladimíra; Hanzlíková, Lenka; Sochorová, Lenka; Kubínová, Růžena

    2017-03-01

    Since 1991, a human biomonitoring network has been established in the Czech Republic as part of the Environmental Health Monitoring System, which was set out by a Government Resolution. During the last quarter-century, important data were obtained to characterize the exposure of both children and adult populations to significant toxic substances from the environment, to follow trends over time, and to establish reference values and compare them with existing health-related values. Moreover, the saturation of the population with several essential substances such as selenium, zinc, copper or iodine has also been monitored. Development of analytical and statistical methods led to increased capacity building, improved QA/QC in analytical laboratories, and better interpretation of results. The obtained results are translated into policy actions and are used in health risk assessment processes at local and national levels. Copyright © 2016 Elsevier GmbH. All rights reserved.

  7. Sensing system development for HOV/HOT (high occupancy vehicle) lane monitoring.

    DOT National Transportation Integrated Search

    2011-02-01

    With continued interest in the efficient use of roadways the ability to monitor the use of HOV/HOT lanes is essential for management, planning and operation. A system to reliably monitor these lanes on a continuous basis and provide usage statistics ...

  8. 40 CFR 58.1 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... humidity, solar radiation, ultraviolet radiation, and/or precipitation. Metropolitan Statistical Area (MSA... receiver at opposite ends of the monitoring path; (2) Equal to twice the monitoring path length for a (monostatic) system having a transmitter and receiver at one end of the monitoring path and a mirror or...

  9. 40 CFR 58.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... humidity, solar radiation, ultraviolet radiation, and/or precipitation. Metropolitan Statistical Area (MSA... receiver at opposite ends of the monitoring path; (2) Equal to twice the monitoring path length for a (monostatic) system having a transmitter and receiver at one end of the monitoring path and a mirror or...

  10. 40 CFR 58.1 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... humidity, solar radiation, ultraviolet radiation, and/or precipitation. Metropolitan Statistical Area (MSA... receiver at opposite ends of the monitoring path; (2) Equal to twice the monitoring path length for a (monostatic) system having a transmitter and receiver at one end of the monitoring path and a mirror or...

  11. Atmospheric Visibility Monitoring for planetary optical communications

    NASA Technical Reports Server (NTRS)

    Cowles, Kelly

    1991-01-01

    The Atmospheric Visibility Monitoring project endeavors to improve current atmospheric models and generate visibility statistics relevant to prospective earth-satellite optical communications systems. Three autonomous observatories are being used to measure atmospheric conditions on the basis of observed starlight; these data will yield clear-sky and transmission statistics for three sites with high clear-sky probabilities. Ground-based data will be compared with satellite imagery to determine the correlation between satellite data and ground-based observations.

  12. Comparison of probability statistics for automated ship detection in SAR imagery

    NASA Astrophysics Data System (ADS)

    Henschel, Michael D.; Rey, Maria T.; Campbell, J. W. M.; Petrovic, D.

    1998-12-01

    This paper discusses the initial results of a recent operational trial of the Ocean Monitoring Workstation's (OMW) ship detection algorithm, which is essentially a Constant False Alarm Rate filter applied to Synthetic Aperture Radar data. The choice of probability distribution and the methodologies for calculating scene-specific statistics are discussed in some detail. An empirical basis for the choice of probability distribution used is discussed. We compare the results using a 1-look k-distribution function with various parameter choices and methods of estimation. As a special case of sea clutter statistics, the application of a χ²-distribution is also discussed. Comparisons are made with reference to RADARSAT data collected during the Maritime Command Operation Training exercise conducted in Atlantic Canadian waters in June 1998. Reference is also made to previously collected statistics. The OMW is a commercial software suite that provides modules for automated vessel detection, oil spill monitoring, and environmental monitoring. This work has been undertaken to fine-tune the OMW algorithms, with special emphasis on the false alarm rate of each algorithm.
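    For orientation, a basic one-dimensional cell-averaging CFAR detector is sketched below on synthetic exponential clutter; the OMW algorithm and its K-distributed clutter statistics are more elaborate, and the window sizes and false-alarm rate here are arbitrary.

```python
# Hedged sketch of a cell-averaging CFAR detector on synthetic speckle-like clutter.
import numpy as np

rng = np.random.default_rng(4)
intensity = rng.exponential(1.0, size=2000)      # sea clutter stand-in (synthetic)
intensity[700] = 40.0                            # bright point target ("ship")

guard, train = 4, 32                             # guard and training cells per side
pfa = 1e-6                                       # design false-alarm probability
n = 2 * train
alpha = n * (pfa ** (-1.0 / n) - 1.0)            # CA-CFAR scaling for exponential clutter

detections = []
for i in range(guard + train, len(intensity) - guard - train):
    left = intensity[i - guard - train:i - guard]
    right = intensity[i + guard + 1:i + guard + 1 + train]
    threshold = alpha * np.concatenate([left, right]).mean()
    if intensity[i] > threshold:
        detections.append(i)

print("detections at indices:", detections)
```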

  13. TU-FG-201-09: Predicting Accelerator Dysfunction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, C; Nguyen, C; Baydush, A

    Purpose: To develop an integrated statistical process control (SPC) framework using digital performance and component data accumulated within the accelerator system that can detect dysfunction prior to unscheduled downtime. Methods: Seven digital accelerators were monitored for 12 to 18 months. The accelerators were operated in a ‘run to failure’ mode, with the individual institutions determining when service would be initiated. Institutions were required to submit detailed service reports. Trajectory and text log files resulting from a robust daily VMAT QA delivery were decoded and evaluated using Individual and Moving Range (I/MR) control charts. The SPC evaluation was presented in a customized dashboard interface that allows the user to review 525 monitored parameters (480 MLC parameters). Chart limits were calculated using a hybrid technique that includes the standard SPC 3σ limits and an empirical factor based on the parameter/system specification. The individual (I) grand mean values and control limit ranges of the I/MR charts of all accelerators were compared using statistical (ranked analysis of variance (ANOVA)) and graphical analyses to determine consistency of operating parameters. Results: When an alarm or warning was directly connected to field service, process control charts predicted dysfunction consistently on beam generation related parameters (BGP) – RF Driver Voltage, Gun Grid Voltage, and Forward Power (W); beam uniformity parameters – angle and position steering coil currents; and the Gantry position accuracy parameter: cross correlation max-value. Control charts for individual MLC – cross correlation max-value/position – detected 50% to 60% of MLCs serviced prior to dysfunction or failure. In general, non-random changes were detected 5 to 80 days prior to a service intervention. The ANOVA comparison of BGP determined that each accelerator parameter operated at a distinct value. Conclusion: The SPC framework shows promise. Long-term monitoring coordinated with service will be required to definitively determine the effectiveness of the model. Varian Medical System, Inc. provided funding in support of the research presented.

  14. Quantifying temporal glucose variability in diabetes via continuous glucose monitoring: mathematical methods and clinical application.

    PubMed

    Kovatchev, Boris P; Clarke, William L; Breton, Marc; Brayman, Kenneth; McCall, Anthony

    2005-12-01

    Continuous glucose monitors (CGMs) collect detailed blood glucose (BG) time series, which carry significant information about the dynamics of BG fluctuations. In contrast, the methods for analysis of CGM data remain those developed for infrequent BG self-monitoring. As a result, important information about the temporal structure of the data is lost during the translation of raw sensor readings into clinically interpretable statistics and images. The following mathematical methods are introduced into the field of CGM data interpretation: (1) analysis of BG rate of change; (2) risk analysis using previously reported Low/High BG Indices and Poincare (lag) plot of risk associated with temporal BG variability; and (3) spatial aggregation of the process of BG fluctuations and its Markov chain visualization. The clinical application of these methods is illustrated by analysis of data of a patient with Type 1 diabetes mellitus who underwent islet transplantation and with data from clinical trials. Normative data [12,025 reference (YSI device, Yellow Springs Instruments, Yellow Springs, OH) BG determinations] in patients with Type 1 diabetes mellitus who underwent insulin and glucose challenges suggest that the 90%, 95%, and 99% confidence intervals of BG rate of change that could be maximally sustained over 15-30 min are [-2,2], [-3,3], and [-4,4] mg/dL/min, respectively. BG dynamics and risk parameters clearly differentiated the stages of transplantation and the effects of medication. Aspects of treatment were clearly visualized by graphs of BG rate of change and Low/High BG Indices, by a Poincare plot of risk for rapid BG fluctuations, and by a plot of the aggregated Markov process. Advanced analysis and visualization of CGM data allow for evaluation of dynamical characteristics of diabetes and reveal clinical information that is inaccessible via standard statistics, which do not take into account the temporal structure of the data. The use of such methods improves the assessment of patients' glycemic control.
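    Two of the listed analyses, BG rate of change and the Low/High BG Indices, can be sketched as follows on a synthetic CGM trace; the risk-function constants are the commonly cited values from the BG risk literature and should be checked against the original references before use.

```python
# Sketch on synthetic CGM data: (1) BG rate of change and (2) Low/High BG Indices.
# Risk-function constants (1.509, 1.084, 5.381) are the commonly cited literature values.
import numpy as np

rng = np.random.default_rng(5)
minutes = np.arange(0, 24 * 60, 5)                                  # 5-min CGM sampling
bg = 120 + 40 * np.sin(2 * np.pi * minutes / (6 * 60)) + rng.normal(0, 8, minutes.size)
bg = np.clip(bg, 40, 400)                                           # mg/dL

# (1) BG rate of change, mg/dL/min, over consecutive samples
rate = np.diff(bg) / np.diff(minutes)
print("95% of rates within +/-", round(float(np.percentile(np.abs(rate), 95)), 2), "mg/dL/min")

# (2) Low/High BG Indices from the symmetrised risk function
f = 1.509 * (np.log(bg) ** 1.084 - 5.381)
risk = 10.0 * f ** 2
lbgi = float(np.mean(np.where(f < 0, risk, 0.0)))
hbgi = float(np.mean(np.where(f > 0, risk, 0.0)))
print(f"LBGI = {lbgi:.2f}, HBGI = {hbgi:.2f}")
```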

  15. Examining the Relationship Between Nursing Informatics Competency and the Quality of Information Processing.

    PubMed

    Al-Hawamdih, Sajidah; Ahmad, Muayyad M

    2018-03-01

    The purpose of this study was to examine nursing informatics competency and the quality of information processing among nurses in Jordan. The study was conducted in a large hospital with 380 registered nurses. The hospital introduced the electronic health record in 2010. The measures used in this study were personal and job characteristics, self-efficacy, Self-Assessment Nursing Informatics Competencies, and Health Information System Monitoring Questionnaire. The convenience sample consisted of 99 nurses who used the electronic health record for at least 3 months. The analysis showed that nine predictors explained 22% of the variance in the quality of information processing, whereas the statistically significant predictors were nursing informatics competency, clinical specialty, and years of nursing experience. There is a need for policies that advocate for every nurse to be educated in nursing informatics and the quality of information processing.

  16. Monitoring Traffic Information with a Developed Acceleration Sensing Node.

    PubMed

    Ye, Zhoujing; Wang, Linbing; Xu, Wen; Gao, Zhifei; Yan, Guannan

    2017-12-05

    In this paper, an acceleration sensing node for pavement vibration was developed to monitor traffic information, including vehicle speed, vehicle types, and traffic flow, where a hardware design with low energy consumption and node encapsulation could be accomplished. The service performance of the sensing node was evaluated, by methods including waterproof test, compression test, sensing performance analysis, and comparison test. The results demonstrate that the sensing node is low in energy consumption, high in strength, IPX8 waterproof, and high in sensitivity and resolution. These characteristics can be applied to practical road environments. Two sensing nodes were spaced apart in the direction of travelling. In the experiment, three types of vehicles passed by the monitoring points at several different speeds and values of d (the distance between the sensor and the nearest tire center line). Based on cross-correlation with kernel pre-smoothing, a calculation method was applied to process the raw data. New algorithms for traffic flow, speed, and axle length were proposed. Finally, the effects of vehicle speed, vehicle weight, and d value on acceleration amplitude were statistically evaluated. It was found that the acceleration sensing node can be used for traffic flow, vehicle speed, and other types of monitoring.
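    The cross-correlation speed estimate can be sketched as below on synthetic signals; sensor spacing, sampling rate and smoothing width are assumed values, not the deployed node's configuration.

```python
# Simple sketch: smooth the two acceleration records, cross-correlate them, and convert
# the lag at the correlation peak into a speed using the known sensor spacing.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import correlate

fs = 1000.0              # Hz, sampling rate (assumed)
spacing = 2.0            # m between the two sensing nodes (assumed)
true_speed = 15.0        # m/s, used only to build the synthetic example
delay = int(round(spacing / true_speed * fs))

rng = np.random.default_rng(6)
t = np.arange(0, 2.0, 1 / fs)
pulse = np.exp(-((t - 0.5) / 0.01) ** 2)                  # axle-induced vibration burst
s1 = pulse + 0.2 * rng.normal(size=t.size)
s2 = np.roll(pulse, delay) + 0.2 * rng.normal(size=t.size)

# Kernel pre-smoothing before cross-correlation, as the abstract describes
s1s, s2s = gaussian_filter1d(s1, 10), gaussian_filter1d(s2, 10)
xcorr = correlate(s2s - s2s.mean(), s1s - s1s.mean(), mode="full")
lag = (np.argmax(xcorr) - (t.size - 1)) / fs              # seconds

print(f"estimated speed: {spacing / lag:.1f} m/s (true {true_speed} m/s)")
```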

  17. Monitoring Traffic Information with a Developed Acceleration Sensing Node

    PubMed Central

    Ye, Zhoujing; Wang, Linbing; Xu, Wen; Gao, Zhifei; Yan, Guannan

    2017-01-01

    In this paper, an acceleration sensing node for pavement vibration was developed to monitor traffic information, including vehicle speed, vehicle types, and traffic flow, where a hardware design with low energy consumption and node encapsulation could be accomplished. The service performance of the sensing node was evaluated, by methods including waterproof test, compression test, sensing performance analysis, and comparison test. The results demonstrate that the sensing node is low in energy consumption, high in strength, IPX8 waterproof, and high in sensitivity and resolution. These characteristics can be applied to practical road environments. Two sensing nodes were spaced apart in the direction of travelling. In the experiment, three types of vehicles passed by the monitoring points at several different speeds and values of d (the distance between the sensor and the nearest tire center line). Based on cross-correlation with kernel pre-smoothing, a calculation method was applied to process the raw data. New algorithms for traffic flow, speed, and axle length were proposed. Finally, the effects of vehicle speed, vehicle weight, and d value on acceleration amplitude were statistically evaluated. It was found that the acceleration sensing node can be used for traffic flow, vehicle speed, and other types of monitoring. PMID:29206169

  18. Monitoring Single-Molecule Protein Dynamics with a Carbon Nanotube Transistor

    NASA Astrophysics Data System (ADS)

    Collins, Philip G.

    2014-03-01

    Nanoscale electronic devices like field-effect transistors have long promised to provide sensitive, label-free detection of biomolecules. Single-walled carbon nanotubes press this concept further by not just detecting molecules but also monitoring their dynamics in real time. Recent measurements have demonstrated this premise by monitoring the single-molecule processivity of three different enzymes: lysozyme, protein Kinase A, and the Klenow fragment of DNA polymerase I. With all three enzymes, single molecules tethered to nanotube transistors were electronically monitored for 10 or more minutes, allowing us to directly observe a range of activity including rare transitions to chemically inactive and hyperactive conformations. The high bandwidth of the nanotube transistors further allow every individual chemical event to be clearly resolved, providing excellent statistics from tens of thousands of turnovers by a single enzyme. Initial success with three different enzymes indicates the generality and attractiveness of the nanotube devices as a new tool to complement other single-molecule techniques. Research on transduction mechanisms provides the design rules necessary to further generalize this architecture and apply it to other proteins. The purposeful incorporation of just one amino acid is sufficient to fabricate effective, single molecule sensors from a wide range of enzymes or proteins.

  19. A combined geostatistical-optimization model for the optimal design of a groundwater quality monitoring network

    NASA Astrophysics Data System (ADS)

    Kolosionis, Konstantinos; Papadopoulou, Maria P.

    2017-04-01

    Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece, will be used in the proposed framework, with Regression Kriging applied to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool will determine a cost-effective observation-well network that contributes significant information to water managers and authorities. Observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be eliminated using estimation uncertainty and statistical error metrics without affecting the assessment of groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.

  20. Learning predictive statistics from temporal sequences: Dynamics and strategies

    PubMed Central

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E.; Kourtzi, Zoe

    2017-01-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics—that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments. PMID:28973111

  1. Controlled Gelation of Particle Suspensions Using Controlled Solvent Removal in Picoliter Droplets

    NASA Astrophysics Data System (ADS)

    Vuong, Sharon; Walker, Lynn; Anna, Shelley

    2013-11-01

    Droplets in microfluidic devices have proven useful as uniform picoliter reactors for nanoparticle synthesis and as components in tunable emulsions. However, there can be significant transport between the component phases depending on solubility and other factors. In the present talk, we show that water droplets trapped within a microfluidic device for tens of hours slowly dehydrate, concentrating the contents encapsulated within. We use this slow dehydration along with control of the initial droplet composition to monitor gelation of aqueous suspensions of spherical silica particles (Ludox) and disk-shaped clay particles (Laponite). Droplets are generated in a microfluidic device containing small wells that trap the droplets. We monitor the concentration process through size and shape changes of these droplets as a function of time in tens of droplets and use the large number of individual reactors to generate statistics regarding the gelation process. We also examine changes in suspension viscosity through fluorescent particle tracking as a function of dehydration rate, initial suspension concentration and initial droplet volume, and added salt, and compare the results with the Krieger-Dougherty model in which viscosity increases dramatically with particle volume fraction.
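    For reference, the Krieger-Dougherty relation the authors compare against can be written as a small helper; the maximum packing fraction and intrinsic viscosity used here are generic illustrative values, not fitted to the suspensions in the study.

```python
# Krieger-Dougherty relation: eta = eta_solvent * (1 - phi/phi_max)^(-[eta]*phi_max).
# Parameter values below are generic illustrative choices.
def krieger_dougherty(phi, eta_solvent=1.0e-3, phi_max=0.64, intrinsic_viscosity=2.5):
    """Suspension viscosity (Pa.s) as a function of particle volume fraction phi."""
    return eta_solvent * (1.0 - phi / phi_max) ** (-intrinsic_viscosity * phi_max)

for phi in (0.1, 0.3, 0.5, 0.6):
    print(f"phi={phi:.2f}: eta={krieger_dougherty(phi):.2e} Pa.s")
```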

  2. Anomaly-based intrusion detection for SCADA systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security attention, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may cause $75 billion in losses (a 2007 estimate). One of the interesting countermeasures for enhancing information system security is called intrusion detection. This paper will briefly discuss the history of research in intrusion detection techniques and introduce the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
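    A simplified sketch of the AAKR-plus-SPRT idea on synthetic correlated signals is given below; the memory size, kernel bandwidth, error rates and injected fault are all illustrative assumptions rather than the authors' configuration.

```python
# Hedged, simplified sketch: an auto-associative kernel regression (AAKR) model predicts
# each signal from a memory of normal observations, and an SPRT watches the residuals.
import numpy as np

rng = np.random.default_rng(7)
memory = rng.normal(size=(400, 3))                    # normal-operation training vectors
memory[:, 2] = 0.5 * memory[:, 0] + 0.5 * memory[:, 1] + 0.05 * rng.normal(size=400)

def aakr_predict(x, bandwidth=0.5):
    """Kernel-weighted reconstruction of x from the normal-operation memory."""
    w = np.exp(-np.sum((memory - x) ** 2, axis=1) / (2 * bandwidth ** 2))
    return w @ memory / w.sum()

# Scale of normal residuals on channel 2, estimated from the memory itself
resid0 = np.array([m[2] - aakr_predict(m)[2] for m in memory])
sigma = float(resid0.std())
m1 = 4 * sigma                                        # mean shift the SPRT is tuned to detect
alpha, beta = 0.001, 0.01                             # false-alarm and missed-detection rates
upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))

llr = 0.0
for k in range(300):
    x = rng.normal(size=3)
    x[2] = 0.5 * x[0] + 0.5 * x[1] + 0.05 * rng.normal()
    if k > 150:
        x[2] += 6 * sigma                             # injected fault on one channel
    r = x[2] - aakr_predict(x)[2]
    llr = max(llr + (m1 / sigma ** 2) * (r - m1 / 2), lower)   # Gaussian LLR, clamped below
    if llr >= upper:
        print(f"SPRT alarm at sample {k}")
        break
```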

  3. Comparison of simulation modeling and satellite techniques for monitoring ecological processes

    NASA Technical Reports Server (NTRS)

    Box, Elgene O.

    1988-01-01

    In 1985 improvements were made in the world climatic data base for modeling and predictive mapping; in individual process models and the overall carbon-balance models; and in the interface software for mapping the simulation results. Statistical analysis of the data base was begun. In 1986 mapping was shifted to NASA-Goddard. The initial approach involving pattern comparisons was modified to a more statistical approach. A major accomplishment was the expansion and improvement of a global data base of measurements of biomass and primary production, to complement the simulation data. The main accomplishments during 1987 included: production of a master tape with all environmental and satellite data and model results for the 1600 sites; development of a complete mapping system used for the initial color maps comparing annual and monthly patterns of Normalized Difference Vegetation Index (NDVI), actual evapotranspiration, net primary productivity, gross primary productivity, and net ecosystem production; collection of more biosphere measurements for eventual improvement of the biological models; and development of some initial monthly models for primary productivity, based on satellite data.

  4. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods as scaling laws, universality, fractal dimension, renormalization group, to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.

  5. Error, Power, and Blind Sentinels: The Statistics of Seagrass Monitoring

    PubMed Central

    Schultz, Stewart T.; Kruschel, Claudia; Bakran-Petricioli, Tatjana; Petricioli, Donat

    2015-01-01

    We derive statistical properties of standard methods for monitoring of habitat cover worldwide, and criticize them in the context of mandated seagrass monitoring programs, as exemplified by Posidonia oceanica in the Mediterranean Sea. We report the novel result that cartographic methods with non-trivial classification errors are generally incapable of reliably detecting habitat cover losses less than about 30 to 50%, and the field labor required to increase their precision can be orders of magnitude higher than that required to estimate habitat loss directly in a field campaign. We derive a universal utility threshold of classification error in habitat maps that represents the minimum habitat map accuracy above which direct methods are superior. Widespread government reliance on blind-sentinel methods for monitoring seafloor can obscure the gradual and currently ongoing losses of benthic resources until the time has long passed for meaningful management intervention. We find two classes of methods with very high statistical power for detecting small habitat cover losses: 1) fixed-plot direct methods, which are over 100 times as efficient as direct random-plot methods in a variable habitat mosaic; and 2) remote methods with very low classification error such as geospatial underwater videography, which is an emerging, low-cost, non-destructive method for documenting small changes at millimeter visual resolution. General adoption of these methods and their further development will require a fundamental cultural change in conservation and management bodies towards the recognition and promotion of requirements of minimal statistical power and precision in the development of international goals for monitoring these valuable resources and the ecological services they provide. PMID:26367863

  6. Information Measures for Statistical Orbit Determination

    ERIC Educational Resources Information Center

    Mashiku, Alinda K.

    2013-01-01

    The current Situational Space Awareness (SSA) is faced with a huge task of tracking the increasing number of space objects. The tracking of space objects requires frequent and accurate monitoring for orbit maintenance and collision avoidance using methods for statistical orbit determination. Statistical orbit determination enables us to obtain…

  7. An Improved LC-ESI-MS/MS Method to Quantify Pregabalin in Human Plasma and Dry Plasma Spot for Therapeutic Monitoring and Pharmacokinetic Applications.

    PubMed

    Dwivedi, Jaya; Namdev, Kuldeep K; Chilkoti, Deepak C; Verma, Surajpal; Sharma, Swapnil

    2018-06-06

    Therapeutic drug monitoring (TDM) of anti-epileptic drugs provides a valid clinical tool for optimization of overall therapy. However, TDM is challenging due to the high storage/shipment costs of biological samples (plasma/blood) and the limited availability of laboratories providing TDM services. Sampling in the form of dry plasma spots (DPS) or dry blood spots (DBS) is a suitable alternative to overcome these issues. An improved, simple, rapid, and stability-indicating method for quantification of pregabalin in human plasma and DPS has been developed and validated. Analyses were performed on a liquid chromatography-tandem mass spectrometer under the positive ionization mode of the electrospray interface. Pregabalin-d4 was used as internal standard, and the chromatographic separations were performed on a Poroshell 120 EC-C18 column using an isocratic mobile phase at a flow rate of 1 mL/min. Stability of pregabalin in DPS was evaluated under simulated real-time conditions. Extraction procedures from plasma and DPS samples were compared using statistical tests. The method was validated according to the FDA method validation guideline. The method was linear over the concentration ranges of 20-16000 ng/mL and 100-10000 ng/mL in plasma and DPS, respectively. DPS samples were found to be stable for only one week upon storage at room temperature and for at least four weeks at freezing temperature (-20 ± 5 °C). The method was applied for quantification of pregabalin in over 600 samples from a clinical study. Statistical analyses revealed that the two extraction procedures for plasma and DPS samples showed a statistically insignificant difference and can be used interchangeably without any bias. The proposed method involves simple and rapid sample processing steps that do not require a pre- or post-column derivatization procedure. The method is suitable for routine pharmacokinetic analysis and therapeutic monitoring of pregabalin.

  8. Spatial and temporal variations of metal content and water quality in the Belaya River Basin

    NASA Astrophysics Data System (ADS)

    Fashchevskaia, T. B.; Motovilov, Y.

    2016-12-01

    The aim of this research is to identify the spatiotemporal regularities of the dynamics of iron, copper and zinc contents in the streams of the Belaya River basin. The Belaya River is situated in the South Ural region and is one of the biggest tributaries of the Volga River basin, with a catchment area of 142 000 km2. For more than sixty years, diverse economic activities have been carried out in the Belaya River basin, and the intensity of this activity shows high temporal variability. The leading industries in the region are oil, mining, petroleum processing, chemistry and petrochemistry, mechanical engineering, metallurgy, and power generation. The dynamics of human activities in the catchment and intra- and inter-annual changes in water quality are analyzed for the period 1969-2007. Inter-annual dynamics of the metal content in the river waters were identified on the basis of long-term hydrological monitoring statistics at 32 sites. It was found that the changing intensity of economic activities in the Belaya River basin causes statistically significant changes in the metal content of the river network. Statistically homogeneous time intervals were established for each monitoring site, and within these intervals reliable averaged quantitative estimates of water quality were obtained. Empirical probability distributions of iron, copper and zinc concentrations for various phases of the water regime at all investigated monitoring sites were approximated by Pearson type III curves, and the mean concentrations, coefficients of variation and asymmetry, as well as the concentrations at frequencies from 1% to 95%, were estimated. It was found that, by the end of the study period, the average long-term concentrations of iron and copper exceeded the MAC for fishery use, while those of zinc fell below the MAC in many streams of the Belaya River basin. Acknowledgements. The work was financially supported by the Russian Foundation for Basic Research (Grant 15-05-09022).
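    The distribution-fitting step can be sketched with SciPy's Pearson type III model on synthetic concentrations; the data, units and the chosen frequencies are placeholders.

```python
# Small sketch of the Pearson type III fitting step (synthetic concentrations, mg/L):
# fit the curve and read off concentrations at chosen non-exceedance frequencies.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
conc = rng.gamma(shape=2.0, scale=0.05, size=120)        # stand-in for observed metal levels

skew, loc, scale = stats.pearson3.fit(conc)
mean, var = stats.pearson3.stats(skew, loc=loc, scale=scale, moments="mv")
cv = float(np.sqrt(var)) / float(mean)

print(f"fitted skew={skew:.2f}, mean={float(mean):.3f} mg/L, Cv={cv:.2f}")
for p in (0.01, 0.50, 0.95):
    print(f"concentration at non-exceedance {p:.0%}: {stats.pearson3.ppf(p, skew, loc, scale):.3f} mg/L")
```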

  9. Monitoring as a Means to Focus Research and Conservation - The Grassland Bird Monitoring Example

    Treesearch

    Brenda Dale; Michael Norton; Constance Downes; Brian Collins

    2005-01-01

    One recommendation of the Canadian Landbird Monitoring Strategy of Partners in Flight-Canada is to improve monitoring capability for rapidly declining grassland birds. In Canada, we lack statistical power for many grassland species because they are detected in small numbers, on a low number of routes, or show high year-to-year variability. In developing a Grassland...

  10. Profiling and multivariate statistical analysis of Panax ginseng based on ultra-high-performance liquid chromatography coupled with quadrupole-time-of-flight mass spectrometry.

    PubMed

    Wu, Wei; Sun, Le; Zhang, Zhe; Guo, Yingying; Liu, Shuying

    2015-03-25

    An ultra-high-performance liquid chromatography coupled with quadrupole-time-of-flight mass spectrometry (UHPLC-Q-TOF-MS) method was developed for the detection and structural analysis of ginsenosides in white ginseng and related processed products (red ginseng). Original neutral, malonyl, and chemically transformed ginsenosides were identified in white and red ginseng samples. The aglycone types of ginsenosides were determined by MS/MS as PPD (m/z 459), PPT (m/z 475), C-24, -25 hydrated-PPD or PPT (m/z 477 or m/z 493), and Δ20(21)-or Δ20(22)-dehydrated-PPD or PPT (m/z 441 or m/z 457). Following the structural determination, the UHPLC-Q-TOF-MS-based chemical profiling coupled with multivariate statistical analysis method was applied for global analysis of white and processed ginseng samples. The chemical markers present between the processed products red ginseng and white ginseng could be assigned. Process-mediated chemical changes were recognized as the hydrolysis of ginsenosides with large molecular weight, chemical transformations of ginsenosides, changes in malonyl-ginsenosides, and generation of 20-(R)-ginsenoside enantiomers. The relative contents of compounds classified as PPD, PPT, malonyl, and transformed ginsenosides were calculated based on peak areas in ginseng before and after processing. This study provides possibility to monitor multiple components for the quality control and global evaluation of ginseng products during processing. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Monitoring larval populations of the Douglas-fir tussock moth and the western spruce budworm on permanent plots: sampling methods and statistical properties of data

    Treesearch

    A.R. Mason; H.G. Paul

    1994-01-01

    Procedures for monitoring larval populations of the Douglas-fir tussock moth and the western spruce budworm are recommended based on many years experience in sampling these species in eastern Oregon and Washington. It is shown that statistically reliable estimates of larval density can be made for a population by sampling host trees in a series of permanent plots in a...

  12. A statistical assessment of pesticide pollution in surface waters using environmental monitoring data: Chlorpyrifos in Central Valley, California.

    PubMed

    Wang, Dan; Singhasemanon, Nan; Goh, Kean S

    2016-11-15

    Pesticides are routinely monitored in surface waters and resultant data are analyzed to assess whether their uses will damage aquatic eco-systems. However, the utility of the monitoring data is limited because of the insufficiency in the temporal and spatial sampling coverage and the inability to detect and quantify trace concentrations. This study developed a novel assessment procedure that addresses those limitations by combining 1) statistical methods capable of extracting information from concentrations below changing detection limits, 2) statistical resampling techniques that account for uncertainties rooted in the non-detects and insufficient/irregular sampling coverage, and 3) multiple lines of evidence that improve confidence in the final conclusion. This procedure was demonstrated by an assessment on chlorpyrifos monitoring data in surface waters of California's Central Valley (2005-2013). We detected a significant downward trend in the concentrations, which cannot be observed by commonly-used statistical approaches. We assessed that the aquatic risk was low using a probabilistic method that works with non-detects and has the ability to differentiate indicator groups with varying sensitivity. In addition, we showed that the frequency of exceedance over ambient aquatic life water quality criteria was affected by pesticide use, precipitation and irrigation demand in certain periods anteceding the water sampling events. Copyright © 2016 Elsevier B.V. All rights reserved.
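    One standard way to use observations below changing detection limits, sketched here on synthetic data, is a censored maximum-likelihood fit of a lognormal concentration model; this illustrates the general idea only and is not the assessment procedure developed in the paper.

```python
# Censored maximum-likelihood sketch: detected values contribute densities, non-detects
# contribute the CDF mass below their (possibly changing) detection limits.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(9)
true = rng.lognormal(mean=np.log(0.02), sigma=1.0, size=150)     # ug/L, synthetic
limits = rng.choice([0.01, 0.02, 0.05], size=150)                # changing detection limits
detected = true >= limits
values = np.where(detected, true, limits)                        # non-detects stored as their limit

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    ll_det = stats.norm.logpdf(np.log(values[detected]), mu, sigma) - np.log(values[detected])
    ll_cen = stats.norm.logcdf(np.log(values[~detected]), mu, sigma)
    return -(ll_det.sum() + ll_cen.sum())

res = optimize.minimize(neg_log_lik, x0=[np.log(0.01), 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"geometric mean = {np.exp(mu_hat):.4f} ug/L, detected fraction = {detected.mean():.2f}")
```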

  13. Benefits of Outsourcing Strategy and IT Technology in Clinical Trials.

    PubMed

    Stamenovic, Milorad; Dobraca, Amra

    2017-09-01

    The aim of this paper is to describe several models of outsourcing; the models are numerous, and the response to different types of risk and the gain in quality depend on the individual problem and situation. Deciding whether to outsource, and whether to build or buy new information technology (IT), is a key question for contract research organizations (CROs) and pharmaceutical companies dealing with clinical trials, so this paper also presents a business model that can make the decision-making process less time-consuming, less segmented and more efficient. The paper is descriptive in character and reviews the literature that deals with these issues. Outsourcing should enable optimal capacity flexibility (technology should be outsourced only to the optimal extent, not entirely). The goal with CRO partners is to establish equivalent levels of global quality, as extensions of other research and development activities, by aligning the performance standards of alliance partners with the best standards of the industry. IT is gaining significance at every stage of a clinical study and is an essential element of its quality (for monitoring of clinical site activities, data collection and management, medical monitoring, statistical programming, statistical analysis and clinical study reporting). CROs are able to maximize work within global development, support the notion of a fully integrated outsourced company, facilitate the use of similar business processes and norms by reusing established CRO standards, and improve operational decision making within outsourced studies by providing consistent and current information across outsourced and in-house activities.

  14. Benefits of Outsourcing Strategy and IT Technology in Clinical Trials

    PubMed Central

    Stamenovic, Milorad; Dobraca, Amra

    2017-01-01

    Introduction: The aim of this paper is to describe some models of outsourcing (these are numerous, and the response to different types of risk and the gain in quality depends on the individual problem and situation). Deciding whether to outsource and whether to build or buy new information technology (IT) is a question for contract research organizations (CROs) and pharma companies dealing with clinical trials, so the aim of this paper is to show a business model that could make the decision-making process less time consuming, less segmented and more efficient. Material and methods: This paper has a descriptive character and represents a review of the literature that deals with the described issues. Results: Outsourcing should enable optimal capacity flexibility (technology should be outsourced only optimally, not entirely). The goal with CRO partners is to establish equivalent levels of global quality, as extensions of other research and development activities (by aligning the performance standards of alliance partners with the best standards of the industry). IT is gaining greater significance at each stage of a clinical study and represents an inevitable element of the quality of a clinical study (for the purpose of monitoring clinical site activities, data collection and management, medical monitoring, statistical programming, statistical analysis, and clinical study reporting). Conclusion: CROs are able to maximize work within CRO global development, support the notion of a fully integrated outsourced company, facilitate the use of similar business processes and norms by reusing established CRO standards, and improve CRO operational decision making within outsourced studies by providing consistent and current information across outsourced and in-house activities. PMID:29114116

  15. Using LabView for real-time monitoring and tracking of multiple biological objects

    NASA Astrophysics Data System (ADS)

    Nikolskyy, Aleksandr I.; Krasilenko, Vladimir G.; Bilynsky, Yosyp Y.; Starovier, Anzhelika

    2017-04-01

    Today, the real-time study and tracking of the movement dynamics of various biological objects is important and widely researched. Features of objects, conditions of their visualization and model parameters strongly influence the choice of optimal methods and algorithms for a specific task. Therefore, to automate the processes of adapting recognition and tracking algorithms, several LabView project trackers are considered in the article. The projects allow templates for training and retraining the system to be changed quickly, and they adapt to the speed of objects and the statistical characteristics of noise in the images. New functions for comparing images or their features, descriptors and pre-processing methods will be discussed. The experiments carried out to test the trackers on real video files will be presented and analyzed.

  16. Bioassessment Tools for Stony Corals: Monitoring Approaches and Proposed Sampling Plan for the U.S. Virgin Islands

    EPA Science Inventory

    This document describes three general approaches to the design of a sampling plan for biological monitoring of coral reefs. Status assessment, trend detection and targeted monitoring each require a different approach to site selection and statistical analysis. For status assessm...

  17. Using the Traffic monitoring guide to develop a truck weight sampling procedure for use in Virginia : final report.

    DOT National Transportation Integrated Search

    1992-01-01

    The Traffic Monitoring Guide (TMG) provides a method for the development of a statistically based procedure to monitor traffic characteristics such as traffic loadings. Truck weight data in particular are a major element of the pavement management pr...

  18. 78 FR 11090 - Steel Import Monitoring and Analysis System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-15

    ...-2549-01] RIN 0625-AA93 Steel Import Monitoring and Analysis System AGENCY: Import Administration... Commerce (the Department) publishes this action to make final a rule to extend the Steel Import Monitoring... public statistical data on steel imports entering the United States seven weeks earlier than it would...

  19. Statistical Techniques to Analyze Pesticide Data Program Food Residue Observations.

    PubMed

    Szarka, Arpad Z; Hayworth, Carol G; Ramanarayanan, Tharacad S; Joseph, Robert S I

    2018-06-26

    The U.S. EPA conducts dietary-risk assessments to ensure that levels of pesticides on food in the U.S. food supply are safe. Often these assessments utilize conservative residue estimates, maximum residue levels (MRLs), and a high-end estimate derived from registrant-generated field-trial data sets. A more realistic estimate of consumers' pesticide exposure from food may be obtained by utilizing residues from food-monitoring programs, such as the Pesticide Data Program (PDP) of the U.S. Department of Agriculture. A substantial portion of food-residue concentrations in PDP monitoring programs are below the limits of detection (left-censored), which makes the comparison of regulatory-field-trial and PDP residue levels difficult. In this paper, we present a novel adaptation of established statistical techniques, the Kaplan-Meier estimator (K-M), robust regression on order statistics (ROS), and the maximum-likelihood estimator (MLE), to quantify pesticide-residue concentrations in the presence of heavily censored data sets. The examined statistical approaches include the most commonly used parametric and nonparametric methods for handling left-censored data that have been used in the fields of medical and environmental sciences. This work presents a case study in which data of thiamethoxam residue on bell pepper generated from registrant field trials were compared with PDP-monitoring residue values. The results from the statistical techniques were evaluated and compared with commonly used simple substitution methods for the determination of summary statistics. It was found that the MLE is the most appropriate statistical method to analyze this residue data set. Using the MLE technique, the data analyses showed that the median and mean PDP bell pepper residue levels were approximately 19 and 7 times lower, respectively, than the corresponding statistics of the field-trial residues.
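
    As a rough illustration of the MLE approach for left-censored residue data, the sketch below fits a lognormal distribution by maximizing a likelihood in which non-detects contribute the probability mass below their detection limit; the function name, the assumed lognormal form, and the toy data are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy import optimize, stats

def censored_lognormal_mle(values, detected):
    """Fit a lognormal distribution to left-censored residue data by maximum
    likelihood.  `values` holds the measured concentration for detects and the
    detection limit for non-detects; `detected` marks true detections."""
    values = np.asarray(values, dtype=float)
    detected = np.asarray(detected, dtype=bool)
    logs = np.log(values)

    def neg_log_likelihood(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                       # keep sigma positive
        # Detects contribute the lognormal density; non-detects contribute the
        # probability mass below their detection limit (the CDF).
        ll_det = stats.norm.logpdf(logs[detected], mu, sigma) - logs[detected]
        ll_cen = stats.norm.logcdf(logs[~detected], mu, sigma)
        return -(ll_det.sum() + ll_cen.sum())

    start = [logs.mean(), np.log(logs.std() + 1e-6)]
    res = optimize.minimize(neg_log_likelihood, start, method="Nelder-Mead")
    mu, sigma = res.x[0], np.exp(res.x[1])
    return np.exp(mu + sigma ** 2 / 2), np.exp(mu)      # lognormal mean, median

# Three detects plus two non-detects reported at a 0.01 ppm detection limit.
print(censored_lognormal_mle([0.05, 0.02, 0.08, 0.01, 0.01],
                             [True, True, True, False, False]))
```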

  20. National Aquatic Resource Surveys & Statistics: Role of statistics in the development of a national monitoring program

    EPA Science Inventory

    The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...

  1. Cost considerations for long-term ecological monitoring

    USGS Publications Warehouse

    Caughlan, L.; Oakley, K.L.

    2001-01-01

    For an ecological monitoring program to be successful over the long-term, the perceived benefits of the information must justify the cost. Financial limitations will always restrict the scope of a monitoring program, hence the program’s focus must be carefully prioritized. Clearly identifying the costs and benefits of a program will assist in this prioritization process, but this is easier said than done. Frequently, the true costs of monitoring are not recognized and are, therefore, underestimated. Benefits are rarely evaluated, because they are difficult to quantify. The intent of this review is to assist the designers and managers of long-term ecological monitoring programs by providing a general framework for building and operating a cost-effective program. Previous considerations of monitoring costs have focused on sampling design optimization. We present cost considerations of monitoring in a broader context. We explore monitoring costs, including both budgetary costs, what dollars are spent on, and economic costs, which include opportunity costs. Often, the largest portion of a monitoring program budget is spent on data collection, and other, critical aspects of the program, such as scientific oversight, training, data management, quality assurance, and reporting, are neglected. Recognizing and budgeting for all program costs is therefore a key factor in a program’s longevity. The close relationship between statistical issues and cost is discussed, highlighting the importance of sampling design, replication and power, and comparing the costs of alternative designs through pilot studies and simulation modeling. A monitoring program development process that includes explicit checkpoints for considering costs is presented. The first checkpoint occurs during the setting of objectives and during sampling design optimization. The last checkpoint occurs once the basic shape of the program is known, and the costs and benefits, or alternatively the cost-effectiveness, of each program element can be evaluated. Moving into the implementation phase without careful evaluation of costs and benefits is risky because if costs are later found to exceed benefits, the program will fail. The costs of development, which can be quite high, will have been largely wasted. Realistic expectations of costs and benefits will help ensure that monitoring programs survive the early, turbulent stages of development and the challenges posed by fluctuating budgets during implementation.

  2. Structural health monitoring for ship structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrar, Charles; Park, Gyuhae; Angel, Marian

    2009-01-01

    Currently the Office of Naval Research is supporting the development of structural health monitoring (SHM) technology for U.S. Navy ship structures. This application is particularly challenging because of the physical size of these structures, the widely varying and often extreme operational and environmental conditions associated with these ships' missions, lack of data from known damage conditions, limited sensing that was not designed specifically for SHM, and the management of the vast amounts of data that can be collected during a mission. This paper will first define a statistical pattern recognition paradigm for SHM by describing the four steps of (1) Operational Evaluation, (2) Data Acquisition, (3) Feature Extraction, and (4) Statistical Classification of Features as they apply to ship structures. Note that inherent in the last three steps of this process are additional tasks of data cleansing, compression, normalization and fusion. The presentation will discuss ship structure SHM challenges in the context of applying various SHM approaches to sea trials data measured on an aluminum multi-hull high-speed ship, the HSV-2 Swift. To conclude, the paper will discuss several outstanding issues that need to be addressed before SHM can make the transition from a research topic to actual field applications on ship structures and suggest approaches for addressing these issues.

  3. Analysis of TCE Fate and Transport in Karst Groundwater Systems Using Statistical Mixed Models

    NASA Astrophysics Data System (ADS)

    Anaya, A. A.; Padilla, I. Y.

    2012-12-01

    Karst groundwater systems are highly productive and provide an important fresh water resource for human development and ecological integrity. Their high productivity is often associated with conduit flow and high matrix permeability. The same characteristics that make these aquifers productive also make them highly vulnerable to contamination and a likely route for contaminant exposure. Of particular interest are trichloroethylene (TCE) and Di-(2-Ethylhexyl) phthalate (DEHP). These chemicals have been identified as potential precursors of pre-term birth, a leading cause of neonatal complications with a significant health and societal cost. Both of these contaminants have been found in the karst groundwater formations in this area of the island. The general objectives of this work are to: (1) develop fundamental knowledge and determine the processes controlling the release, mobility, persistence, and possible pathways of contaminants in karst groundwater systems, and (2) characterize transport processes in conduit and diffusion-dominated flow under base flow and storm flow conditions. The work presented herein focuses on the use of geo-hydro statistical tools to characterize flow and transport processes under different flow regimes, and their application in the analysis of fate and transport of TCE. Multidimensional, laboratory-scale Geo-Hydrobed models (GHM) were used for this purpose. The models consist of stainless-steel tanks containing karstified limestone blocks collected from the karst aquifer formation of northern Puerto Rico. The models integrate a network of sampling wells to monitor flow, pressure, and solute concentrations temporally and spatially. Experimental work entails injecting dissolved CaCl2 tracers and TCE in the upstream boundary of the GHM while monitoring TCE and tracer concentrations spatially and temporally in the limestone under different groundwater flow regimes. Analysis of the temporal and spatial concentration distributions of solutes indicates a highly heterogeneous system resulting in large preferential flow components. The distributions are highly correlated with statistically-developed spatial flow models. A high degree of tailing in the breakthrough curves indicates significant mass transfer limitations, particularly in diffuse flow regions. Higher flow rates in the system result in increasing preferential flow region volumes, but lower mass transfer limitations. Future work will involve experiments with non-aqueous phase liquid TCE, DEHP, and a mixture of these, and geo-temporal statistical modeling. This work is supported by the U.S. Department of Energy, Savannah River (Grant Award No. DE-FG09-07SR22571), and the National Institute of Environmental Health Sciences (NIEHS, Grant Award No. P42ES017198).

  4. Facility Monitoring: A Qualitative Theory for Sensor Fusion

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    2001-01-01

    Data fusion and sensor management approaches have largely been implemented with centralized and hierarchical architectures. Numerical and statistical methods are the most common data fusion methods found in these systems. Given the proliferation and low cost of processing power, there is now an emphasis on designing distributed and decentralized systems. These systems use analytical/quantitative techniques or qualitative reasoning methods for data fusion. Based on other work by the author, a sensor may be treated as a highly autonomous (decentralized) unit. Each highly autonomous sensor (HAS) is capable of extracting qualitative behaviours from its data. For example, it detects spikes, disturbances, noise levels, off-limit excursions, step changes, drift, and other typical measured trends. In this context, this paper describes a distributed sensor fusion paradigm and theory where each sensor in the system is a HAS. Hence, given the rich qualitative information from each HAS, a paradigm and formal definitions are given so that sensors and processes can reason and make decisions at the qualitative level. This approach to sensor fusion makes possible the implementation of intuitive (effective) methods to monitor, diagnose, and compensate processes/systems and their sensors. This paradigm facilitates a balanced distribution of intelligence (code and/or hardware) to the sensor level, the process/system level, and a higher controller level. The primary application of interest is in intelligent health management of rocket engine test stands.

  5. Process monitoring in intensive care with the use of cumulative expected minus observed mortality and risk-adjusted P charts.

    PubMed

    Cockings, Jerome G L; Cook, David A; Iqbal, Rehana K

    2006-02-01

    A health care system is a complex adaptive system. The effect of a single intervention, incorporated into a complex clinical environment, may be different from that expected. A national database such as the Intensive Care National Audit & Research Centre (ICNARC) Case Mix Programme in the UK represents a centralised monitoring, surveillance and reporting system for retrospective quality and comparative audit. This can be supplemented with real-time process monitoring at a local level for continuous process improvement, allowing early detection of the impact of both unplanned and deliberately imposed changes in the clinical environment. Demographic and UK Acute Physiology and Chronic Health Evaluation II (APACHE II) data were prospectively collected on all patients admitted to a UK regional hospital between 1 January 2003 and 30 June 2004 in accordance with the ICNARC Case Mix Programme. We present a cumulative expected minus observed (E-O) plot and the risk-adjusted p chart as methods of continuous process monitoring. We describe the construction and interpretation of these charts and show how they can be used to detect planned or unplanned organisational process changes affecting mortality outcomes. Five hundred and eighty-nine adult patients were included. The overall death rate was 0.78 of predicted. Calibration showed excess survival in ranges above 30% risk of death. The E-O plot confirmed a survival above that predicted. Small transient variations were seen in the slope that could represent random effects, or real but transient changes in the quality of care. The risk-adjusted p chart showed several observations below the 2 SD control limits of the expected mortality rate. These plots provide rapid analysis of risk-adjusted performance suitable for local application and interpretation. The E-O chart provided rapid easily visible feedback of changes in risk-adjusted mortality, while the risk-adjusted p chart allowed statistical evaluation. Local analysis of risk-adjusted mortality data with an E-O plot and a risk-adjusted p chart is feasible and allows the rapid detection of changes in risk-adjusted outcome of intensive care patients. This complements the centralised national database, which is more archival and comparative in nature.
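
    A minimal sketch of the cumulative expected-minus-observed (E-O) statistic described above, using toy admission data; the function name and the numbers are illustrative, and this is not the ICNARC tooling.

```python
import numpy as np

def cumulative_e_minus_o(predicted_risk, died):
    """Cumulative expected-minus-observed mortality: for each consecutive
    admission, add the model-predicted probability of death and subtract 1 if
    the patient died.  An upward drift means fewer deaths than predicted."""
    predicted_risk = np.asarray(predicted_risk, dtype=float)
    died = np.asarray(died, dtype=float)
    return np.cumsum(predicted_risk - died)

# Toy example: five admissions with APACHE II-style predicted risks of death.
curve = cumulative_e_minus_o([0.10, 0.40, 0.05, 0.70, 0.20], [0, 0, 0, 1, 0])
print(curve)   # [0.10, 0.50, 0.55, 0.25, 0.45]
```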

  6. [A personal computer-based system for online monitoring of neurologic intensive care patients].

    PubMed

    Stoll, M; Hamann, G; Jost, V; Schimrigk, K

    1992-03-01

    In the management of neurological intensive care patients with an intracranial space-consuming process the measurement and recording of intracranial pressure together with arterial blood pressure is of special interest. These parameters can be used to monitor the treatment of brain edema and hypertension. Intracranial pressure measurement is also important in the diagnosis of the various subtypes of hydrocephalus. Not only the absolute figures, but also the recognition of specific pressure-patterns is of particular clinical and scientific interest. This new, easily installed and inexpensive system comprises a PC and a conventional monitor, which are connected by an AD-conversion card. Our software, specially developed for this system demonstrates, stores and prints the online-course and the trend of the measurements. In addition it is also possible to view the online-course of conspicuous parts of the trend curve retrospectively and to use these values for statistical analyses. Object-orientated software development techniques were used for flexible graphic output on the screen, printer or to a file. Though developed for this specific purpose, this system is also suitable for recording continuous, longer-term measurements in general.

  7. Recommendations for standardizing validation procedures assessing physical activity of older persons by monitoring body postures and movements.

    PubMed

    Lindemann, Ulrich; Zijlstra, Wiebren; Aminian, Kamiar; Chastin, Sebastien F M; de Bruin, Eling D; Helbostad, Jorunn L; Bussmann, Johannes B J

    2014-01-10

    Physical activity is an important determinant of health and well-being in older persons and contributes to their social participation and quality of life. Hence, assessment tools are needed to study this physical activity in free-living conditions. Wearable motion sensing technology is used to assess physical activity. However, there is a lack of harmonisation of validation protocols and applied statistics, which makes it hard to compare available and future studies. Therefore, the aim of this paper is to formulate recommendations for assessing the validity of sensor-based activity monitoring in older persons, with a focus on the measurement of body postures and movements. Validation studies of body-worn devices providing parameters on body postures and movements were identified and summarized, and an extensive interactive process between the authors resulted in recommendations about: information on the assessed persons, the technical system, and the analysis of relevant parameters of physical activity, based on a standardized and semi-structured protocol. The recommended protocols can be regarded as a first attempt to standardize validity studies in the area of monitoring physical activity.

  8. Studying fish near ocean energy devices using underwater video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Hull, Ryan E.; Harker-Klimes, Genevra EL

    The effects of energy devices on fish populations are not well-understood, and studying the interactions of fish with tidal and instream turbines is challenging. To address this problem, we have evaluated algorithms to automatically detect fish in underwater video and propose a semi-automated method for ocean and river energy device ecological monitoring. The key contributions of this work are the demonstration of a background subtraction algorithm (ViBE) that detected 87% of human-identified fish events and is suitable for use in a real-time system to reduce data volume, and the demonstration of a statistical model to classify detections as fish or not fish that achieved a correct classification rate of 85% overall and 92% for detections larger than 5 pixels. Specific recommendations for underwater video acquisition to better facilitate automated processing are given. The recommendations will help energy developers put effective monitoring systems in place, and could lead to a standard approach that simplifies the monitoring effort and advances the scientific understanding of the ecological impacts of ocean and river energy devices.
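
    The detection stage can be sketched roughly as below. Note the caveats: ViBE itself is not bundled with core OpenCV, so the MOG2 background subtractor stands in as an analogous algorithm; the video path is a placeholder; and the 5-pixel size cut simply echoes the size threshold mentioned in the abstract.

```python
import cv2

# "fish_camera.mp4" is a placeholder path; MOG2 stands in for ViBE, which is
# not shipped with core OpenCV.
capture = cv2.VideoCapture("fish_camera.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
min_pixels = 5                       # abstract reports better results above 5 px

while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                          # foreground mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle
    # OpenCV 4 return convention: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    detections = [c for c in contours if cv2.contourArea(c) >= min_pixels]
    # Each detection would next be passed to a fish / not-fish classifier.

capture.release()
```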

  9. Satellite-based monitoring of cotton evapotranspiration

    NASA Astrophysics Data System (ADS)

    Dalezios, Nicolas; Dercas, Nicholas; Tarquis, Ana Maria

    2016-04-01

    Water for agricultural use represents the largest share among all water uses. Vulnerability in agriculture is influenced, among others, by extended periods of water shortage in regions exposed to droughts. Advanced technological approaches and methodologies, including remote sensing, are increasingly incorporated for the assessment of irrigation water requirements. In this paper, remote sensing techniques are integrated for the estimation and monitoring of crop evapotranspiration ETc. The study area is Thessaly, central Greece, a drought-prone agricultural region. Cotton fields in a small agricultural sub-catchment in Thessaly are used as an experimental site. Daily meteorological data and weekly field data are recorded throughout seven growing seasons (2004-2010) for the computation of reference evapotranspiration ETo, crop coefficient Kc and cotton crop ETc based on conventional data. Satellite data (Landsat TM) for the corresponding period are processed to estimate cotton crop coefficient Kc and cotton crop ETc and delineate its spatiotemporal variability. The methodology is applied for monitoring Kc and ETc during the growing season in the selected sub-catchment. Several error statistics are used, showing very good agreement with ground-truth observations.
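
    The core remote-sensing calculation is ETc = Kc x ETo. The sketch below uses a generic linear Kc-NDVI relation with placeholder coefficients purely for illustration; the study instead derives Kc from Landsat TM data and field measurements.

```python
import numpy as np

def crop_coefficient_from_ndvi(ndvi, a=1.25, b=-0.2):
    """Illustrative linear Kc-NDVI relation (Kc = a*NDVI + b); the coefficients
    are placeholders, not the study's Landsat-derived calibration."""
    return a * np.asarray(ndvi, dtype=float) + b

def crop_evapotranspiration(ndvi, eto_mm_per_day):
    """ETc = Kc * ETo (single crop coefficient approach)."""
    return crop_coefficient_from_ndvi(ndvi) * eto_mm_per_day

# Mid-season cotton pixel: NDVI of 0.75 and reference ET of 6 mm/day.
print(crop_evapotranspiration(0.75, 6.0))   # about 4.4 mm/day with these coefficients
```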

  10. The Canarian Seismic Monitoring Network: design, development and first result

    NASA Astrophysics Data System (ADS)

    D'Auria, Luca; Barrancos, José; Padilla, Germán D.; García-Hernández, Rubén; Pérez, Aaron; Pérez, Nemesio M.

    2017-04-01

    Tenerife is an active volcanic island which experienced several eruptions of moderate intensity in historical times, and a few explosive eruptions in the Holocene. The increasing population density and the considerable number of tourists are constantly raising the volcanic risk. In June 2016 Instituto Volcanologico de Canarias started the deployment of a seismological volcano monitoring network consisting of 15 broadband seismic stations. The network became fully operational in November 2016. The aims of the network are both volcano monitoring and scientific research. Currently, data are continuously recorded and processed in real time. Seismograms, hypocentral parameters, statistical information about the seismicity and other data are published on a web page. We show the technical characteristics of the network and an estimate of its detection threshold and earthquake location performances. Furthermore, we present other near-real-time procedures applied to the data: analysis of the ambient noise for determining the shallow velocity model and temporal velocity variations, detection of earthquake multiplets through massive data mining of the seismograms, and automatic relocation of events through double-difference location.

  11. Assessing Aircraft Supply Air to Recommend Compounds for Timely Warning of Contamination

    NASA Astrophysics Data System (ADS)

    Fox, Richard B.

    Taking aircraft out of service for even one day to correct fume-in-cabin events can cost the industry roughly $630 million per year in lost revenue. The quantitative correlational study investigated relationships between measured concentrations of contaminants in bleed air and the probability of odor detectability. Data were collected from 94 aircraft engine and auxiliary power unit (APU) bleed air tests in an archival data set spanning 1997 to 2011; Pearson correlation found no relationships and was followed by regression analysis for individual contaminants. Significant relationships of concentrations of compounds in bleed air to probability of odor detectability were found (p<0.05), as well as between compound concentration and probability of sensory irritancy detectability. Study results may be useful to establish early warning levels. Predictive trend monitoring, a method to identify potential pending failure modes within a mechanical system, may allow scheduled down-time for maintenance as a planned event, rather than repair after a mechanical failure, and thereby reduce operational costs associated with odor-in-cabin events. Twenty compounds (independent variables) were found statistically significant as related to probability of odor detectability (dependent variable 1). Seventeen compounds (independent variables) were found statistically significant as related to probability of sensory irritancy detectability (dependent variable 2). Additional research was recommended to further investigate relationships between concentrations of contaminants and probability of odor detectability or probability of sensory irritancy detectability for all turbine oil brands. Further research on implementation of predictive trend monitoring may be warranted to demonstrate how the monitoring process might be applied in flight.

  12. Data mining spacecraft telemetry: towards generic solutions to automatic health monitoring and status characterisation

    NASA Astrophysics Data System (ADS)

    Royer, P.; De Ridder, J.; Vandenbussche, B.; Regibo, S.; Huygen, R.; De Meester, W.; Evans, D. J.; Martinez, J.; Korte-Stapff, M.

    2016-07-01

    We present the first results of a study aimed at finding new and efficient ways to automatically process spacecraft telemetry for automatic health monitoring. The goal is to reduce the load on the flight control team while extending the "checkability" to the entire telemetry database, and provide efficient, robust and more accurate detection of anomalies in near real time. We present a set of effective methods to (a) detect outliers in the telemetry or in its statistical properties, (b) uncover and visualise special properties of the telemetry and (c) detect new behavior. Our results are structured around two main families of solutions. For parameters visiting a restricted set of signal values, i.e. all status parameters and about one third of all the others, we focus on a transition analysis, exploiting properties of Poincare plots. For parameters with an arbitrarily high number of possible signal values, we describe the statistical properties of the signal via its Kernel Density Estimate. We demonstrate that this allows for a generic and dynamic approach of the soft-limit definition. Thanks to a much more accurate description of the signal and of its time evolution, we are more sensitive and more responsive to outliers than the traditional checks against hard limits. Our methods were validated on two years of Venus Express telemetry. They are generic for assisting in health monitoring of any complex system with large amounts of diagnostic sensor data. Not only spacecraft systems but also present-day astronomical observatories can benefit from them.
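
    A minimal sketch of the kernel-density-estimate idea for dynamic soft limits, using simulated telemetry; the function name, quantile choices and data are illustrative, not the mission tooling.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_soft_limits(history, quantiles=(0.001, 0.999), grid_size=2048):
    """Derive dynamic 'soft limits' for a telemetry parameter from the kernel
    density estimate of its recent history, instead of fixed hard limits."""
    history = np.asarray(history, dtype=float)
    grid = np.linspace(history.min(), history.max(), grid_size)
    pdf = gaussian_kde(history)(grid)
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]
    low = grid[np.searchsorted(cdf, quantiles[0])]
    high = grid[np.searchsorted(cdf, quantiles[1])]
    return low, high

# Simulated housekeeping telemetry; a new sample outside the limits is flagged.
rng = np.random.default_rng(0)
history = 20.0 + 0.5 * rng.standard_normal(5000)
low, high = kde_soft_limits(history)
new_sample = 23.4
print(low, high, not (low <= new_sample <= high))   # True -> outlier
```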

  13. The novel application of Benford's second order analysis for monitoring radiation output in interventional radiology.

    PubMed

    Cournane, S; Sheehy, N; Cooke, J

    2014-06-01

    Benford's law is an empirical observation which predicts the expected frequency of digits in naturally occurring datasets spanning multiple orders of magnitude, with the law having been most successfully applied as an audit tool in accountancy. This study investigated the sensitivity of the technique in identifying system output changes using simulated changes in interventional radiology Dose-Area-Product (DAP) data, with any deviations from Benford's distribution identified using z-statistics. The radiation output for interventional radiology X-ray equipment is monitored annually during quality control testing; however, for a considerable portion of the year an increased output of the system, potentially caused by engineering adjustments or spontaneous system faults, may go unnoticed, leading to a potential increase in the radiation dose to patients. In normal operation, recorded examination radiation outputs vary over multiple orders of magnitude, rendering the application of normal statistics ineffective for detecting systematic changes in the output. In this work, the annual DAP datasets complied with Benford's first order law for first, second and combinations of the first and second digits. Further, a continuous 'rolling' second order technique was devised for trending simulated changes over shorter timescales. This distribution analysis, the first employment of the method for radiation output trending, detected significant changes simulated on the original data, proving the technique useful in this case. The potential is demonstrated for implementation of this novel analysis for monitoring and identifying change in suitable datasets for the purpose of system process control. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
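
    The sketch below shows only the simplest first-order, first-digit form of the check (the paper's contribution is a rolling second-order analysis): observed leading-digit frequencies are compared with Benford's expected proportions via per-digit z-statistics. The simulated DAP-like values are illustrative.

```python
import numpy as np

def benford_first_digit_z(values):
    """Compare observed leading-digit frequencies against Benford's law and
    return one z-statistic per digit; large |z| values flag deviations."""
    values = np.asarray(values, dtype=float)
    values = values[values > 0]
    leading = (values / 10 ** np.floor(np.log10(values))).astype(int)
    n = leading.size
    digits = np.arange(1, 10)
    observed = np.array([(leading == d).mean() for d in digits])
    expected = np.log10(1 + 1 / digits)                 # Benford proportions
    se = np.sqrt(expected * (1 - expected) / n)
    return (observed - expected) / se

# Simulated dose data spanning several orders of magnitude.
rng = np.random.default_rng(1)
dap = np.exp(rng.normal(0.0, 2.5, 10000))
print(np.round(benford_first_digit_z(dap), 2))   # per-digit z-statistics
```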

  14. Method & apparatus for monitoring plasma processing operations

    DOEpatents

    Smith, Jr., Michael Lane; Ward, Pamela Denise; Stevenson, Joel O'Don

    2004-10-19

    The invention generally relates to various aspects of a plasma process and, more specifically, to the monitoring of such plasma processes. One aspect relates to a plasma monitoring module that may be adjusted in at least some manner so as to re-evaluate a previously monitored plasma process. For instance, optical emissions data on a plasma process that was previously monitored by the plasma monitoring module may be replayed through the plasma monitoring module after making at least one adjustment in relation to the plasma monitoring module.

  15. Monitor-based evaluation of pollutant load from urban stormwater runoff in Beijing.

    PubMed

    Liu, Y; Che, W; Li, J

    2005-01-01

    As a major pollutant source to urban receiving waters, the non-point source pollution from urban runoff needs to be well studied and effectively controlled. Based on monitoring data from urban runoff pollutant sources, this article describes a systematic estimation of total pollutant loads from the urban areas of Beijing. A numerical model was developed to quantify main pollutant loads of urban runoff in Beijing. A sub-procedure is involved in this method, in which the flush process influences both the quantity and quality of stormwater runoff. A statistics-based method was applied in computing the annual pollutant load as an output of the runoff. The proportions of pollutant from point-source and non-point sources were compared. This provides a scientific basis for proper environmental input assessment of urban stormwater pollution to receiving waters, improvement of infrastructure performance, implementation of urban stormwater management, and utilization of stormwater.

  16. Characteristics and verification of a car-borne survey system for dose rates in air: KURAMA-II.

    PubMed

    Tsuda, S; Yoshida, T; Tsutsumi, M; Saito, K

    2015-01-01

    The car-borne survey system KURAMA-II, developed by the Kyoto University Research Reactor Institute, has been used for air dose rate mapping after the Fukushima Dai-ichi Nuclear Power Plant accident. KURAMA-II consists of a CsI(Tl) scintillation detector, a GPS device, and a control device for data processing. The dose rates monitored by KURAMA-II are based on the G(E) function (spectrum-dose conversion operator), which can precisely calculate dose rates from measured pulse-height distribution even if the energy spectrum changes significantly. The characteristics of KURAMA-II have been investigated with particular consideration to the reliability of the calculated G(E) function, dose rate dependence, statistical fluctuation, angular dependence, and energy dependence. The results indicate that 100 units of KURAMA-II systems have acceptable quality for mass monitoring of dose rates in the environment. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. CMM Interim Check (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montano, Joshua Daniel

    2015-03-23

    Coordinate Measuring Machines (CMM) are widely used in industry, throughout the Nuclear Weapons Complex and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length. Unfortunately, several nonconformance reports have been generated to document the discovery of a certified machine found out of tolerance during a calibration closeout. In an effort to reduce risk to product quality, two solutions were proposed: shorten the calibration cycle, which could be costly, or perform an interim check to monitor the machine's performance between cycles. The CMM interim check discussed makes use of Renishaw's Machine Checking Gauge. This off-the-shelf product simulates a large sphere within a CMM's measurement volume and allows for error estimation. Data was gathered, analyzed, and simulated from seven machines in seventeen different configurations to create statistical process control run charts for on-the-floor monitoring.
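
    A minimal sketch of the run-chart idea, using an individuals (X) control chart with hypothetical interim-check errors; the numbers and the moving-range estimate of sigma are illustrative and are not the LANL analysis.

```python
import numpy as np

def individuals_control_limits(measurements):
    """Individuals (X) chart limits: centre line at the mean and +/- 3 sigma,
    with sigma estimated from the average moving range (MR-bar / 1.128)."""
    x = np.asarray(measurements, dtype=float)
    centre = x.mean()
    sigma = np.abs(np.diff(x)).mean() / 1.128       # d2 constant for n = 2
    return centre - 3 * sigma, centre, centre + 3 * sigma

# Hypothetical weekly interim-check errors (micrometres) from the gauge.
checks = [1.8, 2.1, 1.9, 2.4, 2.0, 1.7, 2.2]
lcl, centre, ucl = individuals_control_limits(checks)
out_of_control = [v for v in checks if not (lcl <= v <= ucl)]
print(lcl, centre, ucl, out_of_control)
```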

  18. Evaluation of a Change Detection Methodology by Means of Binary Thresholding Algorithms and Informational Fusion Processes

    PubMed Central

    Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier

    2012-01-01

    Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database with the occurred changes allows a better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a change detection multisource fusion process, which allows a single CD result to be generated from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. Then, the obtained results are evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution. PMID:22737023
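
    One of the binary thresholding algorithms that can be used in the change/no_change step is Otsu's method; the sketch below applies it to a synthetic change index, so the data and the resulting threshold are illustrative only.

```python
import numpy as np

def otsu_threshold(change_index, bins=256):
    """Otsu's method: choose the threshold that maximises the between-class
    variance of the change-index histogram (change vs. no_change)."""
    flat = np.asarray(change_index, dtype=float).ravel()
    hist, edges = np.histogram(flat, bins=bins)
    prob = hist / hist.sum()
    centres = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(prob)                                # class probabilities
    w1 = 1.0 - w0
    cum_mean = np.cumsum(prob * centres)
    mu0 = cum_mean / np.where(w0 > 0, w0, 1)
    mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 > 0, w1, 1)
    between = w0 * w1 * (mu0 - mu1) ** 2                # between-class variance
    return centres[np.argmax(between)]

# Synthetic change index: mostly unchanged pixels plus a brighter changed patch.
rng = np.random.default_rng(3)
index = np.concatenate([rng.normal(0.1, 0.05, 9000), rng.normal(0.6, 0.05, 1000)])
threshold = otsu_threshold(index)
change_mask = index > threshold
print(threshold, change_mask.mean())   # threshold between the two modes, ~10% change
```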

  19. Evaluation of a change detection methodology by means of binary thresholding algorithms and informational fusion processes.

    PubMed

    Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier

    2012-01-01

    Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database with the occurred changes allows a better monitoring of the Earth's resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a change detection multisource fusion process, which allows a single CD result to be generated from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. Then, the obtained results are evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution.

  20. Process Evaluation and Costing of a Multifaceted Population-Wide Intervention to Reduce Salt Consumption in Fiji.

    PubMed

    Webster, Jacqui; Pillay, Arti; Suku, Arleen; Gohil, Paayal; Santos, Joseph Alvin; Schultz, Jimaima; Wate, Jillian; Trieu, Kathy; Hope, Silvia; Snowdon, Wendy; Moodie, Marj; Jan, Stephen; Bell, Colin

    2018-01-30

    This paper reports the process evaluation and costing of a national salt reduction intervention in Fiji. The population-wide intervention included engaging food industry to reduce salt in foods, strategic health communication and a hospital program. The evaluation showed a 1.4 g/day drop in salt intake from the 11.7 g/day at baseline; however, this was not statistically significant. To better understand intervention implementation, we collated data to assess intervention fidelity, reach, context and costs. Government and management changes affected intervention implementation, meaning fidelity was relatively low. There was no active mechanism for ensuring food companies adhered to the voluntary salt reduction targets. Communication activities had wide reach but most activities were one-off, meaning the overall dose was low and impact on behavior limited. Intervention costs were moderate (FJD $277,410 or $0.31 per person) but the strategy relied on multi-sector action which was not fully operationalised. The cyclone also delayed monitoring and likely impacted the results. However, 73% of people surveyed had heard about the campaign and salt reduction policies have been mainstreamed into government programs. Longer-term monitoring of salt intake is planned through future surveys and lessons from this process evaluation will be used to inform future strategies in the Pacific Islands and globally.

  1. Process Evaluation and Costing of a Multifaceted Population-Wide Intervention to Reduce Salt Consumption in Fiji

    PubMed Central

    Webster, Jacqui; Pillay, Arti; Suku, Arleen; Gohil, Paayal; Santos, Joseph Alvin; Schultz, Jimaima; Wate, Jillian; Trieu, Kathy; Hope, Silvia; Snowdon, Wendy; Moodie, Marj; Jan, Stephen; Bell, Colin

    2018-01-01

    This paper reports the process evaluation and costing of a national salt reduction intervention in Fiji. The population-wide intervention included engaging food industry to reduce salt in foods, strategic health communication and a hospital program. The evaluation showed a 1.4 g/day drop in salt intake from the 11.7 g/day at baseline; however, this was not statistically significant. To better understand intervention implementation, we collated data to assess intervention fidelity, reach, context and costs. Government and management changes affected intervention implementation, meaning fidelity was relatively low. There was no active mechanism for ensuring food companies adhered to the voluntary salt reduction targets. Communication activities had wide reach but most activities were one-off, meaning the overall dose was low and impact on behavior limited. Intervention costs were moderate (FJD $277,410 or $0.31 per person) but the strategy relied on multi-sector action which was not fully operationalised. The cyclone also delayed monitoring and likely impacted the results. However, 73% of people surveyed had heard about the campaign and salt reduction policies have been mainstreamed into government programs. Longer-term monitoring of salt intake is planned through future surveys and lessons from this process evaluation will be used to inform future strategies in the Pacific Islands and globally. PMID:29385758

  2. Make your trappings count: The mathematics of pest insect monitoring. Comment on “Multiscale approach to pest insect monitoring: Random walks, pattern formation, synchronization, and networks” by Petrovskii et al.

    NASA Astrophysics Data System (ADS)

    Blasius, Bernd

    2014-09-01

    Since the beginnings of agriculture the production of crops has been characterized by an ongoing battle between farmers and pests [1]. Already during biblical times swarms of the desert locust, Schistocerca gregaria, were known as a major pest that can devour a field of corn within an hour. Even today, harmful organisms have the potential to threaten food production worldwide. It is estimated that about 37% of all potential crops are destroyed by pests. Harmful insects alone destroy 13%, causing financial losses in the agricultural industry of millions of dollars each year [2-4]. These numbers emphasize the importance of pest insect monitoring as a crucial step of integrated pest management [1]. The main approach to gain information about infestation levels is based on trapping, which leads to the question of how to extrapolate the sparse population counts at singularly disposed traps to a spatial representation of the pest species distribution. In their review, Petrovskii et al. provide a mathematical framework to tackle this problem [5]. Their analysis reveals that this seemingly inconspicuous problem gives rise to surprisingly deep mathematical challenges that touch several contemporary concepts of statistical physics and complex systems theory. The review does not aim for a collection of numerical recipes to support crop growers in the analysis of their trapping data. Instead, the review identifies the relevant biological and physical processes that are involved in pest insect monitoring and presents the mathematical techniques that are required to capture these processes.

  3. A motion-tolerant approach for monitoring SpO2 and heart rate using photoplethysmography signal with dual frame length processing and multi-classifier fusion.

    PubMed

    Fan, Feiyi; Yan, Yuepeng; Tang, Yongzhong; Zhang, Hao

    2017-12-01

    Monitoring pulse oxygen saturation (SpO2) and heart rate (HR) using photoplethysmography (PPG) signal contaminated by a motion artifact (MA) remains a difficult problem, especially when the oximeter is not equipped with a 3-axis accelerometer for adaptive noise cancellation. In this paper, we report a pioneering investigation on the impact of altering the frame length of Molgedey and Schuster independent component analysis (ICAMS) on performance, design a multi-classifier fusion strategy for selecting the PPG correlated signal component, and propose a novel approach to extract SpO2 and HR readings from PPG signal contaminated by strong MA interference. The algorithm comprises multiple stages, including dual frame length ICAMS, a multi-classifier-based PPG correlated component selector, line spectral analysis, tree-based HR monitoring, and post-processing. Our approach is evaluated by multi-subject tests. The root mean square error (RMSE) is calculated for each trial. Three statistical metrics are selected as performance evaluation criteria: mean RMSE, median RMSE and the standard deviation (SD) of RMSE. The experimental results demonstrate that a shorter ICAMS analysis window probably results in better performance in SpO2 estimation. Notably, the designed multi-classifier signal component selector achieved satisfactory performance. The subject tests indicate that our algorithm outperforms other baseline methods regarding accuracy under most criteria. The proposed work can contribute to improving the performance of current pulse oximetry and personal wearable monitoring devices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Quality-by-design III: application of near-infrared spectroscopy to monitor roller compaction in-process and product quality attributes of immediate release tablets.

    PubMed

    Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W

    2015-02-01

    The objective of this study is to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified from our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least square (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing binder source did not statistically influence the quality attributes of the granules and tablets. However, lubricant type significantly impacted granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models resulted in higher SEP for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors which could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics as a powerful tool to monitor critical quality attributes (CQA) identified during formulation development.
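
    The chemometric step can be sketched as follows: a PLS model relating spectra to a quality attribute, evaluated with the standard error of prediction (SEP). The random "spectra" are placeholders, and scikit-learn is used only for illustration; it is not the software used in the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Random placeholder "spectra": 40 calibration and 10 test samples, 200 wavelengths;
# the response stands in for a quality attribute such as crushing force.
rng = np.random.default_rng(42)
X_train = rng.normal(size=(40, 200))
y_train = X_train[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=40)
X_test = rng.normal(size=(10, 200))
y_test = X_test[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=10)

pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()

# Standard error of prediction (SEP): bias-corrected SD of the residuals.
bias = (y_pred - y_test).mean()
sep = np.sqrt(((y_pred - y_test - bias) ** 2).sum() / (len(y_test) - 1))
print(round(float(sep), 3))
```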

  5. Self-regulated learning processes of medical students during an academic learning task.

    PubMed

    Gandomkar, Roghayeh; Mirzazadeh, Azim; Jalili, Mohammad; Yazdani, Kamran; Fata, Ladan; Sandars, John

    2016-10-01

    This study was designed to identify the self-regulated learning (SRL) processes of medical students during a biomedical science learning task and to examine the associations of the SRL processes with previous performance in biomedical science examinations and subsequent performance on a learning task. A sample of 76 Year 1 medical students were recruited based on their performance in biomedical science examinations and stratified into previous high and low performers. Participants were asked to complete a biomedical science learning task. Participants' SRL processes were assessed before (self-efficacy, goal setting and strategic planning), during (metacognitive monitoring) and after (causal attributions and adaptive inferences) their completion of the task using an SRL microanalytic interview. Descriptive statistics were used to analyse the means and frequencies of SRL processes. Univariate and multiple logistic regression analyses were conducted to examine the associations of SRL processes with previous examination performance and the learning task performance. Most participants (from 88.2% to 43.4%) reported task-specific processes for SRL measures. Students who exhibited higher self-efficacy (odds ratio [OR] 1.44, 95% confidence interval [CI] 1.09-1.90) and reported task-specific processes for metacognitive monitoring (OR 6.61, 95% CI 1.68-25.93) and causal attributions (OR 6.75, 95% CI 2.05-22.25) measures were more likely to be high previous performers. Multiple analysis revealed that similar SRL measures were associated with previous performance. The use of task-specific processes for causal attributions (OR 23.00, 95% CI 4.57-115.76) and adaptive inferences (OR 27.00, 95% CI 3.39-214.95) measures were associated with being a high learning task performer. In multiple analysis, only the causal attributions measure was associated with high learning task performance. Self-efficacy, metacognitive monitoring and causal attributions measures were associated positively with previous performance. Causal attributions and adaptive inferences measures were associated positively with learning task performance. These findings may inform remediation interventions in the early years of medical school training. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
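
    As a generic illustration of how the reported odds ratios and confidence intervals are obtained, the sketch below fits a univariate logistic regression with statsmodels on synthetic data; the variable names and numbers are invented and do not reproduce the study's results.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic data: does a self-efficacy score predict being a high previous
# performer?  Report the odds ratio and its 95% confidence interval.
rng = np.random.default_rng(7)
self_efficacy = rng.normal(70.0, 10.0, size=76)
p_high = 1.0 / (1.0 + np.exp(-0.08 * (self_efficacy - 70.0)))
high_performer = rng.binomial(1, p_high)

X = sm.add_constant(self_efficacy)
fit = sm.Logit(high_performer, X).fit(disp=False)
odds_ratio = np.exp(fit.params[1])              # OR per 1-point score increase
ci_low, ci_high = np.exp(fit.conf_int()[1])     # 95% CI for the odds ratio
print(round(odds_ratio, 2), round(ci_low, 2), round(ci_high, 2))
```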

  6. Contribution au developpement d'une methode de controle des procedes dans une usine de bouletage [Contribution to the development of a process control method in a pelletizing plant]

    NASA Astrophysics Data System (ADS)

    Gosselin, Claude

    This thesis, a collaborative effort between Ecole de technologie superieure and ArcelorMittal Company, presents the development of a methodology for monitoring and quality control of multivariable industrial production processes. This innovation research mandate was developed at the ArcelorMittal Exploitation Miniere (AMEM) pellet plant in Port-Cartier (Quebec, Canada). With this undertaking, ArcelorMittal is striving to maintain its world class level of excellence and continues to pursue initiatives that can augment its competitive advantage worldwide. The plant's gravimetric classification process was retained as a prototype and development laboratory due to its effect on the company's competitiveness and its impact on subsequent steps leading to final production of iron oxide pellets. Concretely, the development of this expertise in process control and in situ monitoring will establish a firm basic knowledge in the fields of complex system physical modeling, data reconciliation, statistical observers, multivariable control, and quality control using real-time monitoring of the desirability function. The hydraulic classifier is mathematically modeled. Using planned disturbances on the production line, an identification procedure was established to provide empirical estimations of the model's structural parameters. A new sampling campaign and a previously unpublished data collection and consolidation policy were implemented plant-wide. Access to these invaluable data sources has enabled the establishment of new thresholds that govern the production process and its control. Finally, as a substitute for the traditional quality control process, we have implemented a new strategy based on the use of the desirability function. Our innovation is not in using this function as an indicator of overall (economic) satisfaction in the production process, but rather in proposing it as an "observer" of the system's state. The first implementation steps have already demonstrated the method's feasibility as well as numerous other industrial impacts on production processes within the company, namely the emergence of the economic aspect as a strategic variable that assures better governance of production processes where quality variables present strategic issues.
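
    A minimal sketch of a Derringer-style desirability index of the kind the thesis proposes to monitor; the response names, specification limits and weights below are hypothetical.

```python
import numpy as np

def desirability_larger_is_better(y, low, high, weight=1.0):
    """Derringer-style one-sided desirability: 0 below `low`, 1 above `high`,
    and a power-law ramp in between."""
    d = np.clip((float(y) - low) / (high - low), 0.0, 1.0)
    return d ** weight

def overall_desirability(individual):
    """Geometric mean of the individual desirabilities: the single index that
    would be tracked as an 'observer' of the process state."""
    d = np.asarray(individual, dtype=float)
    return float(d.prod() ** (1.0 / d.size))

# Hypothetical classifier responses: iron grade (%), recovery (%), throughput (t/h).
d_grade = desirability_larger_is_better(66.2, low=65.0, high=67.0)
d_recovery = desirability_larger_is_better(81.0, low=75.0, high=90.0)
d_throughput = desirability_larger_is_better(540.0, low=500.0, high=600.0)
print(round(overall_desirability([d_grade, d_recovery, d_throughput]), 3))
```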

  7. Forest Soil Disturbance Monitoring Protocol: Volume II: Supplementary methods, statistics, and data collection

    Treesearch

    Deborah S. Page-Dumroese; Ann M. Abbott; Thomas M. Rice

    2009-01-01

    Volume I and volume II of the Forest Soil Disturbance Monitoring Protocol (FSDMP) provide information for a wide range of users, including technicians, field crew leaders, private landowners, land managers, forest professionals, and researchers. Volume I: Rapid Assessment includes the basic methods for establishing forest soil monitoring transects and consistently...

  8. Combining accuracy assessment of land-cover maps with environmental monitoring programs

    Treesearch

    Stephen V. Stehman; Raymond L. Czaplewski; Sarah M. Nusser; Limin Yang; Zhiliang Zhu

    2000-01-01

    A scientifically valid accuracy assessment of a large-area, land-cover map is expensive. Environmental monitoring programs offer a potential source of data to partially defray the cost of accuracy assessment while still maintaining the statistical validity. In this article, three general strategies for combining accuracy assessment and environmental monitoring...

  9. Guidelines for collecting and maintaining archives for genetic monitoring

    Treesearch

    Jennifer A. Jackson; Linda Laikre; C. Scott Baker; Katherine C. Kendall; F. W. Allendorf; M. K. Schwartz

    2011-01-01

    Rapid advances in molecular genetic techniques and the statistical analysis of genetic data have revolutionized the way that populations of animals, plants and microorganisms can be monitored. Genetic monitoring is the practice of using molecular genetic markers to track changes in the abundance, diversity or distribution of populations, species or ecosystems over time...

  10. Operating Experience Review of the INL HTE Gas Monitoring System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. C. Cadwallader; K. G. DeWall

    2010-06-01

    This paper describes the operations of several types of gas monitors in use at the Idaho National Laboratory (INL) High Temperature Electrolysis Experiment (HTE) laboratory. The gases monitored are hydrogen, carbon monoxide, carbon dioxide, and oxygen. The operating time, calibration, and unwanted alarms are described, along with the durations of the calibration sessions. Some simple statistics are given for the reliability of these monitors and the results are compared to operating experiences of other types of monitors.

  11. Empirical evaluation of the conceptual model underpinning a regional aquatic long-term monitoring program using causal modelling

    USGS Publications Warehouse

    Irvine, Kathryn M.; Miller, Scott; Al-Chokhachy, Robert K.; Archer, Erik; Roper, Brett B.; Kershner, Jeffrey L.

    2015-01-01

    Conceptual models are an integral facet of long-term monitoring programs. Proposed linkages between drivers, stressors, and ecological indicators are identified within the conceptual model of most mandated programs. We empirically evaluate a conceptual model developed for a regional aquatic and riparian monitoring program using causal models (i.e., Bayesian path analysis). We assess whether data gathered for regional status and trend estimation can also provide insights on why a stream may deviate from reference conditions. We target the hypothesized causal pathways for how anthropogenic drivers of road density, percent grazing, and percent forest within a catchment affect instream biological condition. We found instream temperature and fine sediments in arid sites and only fine sediments in mesic sites accounted for a significant portion of the maximum possible variation explainable in biological condition among managed sites. However, the biological significance of the direct effects of anthropogenic drivers on instream temperature and fine sediments were minimal or not detected. Consequently, there was weak to no biological support for causal pathways related to anthropogenic drivers’ impact on biological condition. With weak biological and statistical effect sizes, ignoring environmental contextual variables and covariates that explain natural heterogeneity would have resulted in no evidence of human impacts on biological integrity in some instances. For programs targeting the effects of anthropogenic activities, it is imperative to identify both land use practices and mechanisms that have led to degraded conditions (i.e., moving beyond simple status and trend estimation). Our empirical evaluation of the conceptual model underpinning the long-term monitoring program provided an opportunity for learning and, consequently, we discuss survey design elements that require modification to achieve question driven monitoring, a necessary step in the practice of adaptive monitoring. We suspect our situation is not unique and many programs may suffer from the same inferential disconnect. Commonly, the survey design is optimized for robust estimates of regional status and trend detection and not necessarily to provide statistical inferences on the causal mechanisms outlined in the conceptual model, even though these relationships are typically used to justify and promote the long-term monitoring of a chosen ecological indicator. Our application demonstrates a process for empirical evaluation of conceptual models and exemplifies the need for such interim assessments in order for programs to evolve and persist.

  12. Landslide Life-Cycle Monitoring and Failure Prediction using Satellite Remote Sensing

    NASA Astrophysics Data System (ADS)

    Bouali, E. H. Y.; Oommen, T.; Escobar-Wolf, R. P.

    2017-12-01

    The consequences of slope instability are severe across the world: the US Geological Survey estimates that, each year, the United States spends $3.5B to repair damages caused by landslides, 25-50 deaths occur, real estate values in affected areas are reduced, productivity decreases, and natural environments are destroyed. A 2012 study by D.N. Petley found that loss of life is typically underestimated and that, between 2004 and 2010, 2,620 fatal landslides caused 32,322 deaths around the world. These statistics have motivated research into landslide monitoring and forecasting. More specifically, this presentation focuses on assessing the potential for using satellite-based optical and radar imagery for overall landslide life-cycle monitoring and prediction. Radar images from multiple satellites (ERS-1, ERS-2, ENVISAT, and COSMO-SkyMed) are processed using the Persistent Scatterer Interferometry (PSI) technique. Optical images, from the Worldview-2 satellite, are orthorectified and processed using the Co-registration of Optically Sensed Images and Correlation (COSI-Corr) algorithm. Both approaches process stacks of their respective images and yield ground displacement rate values. Ground displacement information is used to generate 'inverse-velocity vs. time' plots, a proxy used to estimate the time of landslide occurrence (slope failure), derived from the relationship between a material's time of failure and the applied strain rate quantified by T. Fukuzono in 1985 and B. Voight in 1988. Successful laboratory tests have demonstrated the usefulness of 'inverse-velocity vs. time' plots. This presentation will investigate the applicability of this approach with remote sensing on natural landslides in the western United States.
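    As a rough illustration of the inverse-velocity construction referenced above, the following Python sketch fits a straight line to 1/velocity versus time and extrapolates it to zero to estimate a failure time (Fukuzono's approach). The function name and the displacement values are hypothetical; in the actual workflow the velocities would come from PSI or COSI-Corr displacement stacks.

        import numpy as np

        def forecast_failure_time(t, velocity):
            """Estimate a slope-failure time by linear extrapolation of inverse
            velocity: failure is forecast where the fitted 1/v line reaches zero."""
            inv_v = 1.0 / np.asarray(velocity, dtype=float)
            slope, intercept = np.polyfit(np.asarray(t, dtype=float), inv_v, 1)
            if slope >= 0:
                return None          # not accelerating; no failure forecast
            return -intercept / slope

        # Hypothetical accelerating displacement record (illustrative values only)
        t = [0, 10, 20, 30, 40, 50]          # days
        v = [0.5, 0.7, 1.0, 1.6, 2.8, 5.5]   # mm/day
        print("forecast failure near day", forecast_failure_time(t, v))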

  13. Integrating observation and statistical forecasts over sub-Saharan Africa to support Famine Early Warning

    USGS Publications Warehouse

    Funk, Chris; Verdin, James P.; Husak, Gregory

    2007-01-01

    Famine early warning in Africa presents unique challenges and rewards. Hydrologic extremes must be tracked and anticipated over complex and changing climate regimes. The successful anticipation and interpretation of hydrologic shocks can initiate effective government response, saving lives and softening the impacts of droughts and floods. While both monitoring and forecast technologies continue to advance, discontinuities between monitoring and forecast systems inhibit effective decision making. Monitoring systems typically rely on high resolution satellite remote-sensed normalized difference vegetation index (NDVI) and rainfall imagery. Forecast systems provide information on a variety of scales and formats. Non-meteorologists are often unable or unwilling to connect the dots between these disparate sources of information. To mitigate these problems, researchers at UCSB's Climate Hazard Group, NASA GIMMS and USGS/EROS are implementing a NASA-funded integrated decision support system that combines the monitoring of precipitation and NDVI with statistical one-to-three month forecasts. We present the monitoring/forecast system, assess its accuracy, and demonstrate its application in food insecure sub-Saharan Africa.

  14. Selecting a Classification Ensemble and Detecting Process Drift in an Evolving Data Stream

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heredia-Langner, Alejandro; Rodriguez, Luke R.; Lin, Andy

    2015-09-30

    We characterize the commercial behavior of a group of companies in a common line of business using a small ensemble of classifiers on a stream of records containing commercial activity information. This approach is able to effectively find a subset of classifiers that can be used to predict company labels with reasonable accuracy. Performance of the ensemble, its error rate under stable conditions, can be characterized using an exponentially weighted moving average (EWMA) statistic. The behavior of the EWMA statistic can be used to monitor a record stream from the commercial network and determine when significant changes have occurred. Results indicate that larger classification ensembles may not necessarily be optimal, pointing to the need to search the combinatorial classifier space in a systematic way. Results also show that current and past performance of an ensemble can be used to detect when statistically significant changes in the activity of the network have occurred. The dataset used in this work contains tens of thousands of high level commercial activity records with continuous and categorical variables and hundreds of labels, making classification challenging.
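    The abstract's EWMA monitoring statistic can be sketched in a few lines. This is a generic EWMA control chart, not the authors' implementation; the baseline window length, smoothing weight, and limit width are illustrative assumptions.

        import numpy as np

        def ewma_chart(errors, lam=0.2, L=3.0, baseline=20):
            """EWMA control chart over a stream of per-batch error rates.
            The first `baseline` points are assumed to reflect stable conditions."""
            x = np.asarray(errors, dtype=float)
            mu0 = x[:baseline].mean()
            sigma0 = x[:baseline].std(ddof=1)
            z = np.empty_like(x)
            flags = np.zeros(x.size, dtype=bool)
            prev = mu0
            for i, xi in enumerate(x):
                prev = lam * xi + (1.0 - lam) * prev
                z[i] = prev
                # time-varying EWMA standard deviation
                s = sigma0 * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * (i + 1))))
                flags[i] = abs(prev - mu0) > L * s   # out-of-control signal
            return z, flags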

  15. Modeling of a Robust Confidence Band for the Power Curve of a Wind Turbine.

    PubMed

    Hernandez, Wilmar; Méndez, Alfredo; Maldonado-Correa, Jorge L; Balleteros, Francisco

    2016-12-07

    Having an accurate model of the power curve of a wind turbine allows us to better monitor its operation and plan storage capacity. Since wind speed and direction are of a highly stochastic nature, the forecasting of the power generated by the wind turbine is of the same nature as well. In this paper, a method for obtaining a robust confidence band containing the power curve of a wind turbine under test conditions is presented. Here, the confidence band is bounded by two curves which are estimated using parametric statistical inference techniques. However, the observations that are used for carrying out the statistical analysis are obtained by using the binning method, and in each bin the outliers are eliminated by using a censorship process based on robust statistical techniques. Then, the observations that are not outliers are divided into observation sets. Finally, both the power curve of the wind turbine and the two curves that define the robust confidence band are estimated using each of the previously mentioned observation sets.
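    A minimal sketch of the binning-plus-robust-censoring step described above, assuming a median/MAD rule for outlier removal and a simple normal-theory band per bin (the paper's exact censorship and inference procedures may differ).

        import numpy as np

        def robust_power_curve_band(wind_speed, power, bin_width=0.5, k=3.0):
            """Bin power observations by wind speed, censor outliers outside
            median +/- k*MAD, and return bin centres with a central curve and
            a parametric confidence band from the retained observations."""
            ws = np.asarray(wind_speed, dtype=float)
            p = np.asarray(power, dtype=float)
            edges = np.arange(ws.min(), ws.max() + bin_width, bin_width)
            centres, lower, central, upper = [], [], [], []
            for lo, hi in zip(edges[:-1], edges[1:]):
                sel = p[(ws >= lo) & (ws < hi)]
                if sel.size < 5:
                    continue
                med = np.median(sel)
                mad = 1.4826 * np.median(np.abs(sel - med))   # robust scale
                kept = sel[np.abs(sel - med) <= k * mad]      # censorship step
                if kept.size < 2:
                    continue
                half = 1.96 * kept.std(ddof=1) / np.sqrt(kept.size)
                centres.append((lo + hi) / 2.0)
                central.append(kept.mean())
                lower.append(kept.mean() - half)
                upper.append(kept.mean() + half)
            return (np.array(centres), np.array(lower),
                    np.array(central), np.array(upper))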

  16. Modeling of a Robust Confidence Band for the Power Curve of a Wind Turbine

    PubMed Central

    Hernandez, Wilmar; Méndez, Alfredo; Maldonado-Correa, Jorge L.; Balleteros, Francisco

    2016-01-01

    Having an accurate model of the power curve of a wind turbine allows us to better monitor its operation and plan storage capacity. Since wind speed and direction are of a highly stochastic nature, the forecasting of the power generated by the wind turbine is of the same nature as well. In this paper, a method for obtaining a robust confidence band containing the power curve of a wind turbine under test conditions is presented. Here, the confidence band is bounded by two curves which are estimated using parametric statistical inference techniques. However, the observations that are used for carrying out the statistical analysis are obtained by using the binning method, and in each bin the outliers are eliminated by using a censorship process based on robust statistical techniques. Then, the observations that are not outliers are divided into observation sets. Finally, both the power curve of the wind turbine and the two curves that define the robust confidence band are estimated using each of the previously mentioned observation sets. PMID:27941604

  17. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signal and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate the correlations and significant differences, a statistical analysis was carried out, showing correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037), and the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.

  18. Accelerometry-based classification of human activities using Markov modeling.

    PubMed

    Mannini, Andrea; Sabatini, Angelo Maria

    2011-01-01

    Accelerometers are a popular choice as body-motion sensors: the reason is partly in their capability of extracting information that is useful for automatically inferring the physical activity in which the human subject is involved, beside their role in feeding biomechanical parameter estimators. Automatic classification of human physical activities is highly attractive for pervasive computing systems, where contextual awareness may ease human-machine interaction, and in biomedicine, where wearable sensor systems are proposed for long-term monitoring. This paper is concerned with the machine learning algorithms needed to perform the classification task. Hidden Markov Model (HMM) classifiers are studied by contrasting them with Gaussian Mixture Model (GMM) classifiers. HMMs incorporate the statistical information available on movement dynamics into the classification process, without discarding the time history of previous outcomes as GMMs do. An example of the benefits of the obtained statistical leverage is illustrated and discussed by analyzing two datasets of accelerometer time series.
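    For context, the GMM baseline contrasted with HMMs above can be sketched as one Gaussian mixture per activity class, with each frame assigned to the class of highest likelihood and no use of the time history. This uses scikit-learn's GaussianMixture; the feature layout and component count are assumptions, not the authors' settings.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def train_gmm_activity_classifier(features, labels, n_components=2):
            """Fit one GMM per activity class and return a frame-wise predictor."""
            features = np.asarray(features, dtype=float)
            labels = np.asarray(labels)
            classes = np.unique(labels)
            models = {c: GaussianMixture(n_components=n_components, random_state=0)
                         .fit(features[labels == c]) for c in classes}

            def predict(frames):
                frames = np.asarray(frames, dtype=float)
                # log-likelihood of each frame under each class model
                ll = np.column_stack([models[c].score_samples(frames) for c in classes])
                return classes[ll.argmax(axis=1)]

            return predict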

  19. Use of ventilator associated pneumonia bundle and statistical process control chart to decrease VAP rate in Syria.

    PubMed

    Alsadat, Reem; Al-Bardan, Hussam; Mazloum, Mona N; Shamah, Asem A; Eltayeb, Mohamed F E; Marie, Ali; Dakkak, Abdulrahman; Naes, Ola; Esber, Faten; Betelmal, Ibrahim; Kherallah, Mazen

    2012-10-01

    Implementation of ventilator associated pneumonia (VAP) bundle as a performance improvement project in the critical care units for all mechanically ventilated patients aiming to decrease the VAP rates. VAP bundle was implemented in 4 teaching hospitals after educational sessions, and compliance rates along with VAP rates were monitored using statistical process control charts. VAP bundle compliance rates were steadily increasing from 33 to 80% in hospital 1, from 33 to 86% in hospital 2 and from 83 to 100% in hospital 3 during the study period. The VAP bundle was not applied in hospital 4, therefore no data were available. A target level of 95% was reached only in hospital 3. This correlated with a decrease in VAP rates from 30 to 6.4 per 1000 ventilator days in hospital 1 and from 12 to 4.9 per 1000 ventilator days in hospital 3, whereas the VAP rate failed to decrease in hospital 2 (despite better compliance) and remained high, around 33 per 1000 ventilator days, in hospital 4 where the VAP bundle was not implemented. The VAP bundle has performed differently in different hospitals in our study. Prevention of VAP requires a multidimensional strategy that includes strict infection control interventions, VAP bundle implementation, process and outcome surveillance and education.
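    The control charts mentioned above can be illustrated with a u-chart for rates per 1000 ventilator days; this is a generic formulation with hypothetical inputs, not the study's actual chart.

        import numpy as np

        def vap_u_chart(vap_counts, ventilator_days, per=1000.0):
            """u-chart for VAP rates: centre line and 3-sigma limits that widen
            or narrow with the exposure (ventilator days) in each period."""
            c = np.asarray(vap_counts, dtype=float)
            n = np.asarray(ventilator_days, dtype=float) / per
            u_bar = c.sum() / n.sum()                        # centre line
            rates = c / n
            ucl = u_bar + 3.0 * np.sqrt(u_bar / n)
            lcl = np.maximum(0.0, u_bar - 3.0 * np.sqrt(u_bar / n))
            return rates, u_bar, lcl, ucl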

  20. Statistical Paradigm for Organic Optoelectronic Devices: Normal Force Testing for Adhesion of Organic Photovoltaics and Organic Light-Emitting Diodes.

    PubMed

    Vasilak, Lindsay; Tanu Halim, Silvie M; Das Gupta, Hrishikesh; Yang, Juan; Kamperman, Marleen; Turak, Ayse

    2017-04-19

    In this study, we assess the utility of a normal force (pull-test) approach to measuring adhesion in organic solar cells and organic light-emitting diodes. This approach is a simple and practical method of monitoring the impact of systematic changes in materials, processing conditions, or environmental exposure on interfacial strength and electrode delamination. The ease of measurement enables a statistical description with numerous samples, variant geometry, and minimal preparation. After examining over 70 samples, using the Weibull modulus and the characteristic breaking strength as metrics, we were able to successfully differentiate the adhesion values between tris(8-hydroxyquinoline) aluminum (Alq3) and poly(3-hexyl-thiophene) and [6,6]-phenyl C61-butyric acid methyl ester (P3HT:PCBM) interfaces with Al and between two annealing times for the bulk heterojunction polymer blends. Additionally, the Weibull modulus, a relative measure of the range of flaw sizes at the fracture plane, can be correlated with the roughness of the organic surface. Finite element modeling of the delamination process suggests that the out-of-plane elastic modulus for Alq3 is lower than the reported in-plane elastic values. We suggest a statistical treatment of a large volume of tests be part of the standard protocol for investigating adhesion to accommodate the unavoidable variability in morphology and interfacial structure found in most organic devices.
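    The Weibull metrics used above can be estimated from a set of pull-test breaking strengths with median-rank regression; a minimal sketch follows (the function name and rank estimator are illustrative, and the authors may have used a different fitting procedure).

        import numpy as np

        def weibull_fit(breaking_strength):
            """Estimate the Weibull modulus m and characteristic strength sigma_0
            from ln(-ln(1 - F_i)) = m*ln(sigma_i) - m*ln(sigma_0)."""
            s = np.sort(np.asarray(breaking_strength, dtype=float))
            n = s.size
            F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks
            x = np.log(s)
            y = np.log(-np.log(1.0 - F))
            m, b = np.polyfit(x, y, 1)                    # slope = Weibull modulus
            sigma_0 = np.exp(-b / m)                      # 63.2% characteristic strength
            return m, sigma_0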

  1. Statistical parametric mapping of stimuli-evoked changes in quantitative blood flow using extended-focus optical coherence microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Marchand, Paul J.; Bouwens, Arno; Shamaei, Vincent; Nguyen, David; Extermann, Jerome; Bolmont, Tristan; Lasser, Theo

    2016-03-01

    Magnetic Resonance Imaging has revolutionised our understanding of brain function through its ability to image human cerebral structures non-invasively over the entire brain. By exploiting the different magnetic properties of oxygenated and deoxygenated blood, functional MRI can indirectly map areas undergoing neural activation. Alongside the development of fMRI, powerful statistical tools have been developed in an effort to shed light on the neural pathways involved in the processing of sensory and cognitive information. In spite of the major improvements made in fMRI technology, the obtained spatial resolution of hundreds of microns prevents MRI from resolving and monitoring processes occurring at the cellular level. In this regard, Optical Coherence Microscopy is an ideal instrument as it can image at high spatio-temporal resolution. Moreover, by measuring the mean and the width of the Doppler spectra of light scattered by moving particles, OCM allows extracting the axial and lateral velocity components of red blood cells. The ability to assess total blood velocity quantitatively, as opposed to classical axial-velocity Doppler OCM, is of paramount importance in brain imaging as a large proportion of the cortical vasculature is oriented perpendicularly to the optical axis. We combine here quantitative blood flow imaging with extended-focus Optical Coherence Microscopy and Statistical Parametric Mapping tools to generate maps of stimuli-evoked cortical hemodynamics at the capillary level.

  2. Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory

    NASA Astrophysics Data System (ADS)

    Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.

    2011-10-01

    The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 yr, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterize the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we will use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of few data on large eruptions, since these data require special methods of analysis. Hence, we will use a statistical method from extreme value theory. In particular, we will apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. This is done in three steps: first, we analyze the historical eruptive series to assess independence and homogeneity of the process. Second, we perform a Weibull analysis of the distribution of repose time between successive eruptions. Third, we analyze the non-homogeneous Poisson process with a generalized Pareto distribution as the intensity function.
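    As a simplified illustration of the repose-time step only (not the full non-homogeneous Poisson/generalized Pareto model the authors apply), one can fit a Weibull distribution to the repose intervals and read off the probability of at least one eruption within a horizon, assuming the renewal clock has just restarted. The interval values below are hypothetical.

        import numpy as np
        from scipy import stats

        def eruption_probability(repose_years, horizon_years):
            """Fit a Weibull distribution to repose times and return the probability
            of an eruption within the horizon (simplified renewal-process view)."""
            shape, loc, scale = stats.weibull_min.fit(repose_years, floc=0.0)
            return stats.weibull_min.cdf(horizon_years, shape, loc=loc, scale=scale)

        repose = [15, 32, 7, 55, 21, 40, 12, 66, 28, 90]   # hypothetical intervals (years)
        print(eruption_probability(repose, horizon_years=20))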

  3. Concepts for laser beam parameter monitoring during industrial mass production

    NASA Astrophysics Data System (ADS)

    Harrop, Nicholas J.; Maerten, Otto; Wolf, Stefan; Kramer, Reinhard

    2017-02-01

    In today's industrial mass production, lasers have become an established tool for a variety of processes. As with any other tool, mechanical or otherwise, the laser and its ancillary components are prone to wear and ageing. Monitoring of these ageing processes at the full operating power of an industrial laser is challenging for a range of reasons. The damage threshold of the measurement device itself and the cycle time constraints of industrial processing are just two of these challenges. Power measurement, focus spot size or full beam caustic measurements are being implemented in industrial laser systems. The scope of the measurement and the amount of data collected is limited by the above mentioned cycle time, which in some cases can only be a few seconds. For successful integration of these measurement systems into automated production lines, the devices must be equipped with standardized communication interfaces, enabling a feedback loop from the measurement device to the laser processing systems. If necessary, these measurements can be performed before each cycle. Power is determined with either static or dynamic calorimetry, while camera and scanning systems are used for beam profile analysis. Power levels can be measured from 25 W up to 20 kW, with focus spot sizes between 10 μm and several millimeters. We will show, backed by relevant statistical data, that defects or contamination of the laser beam path can be detected with the applied measurement systems, enabling a quality control chain to prevent process defects.

  4. Statistical Research of Investment Development of Russian Regions

    ERIC Educational Resources Information Center

    Burtseva, Tatiana A.; Aleshnikova, Vera I.; Dubovik, Mayya V.; Naidenkova, Ksenya V.; Kovalchuk, Nadezda B.; Repetskaya, Natalia V.; Kuzmina, Oksana G.; Surkov, Anton A.; Bershadskaya, Olga I.; Smirennikova, Anna V.

    2016-01-01

    This article is concerned with a substantiation of procedures ensuring the implementation of statistical research and monitoring of the investment development of the Russian regions, which would be pertinent for the modern development of state statistics. The aim of the study is to develop the methodological framework in order to estimate…

  5. Long-term strategy for the statistical design of a forest health monitoring system

    Treesearch

    Hans T. Schreuder; Raymond L. Czaplewski

    1993-01-01

    A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...

  6. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  7. Can hospital episode statistics support appraisal and revalidation? Randomised study of physician attitudes.

    PubMed

    Croft, Giles P; Williams, John G; Mann, Robin Y; Cohen, David; Phillips, Ceri J

    2007-08-01

    Hospital episode statistics were originally designed to monitor activity and allocate resources in the NHS. Recently their uses have widened to include analysis of individuals' activity, to inform appraisal and revalidation, and monitor performance. This study investigated physician attitudes to the validity and usefulness of these data for such purposes, and the effect of supporting individuals in data interpretation. A randomised study was conducted with consultant physicians in England, Wales and Scotland. The intervention group was supported by a clinician and an information analyst in obtaining and analysing their own data. The control group was unsupported. Attitudes to the data and confidence in their ability to reflect clinical practice were examined before and after the intervention. It was concluded that hospital episode statistics are not presently fit for monitoring the performance of individual physicians. A more comprehensive description of activity is required for these purposes. Improvements in the quality of existing data through clinical engagement at a local level, however, are possible.

  8. Evaluating the efficiency of environmental monitoring programs

    USGS Publications Warehouse

    Levine, Carrie R.; Yanai, Ruth D.; Lampman, Gregory G.; Burns, Douglas A.; Driscoll, Charles T.; Lawrence, Gregory B.; Lynch, Jason; Schoch, Nina

    2014-01-01

    Statistical uncertainty analyses can be used to improve the efficiency of environmental monitoring, allowing sampling designs to maximize information gained relative to resources required for data collection and analysis. In this paper, we illustrate four methods of data analysis appropriate to four types of environmental monitoring designs. To analyze a long-term record from a single site, we applied a general linear model to weekly stream chemistry data at Biscuit Brook, NY, to simulate the effects of reducing sampling effort and to evaluate statistical confidence in the detection of change over time. To illustrate a detectable difference analysis, we analyzed a one-time survey of mercury concentrations in loon tissues in lakes in the Adirondack Park, NY, demonstrating the effects of sampling intensity on statistical power and the selection of a resampling interval. To illustrate a bootstrapping method, we analyzed the plot-level sampling intensity of forest inventory at the Hubbard Brook Experimental Forest, NH, to quantify the sampling regime needed to achieve a desired confidence interval. Finally, to analyze time-series data from multiple sites, we assessed the number of lakes and the number of samples per year needed to monitor change over time in Adirondack lake chemistry using a repeated-measures mixed-effects model. Evaluations of time series and synoptic long-term monitoring data can help determine whether sampling should be re-allocated in space or time to optimize the use of financial and human resources.
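    A minimal sketch of the kind of simulation-based power analysis described above, using an ordinary linear-regression trend test as a stand-in for the paper's general linear model; the effect size, noise level, and sampling effort are user-supplied assumptions.

        import numpy as np
        from scipy import stats

        def trend_detection_power(n_years, samples_per_year, slope, sigma,
                                  alpha=0.05, n_sim=2000, seed=0):
            """Monte Carlo power: simulate a linear trend plus Gaussian noise at the
            proposed sampling effort and count how often the slope is detected."""
            rng = np.random.default_rng(seed)
            t = np.repeat(np.arange(n_years, dtype=float), samples_per_year)
            X = np.column_stack([np.ones_like(t), t])
            sxx = np.sum((t - t.mean()) ** 2)
            dof = t.size - 2
            crit = stats.t.ppf(1.0 - alpha / 2.0, dof)
            detected = 0
            for _ in range(n_sim):
                y = slope * t + rng.normal(0.0, sigma, size=t.size)
                beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
                se = np.sqrt((res[0] / dof) / sxx)     # standard error of the slope
                detected += abs(beta[1] / se) > crit
            return detected / n_sim

        # e.g. power to detect a -0.02 units/year trend with monthly sampling over 10 years
        # print(trend_detection_power(10, 12, slope=-0.02, sigma=0.5))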

  9. Towards a Solid Foundation of Using Remotely Sensed Solar-Induced Chlorophyll Fluorescence for Crop Monitoring and Yield Forecast

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Sun, Y.; You, L.; Liu, Y.

    2017-12-01

    The growing demand for food production due to population increase, coupled with high vulnerability to volatile environmental changes, poses a paramount challenge for mankind in the coming century. Real-time crop monitoring and yield forecasting must be a key part of any solution to this challenge, as these activities provide vital information needed for effective and efficient crop management and for decision making. However, traditional methods of crop growth monitoring (e.g., remotely sensed vegetation indices) do not directly relate to the most important function of plants, photosynthesis, and therefore to crop yield. The recent advance in the satellite remote sensing of Solar-Induced chlorophyll Fluorescence (SIF), an integrative photosynthetic signal of molecular origin and a direct measure of plant function, holds great promise for real-time monitoring of crop growth conditions and forecasting yields. In this study, we use satellite measurements of SIF from both the Global Ozone Monitoring Experiment-2 (GOME-2) onboard MetOp-A and the Orbiting Carbon Observatory-2 (OCO-2) satellites to estimate crop yield using both process-based and statistical models. We find that SIF-based crop yield correlates well with the global yield product Spatial Production Allocation Model (SPAM) derived from ground surveys for all major crops including maize, soybean, wheat, sorghum, and rice. The potential and challenges of using upcoming SIF satellite missions for crop monitoring and prediction will also be discussed.

  10. Big-data reflection high energy electron diffraction analysis for understanding epitaxial film growth processes.

    PubMed

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V

    2014-10-28

    Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the data set are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of La(x)Ca(1-x)MnO(3) films grown on etched (001) SrTiO(3) substrates, but is universal. The multivariate methods including principal component analysis and k-means clustering provide insight into the relevant behaviors, the timing and nature of a disordered to ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to use forward prediction methods to potentially allow significantly more control over growth process and hence final film quality.
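    The PCA and k-means steps mentioned above can be sketched for a generic RHEED frame stack with scikit-learn; the array shapes, component count, and cluster count below are illustrative assumptions rather than the authors' settings.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        def analyze_rheed_stack(frames, n_components=5, n_clusters=3):
            """Flatten each RHEED frame to a vector, extract dominant modes with PCA,
            and group frames into growth regimes with k-means.
            frames : array of shape (n_frames, height, width)"""
            X = frames.reshape(frames.shape[0], -1).astype(float)
            X -= X.mean(axis=0)                        # remove the static background
            pca = PCA(n_components=n_components)
            scores = pca.fit_transform(X)              # per-frame loadings of each mode
            labels = KMeans(n_clusters=n_clusters, n_init=10,
                            random_state=0).fit_predict(scores)
            return scores, pca.components_, labels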

  11. Phase space representation of neutron monitor count rate and atmospheric electric field in relation to solar activity in cycles 21 and 22.

    PubMed

    Silva, H G; Lopes, I

    Heliospheric modulation of galactic cosmic rays links solar cycle activity with neutron monitor count rate on earth. A less direct relation holds between neutron monitor count rate and atmospheric electric field because different atmospheric processes, including fluctuations in the ionosphere, are involved. Although a full quantitative model is still lacking, this link is supported by solid statistical evidence. Thus, a connection between the solar cycle activity and atmospheric electric field is expected. To gain a deeper insight into these relations, sunspot area (NOAA, USA), neutron monitor count rate (Climax, Colorado, USA), and atmospheric electric field (Lisbon, Portugal) are presented here in a phase space representation. The period considered covers two solar cycles (21, 22) and extends from 1978 to 1990. Two solar maxima were observed in this dataset, one in 1979 and another in 1989, as well as one solar minimum in 1986. Two main observations of the present study were: (1) similar short-term topological features of the phase space representations of the three variables, (2) a long-term phase space radius synchronization between the solar cycle activity, neutron monitor count rate, and potential gradient (confirmed by absolute correlation values above ~0.8). Finally, the methodology proposed here can be used for obtaining the relations between other atmospheric parameters (e.g., solar radiation) and solar cycle activity.

  12. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of the monitoring of sediment processes is unquestionable: sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to an eventual pollution event and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly being researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to the outdated methodology and poor database background in the specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in the expert community and has long been discussed at different expert forums that the sediment balance of the river Danube has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station density and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. Frequency of suspended sediment sampling is very low along the river, best organized in the upstream countries, where also on tributaries like the Drau/Drava monitoring stations are in operation. Sampling frequency of suspended load is 3 to 7 per year in Hungary, and even lower downstream. Sediment management is a major challenge, as most methods developed until now are unsustainable, require continuous intervention and are expensive as well. However, there is a new focus on the subject in the 21st century, which still lacks uniform methodological recommendations for measurements and analyses, and the number of engineers with sediment expertise and experience is alarmingly low. Data related to sediment quantity are unreliable and often contradictory. It is difficult to produce high quality long-term databases that could support and enable the mathematical calibration of sediment transport models. Sediment measurements are different in different countries in Europe. Even in Hungary, sampling and laboratory techniques have changed several times in the past. Also, sediment sampling was never really systematic, and the sampling campaigns did not follow the hydrological processes. As a result, sediment data can hardly be compared; the data series are inhomogeneous and cannot be statistically analysed. The majority of the existing sediment data in Hungary are not suitable for the data supply needs of state-of-the-art numerical modeling. It is even problematic to describe the connections between water flow (discharge) and sediment transport, because data are scarce and irregular.
Even the most modern measurement methods (Acoustic Doppler Current Profiler [ADCP], or Laser In Situ Scattering and Transmissometry [LISST]) need calibration, which means field sampling and laboratory processing. For these reasons, we need (both quantitatively and qualitatively) appropriate sampling of sediment. In the frame of projects and programs of the Institute for Hydraulic engineering and Water management of Eötvös József College, we developed the methodology of field-data collection campaigns in relation to sediment data in order to meet the calibration and verification needs of state-of-the-art numerical modeling, and to be able to collect comparable data series for statistical analyses.

  13. On the efficiency of driver state monitoring systems

    NASA Astrophysics Data System (ADS)

    Dementienko, V. V.; Dorokhov, V. B.; Gerus, S. V.; Markov, A. G.; Shakhnarovich, V. M.

    2007-06-01

    Statistical data on road traffic and the results of laboratory studies are used to construct a mathematical model of a driver-driver state monitor-automobile-traffic system. In terms of the model, the probability of an accident resulting from the drowsy state of the driver is determined both in the absence and presence of a monitor. The model takes into account the efficiency and safety level provided by different monitoring systems, as well as psychological factors associated with the excessive reliance of drivers upon monitoring.

  14. Single-molecule detection of dihydroazulene photo-thermal reaction using break junction technique

    NASA Astrophysics Data System (ADS)

    Huang, Cancan; Jevric, Martyn; Borges, Anders; Olsen, Stine T.; Hamill, Joseph M.; Zheng, Jue-Ting; Yang, Yang; Rudnev, Alexander; Baghernejad, Masoud; Broekmann, Peter; Petersen, Anne Ugleholdt; Wandlowski, Thomas; Mikkelsen, Kurt V.; Solomon, Gemma C.; Brøndsted Nielsen, Mogens; Hong, Wenjing

    2017-05-01

    Charge transport by tunnelling is one of the most ubiquitous elementary processes in nature. Small structural changes in a molecular junction can lead to significant differences in the single-molecule electronic properties, offering a tremendous opportunity to examine a reaction on the single-molecule scale by monitoring the conductance changes. Here, we explore the potential of the single-molecule break junction technique in the detection of photo-thermal reaction processes of a photochromic dihydroazulene/vinylheptafulvene system. Statistical analysis of the break junction experiments provides a quantitative approach for probing the reaction kinetics and reversibility, including the occurrence of isomerization during the reaction. The product ratios observed when switching the system in the junction do not follow those observed in solution studies (both experiment and theory), suggesting that the junction environment perturbs the process significantly. This study opens the possibility of using nano-structured environments like molecular junctions to tailor product ratios in chemical reactions.

  15. [Development and clinical evaluation of an anesthesia information management system].

    PubMed

    Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei

    2010-09-21

    To study the design, implementation and clinical evaluation of an anesthesia information management system. To record, process and store peri-operative patient data automatically, all kinds of bedside monitoring equipment are connected into the system based on information integration technology; after a statistical analysis of the patient data by data mining technology, patient status can be evaluated automatically based on a risk prediction standard and a decision support system, so that the anesthetist can carry out reasonable and safe clinical procedures; with clinical processes recorded electronically, standard record tables can be generated and the clinical workflow is optimized as well. With the system, various patient data can be collected, stored, analyzed and archived, various anesthesia documents can be generated, and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.

  16. PROBABILITY SAMPLING AND POPULATION INFERENCE IN MONITORING PROGRAMS

    EPA Science Inventory

    A fundamental difference between probability sampling and conventional statistics is that "sampling" deals with real, tangible populations, whereas "conventional statistics" usually deals with hypothetical populations that have no real-world realization. The focus here is on real ...

  17. Using Unmanned Aerial Vehicle (UAV) for spatio-temporal monitoring of soil erosion and roughness in Chania, Crete, Greece

    NASA Astrophysics Data System (ADS)

    Alexakis, Dimitrios; Seiradakis, Kostas; Tsanis, Ioannis

    2016-04-01

    This article presents a remote sensing approach for spatio-temporal monitoring of both soil erosion and roughness using an Unmanned Aerial Vehicle (UAV). Soil erosion by water is commonly known as one of the main reasons for land degradation. Gully erosion causes considerable soil loss and soil degradation. Furthermore, quantification of soil roughness (irregularities of the soil surface due to soil texture) is important and affects surface storage and infiltration. Soil roughness is one of the characteristics most susceptible to variation in time and space and depends on different parameters such as cultivation practices and soil aggregation. A UAV equipped with a digital camera was employed to monitor soil in terms of erosion and roughness in two different study areas in Chania, Crete, Greece. The UAV followed predicted flight paths computed by the relevant flight planning software. The photogrammetric image processing enabled the development of sophisticated Digital Terrain Models (DTMs) and ortho-image mosaics with very high resolution on a sub-decimeter level. The DTMs were developed using photogrammetric processing of more than 500 images acquired with the UAV from different heights above the ground level. As the geomorphic formations can be observed from above using UAVs, shadowing effects do not generally occur and the generated point clouds have very homogeneous and high point densities. The DTMs generated from the UAV were compared in terms of vertical absolute accuracies with a Global Navigation Satellite System (GNSS) survey. The developed data products were used for quantifying gully erosion and soil roughness in 3D as well as for the analysis of the surrounding areas. The significant elevation changes from multi-temporal UAV elevation data were used for estimating soil loss and sediment delivery diachronically, without installing sediment traps. Concerning roughness, statistical indicators of surface elevation point measurements were estimated, and various parameters such as the standard deviation of the DTM, the deviation of residuals and the standard deviation of prominence were calculated directly from the extracted DTM. Sophisticated statistical filters and elevation indices were developed to quantify both soil erosion and roughness. The applied methodology for monitoring both soil erosion and roughness provides an optimum way of reducing the existing gap between field scale and satellite scale. Keywords: UAV, soil, erosion, roughness, DTM
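    The roughness indicators mentioned above (standard deviation of the DTM and of residuals) can be computed from an elevation grid as in the sketch below; the plane-detrending step and metric names are illustrative assumptions, not the authors' exact filters.

        import numpy as np

        def roughness_metrics(dtm, cell_size=1.0):
            """Simple roughness indicators from a DTM patch: elevation standard
            deviation and the standard deviation of residuals after removing a
            best-fit plane (detrending)."""
            z = np.asarray(dtm, dtype=float)
            rows, cols = np.indices(z.shape)
            X = np.column_stack([np.ones(z.size),
                                 rows.ravel() * cell_size,
                                 cols.ravel() * cell_size])
            coeffs, *_ = np.linalg.lstsq(X, z.ravel(), rcond=None)
            residuals = z.ravel() - X @ coeffs
            return {"std_elevation": z.std(ddof=1),
                    "std_residual": residuals.std(ddof=1),
                    "random_roughness": np.sqrt(np.mean(residuals ** 2))}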

  18. Fundamentals and Catalytic Innovation: The Statistical and Data Management Center of the Antibacterial Resistance Leadership Group.

    PubMed

    Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T; Pereira, Carol; Rosenkranz, Susan L; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu Jeanne; Wang, Rui; Lok, Judith; Evans, Scott R

    2017-03-15

    The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  19. Make measurable what is not so: national monitoring of the status of persons with intellectual disability.

    PubMed

    Fujiura, Glenn T; Rutkowski-Kmitta, Violet; Owen, Randall

    2010-12-01

    Statistics are critical in holding governments accountable for the well-being of citizens with disability. International initiatives are underway to improve the quality of disability statistics, but meaningful ID data is exceptionally rare. The status of ID data was evaluated in a review of 12 national statistical systems. Recurring data collection by national ministries was identified and the availability of measures of poverty, exclusion, and disadvantage was assessed. A total of 131 recurring systems coordinated by 50 different ministries were identified. The majority included general disability but less than 25% of the systems screened ID. Of these, few provided policy-relevant data. The scope of ID data was dismal at best, though a significant statistical infrastructure exists for the integration of ID data. Advocacy will be necessary. There is no optimal form of data monitoring, and decisions regarding priorities in purpose, targeted audiences, and the goals for surveillance must be resolved.

  20. Comments of statistical issue in numerical modeling for underground nuclear test monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W.L.; Anderson, K.K.

    1993-03-01

    The Symposium concluded with prepared summaries by four experts in the involved disciplines. These experts made no mention of statistics and/or the statistical content of issues. The first author contributed an extemporaneous statement at the Symposium because there are important issues associated with conducting and evaluating numerical modeling that are familiar to statisticians and often treated successfully by them. This note expands upon these extemporaneous remarks. Statistical ideas may be helpful in resolving some numerical modeling issues. Specifically, we comment first on the role of statistical design/analysis in the quantification process to answer the question "what do we know about the numerical modeling of underground nuclear tests?" and second on the peculiar nature of uncertainty analysis for situations involving numerical modeling. The simulations described in the workshop, though associated with topic areas, were basically sets of examples. Each simulation was tuned towards agreeing with either empirical evidence or an expert's opinion of what empirical evidence would be. While the discussions were reasonable, whether the embellishments were correct or a forced fitting of reality is unclear and illustrates that "simulation is easy." We also suggest that these examples of simulation are typical and the questions concerning the legitimacy and the role of knowing the reality are fair, in general, with respect to simulation. The answers will help us understand why "prediction is difficult."

  1. Regional Environmental Monitoring and Assessment Program Data (REMAP)

    EPA Pesticide Factsheets

    The Regional Environmental Monitoring and Assessment Program (REMAP) was initiated to test the applicability of the Environmental Monitoring and Assessment Program (EMAP) approach to answer questions about ecological conditions at regional and local scales. Using EMAP's statistical design and indicator concepts, REMAP conducts projects at smaller geographic scales and in shorter time frames than the national EMAP program.

  2. Coherent spectroscopic methods for monitoring pathogens, genetically modified products and nanostructured materials in colloidal solution

    NASA Astrophysics Data System (ADS)

    Moguilnaya, T.; Suminov, Y.; Botikov, A.; Ignatov, S.; Kononenko, A.; Agibalov, A.

    2017-01-01

    We developed a new automatic method that combines forced luminescence and stimulated Brillouin scattering. This method is used for monitoring pathogens, genetically modified products and nanostructured materials in colloidal solution. We carried out the statistical spectral analysis of pathogens, genetically modified soy and silver nano-particles in water from different regions in order to determine the statistical errors of the method. We studied the spectral characteristics of these objects in water to perform the initial identification with 95% probability. These results were used to create a model of a device for monitoring pathogenic organisms and a working model of a device to detect genetically modified soy in meat.

  3. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2002

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Moran, Seth C.; Sánchez, John; Estes, Steve; McNutt, Stephen R.; Paskievitch, John

    2003-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988 (Power and others, 1993; Jolly and others, 1996; Jolly and others, 2001; Dixon and others, 2002). The primary objectives of this program are the seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the basic seismic data and changes in the seismic monitoring program for the period January 1, 2002 through December 31, 2002. Appendix G contains a list of publications pertaining to seismicity of Alaskan volcanoes based on these and previously recorded data. The AVO seismic network was used to monitor twenty-four volcanoes in real time in 2002. These include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, Katmai Volcanic Group (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Great Sitkin Volcano, and Kanaga Volcano (Figure 1). Monitoring highlights in 2002 include an earthquake swarm at Great Sitkin Volcano in May-June; an earthquake swarm near Snowy Mountain in July-September; low frequency (1-3 Hz) tremor and long-period events at Mount Veniaminof in September-October and in December; and continuing volcanogenic seismic swarms at Shishaldin Volcano throughout the year. Instrumentation and data acquisition highlights in 2002 were the installation of a subnetwork on Okmok Volcano, the establishment of telemetry for the Mount Veniaminof subnetwork, and the change in the data acquisition system to an EARTHWORM detection system. AVO located 7430 earthquakes during 2002 in the vicinity of the monitored volcanoes. This catalog includes: (1) a description of instruments deployed in the field and their locations; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of velocity models used for earthquake locations; (4) a summary of earthquakes located in 2002; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2002.

  4. Power analysis and trend detection for water quality monitoring data. An application for the Greater Yellowstone Inventory and Monitoring Network

    USGS Publications Warehouse

    Irvine, Kathryn M.; Manlove, Kezia; Hollimon, Cynthia

    2012-01-01

    An important consideration for long term monitoring programs is determining the required sampling effort to detect trends in specific ecological indicators of interest. To enhance the Greater Yellowstone Inventory and Monitoring Network’s water resources protocol(s) (O’Ney 2006 and O’Ney et al. 2009 [under review]), we developed a set of tools to: (1) determine the statistical power for detecting trends of varying magnitude in a specified water quality parameter over different lengths of sampling (years) and different within-year collection frequencies (monthly or seasonal sampling) at particular locations using historical data, and (2) perform periodic trend analyses for water quality parameters while addressing seasonality and flow weighting. A power analysis for trend detection is a statistical procedure used to estimate the probability of rejecting the hypothesis of no trend when in fact there is a trend, within a specific modeling framework. In this report, we base our power estimates on using the seasonal Kendall test (Helsel and Hirsch 2002) for detecting trend in water quality parameters measured at fixed locations over multiple years. We also present procedures (R-scripts) for conducting a periodic trend analysis using the seasonal Kendall test with and without flow adjustment. This report provides the R-scripts developed for power and trend analysis, tutorials, and the associated tables and graphs. The purpose of this report is to provide practical information for monitoring network staff on how to use these statistical tools for water quality monitoring data sets.
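    A bare-bones version of the seasonal Kendall test used in the report (without flow adjustment, tie corrections, or serial-correlation adjustments) can be written as follows; it is a sketch for orientation only, not a substitute for the report's R-scripts.

        import numpy as np
        from scipy import stats

        def seasonal_kendall(values, seasons):
            """Sum Mann-Kendall S statistics and variances over seasons and return
            the normal-approximation Z and two-sided p-value (ties ignored)."""
            values = np.asarray(values, dtype=float)
            seasons = np.asarray(seasons)
            S_total, var_total = 0.0, 0.0
            for s in np.unique(seasons):
                x = values[seasons == s]
                n = x.size
                if n < 2:
                    continue
                S = sum(np.sign(x[j] - x[i])
                        for i in range(n - 1) for j in range(i + 1, n))
                S_total += S
                var_total += n * (n - 1) * (2 * n + 5) / 18.0
            if var_total == 0:
                return 0.0, 1.0
            Z = (S_total - np.sign(S_total)) / np.sqrt(var_total)  # continuity correction
            p = 2.0 * (1.0 - stats.norm.cdf(abs(Z)))
            return Z, p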

  5. A review of selected inorganic surface water quality-monitoring practices: are we really measuring what we think, and if so, are we doing it right?

    USGS Publications Warehouse

    Horowitz, Arthur J.

    2013-01-01

    Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and some of the underlying assumptions that form the bases for these programs seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions relative to current programs, such as calendar-based sampling and stationarity are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.

  6. Detection of long duration cloud contamination in hyper-temporal NDVI imagery

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Skidmore, A. K.; Scarrott, R. G.

    2012-04-01

    NDVI time series imagery is commonly used as a reliable source for land use and land cover mapping and monitoring. However, long duration cloud cover can significantly influence its precision in areas where persistent clouds prevail. Therefore, quantifying errors related to cloud contamination is essential for accurate land cover mapping and monitoring. This study aims to detect long duration cloud contamination in hyper-temporal NDVI imagery based land cover mapping and monitoring. MODIS-Terra NDVI imagery (250 m; 16-day; Feb'03-Dec'09) was used after necessary pre-processing using quality flags and an upper envelope filter (ASAVOGOL). Subsequently, the stacked MODIS-Terra NDVI image (161 layers) was classified into 10 to 100 clusters using ISODATA. After classification, the 97-cluster image was selected as the best classification with the help of divergence statistics. To detect long duration cloud contamination, the mean NDVI class profiles of the 97-cluster image were analyzed for temporal artifacts. Results showed that long duration clouds affect the normal temporal progression of NDVI and cause anomalies. Out of the total of 97 clusters, 32 clusters were found to be cloud contaminated. Cloud contamination was found to be more prominent in areas with high rainfall. This study can help to stop error propagation in regional land cover mapping and monitoring caused by long duration cloud contamination.

  7. Informatics in radiology: Efficiency metrics for imaging device productivity.

    PubMed

    Hu, Mengqi; Pavlicek, William; Liu, Patrick T; Zhang, Muhong; Langer, Steve G; Wang, Shanshan; Place, Vicki; Miranda, Rafael; Wu, Teresa Tong

    2011-01-01

    Acute awareness of the costs associated with medical imaging equipment is an ever-present aspect of the current healthcare debate. However, the monitoring of productivity associated with expensive imaging devices is likely to be labor intensive, relies on summary statistics, and lacks accepted and standardized benchmarks of efficiency. In the context of the general Six Sigma DMAIC (design, measure, analyze, improve, and control) process, a World Wide Web-based productivity tool called the Imaging Exam Time Monitor was developed to accurately and remotely monitor imaging efficiency with use of Digital Imaging and Communications in Medicine (DICOM) combined with a picture archiving and communication system. Five device efficiency metrics (examination duration, table utilization, interpatient time, appointment interval time, and interseries time) were derived from DICOM values. These metrics allow the standardized measurement of productivity, to facilitate the comparative evaluation of imaging equipment use and ongoing efforts to improve efficiency. A relational database was constructed to store patient imaging data, along with device- and examination-related data. The database provides full access to ad hoc queries and can automatically generate detailed reports for administrative and business use, thereby allowing staff to monitor data for trends and to better identify possible changes that could lead to improved productivity and reduced costs in association with imaging services. © RSNA, 2011.
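    Two of the metrics above (examination duration and interpatient time) can be derived from examination start/end timestamps as sketched below; the column names and table layout are hypothetical, and the actual tool extracts these times from DICOM headers.

        import pandas as pd

        def device_efficiency_metrics(exams: pd.DataFrame) -> pd.DataFrame:
            """Mean exam duration and interpatient gap per device from a table with
            columns 'device', 'start', 'end' (datetimes)."""
            exams = exams.sort_values(["device", "start"]).copy()
            exams["exam_duration_min"] = (
                exams["end"] - exams["start"]).dt.total_seconds() / 60.0
            # gap between the end of one exam and the start of the next on the same device
            exams["interpatient_min"] = (
                exams.groupby("device")["start"].shift(-1) - exams["end"]
            ).dt.total_seconds() / 60.0
            return exams.groupby("device")[["exam_duration_min",
                                            "interpatient_min"]].mean()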

  8. Fast neutron-gamma discrimination on neutron emission profile measurement on JT-60U.

    PubMed

    Ishii, K; Shinohara, K; Ishikawa, M; Baba, M; Isobe, M; Okamoto, A; Kitajima, S; Sasao, M

    2010-10-01

    A digital signal processing (DSP) system is applied to stilbene scintillation detectors of the multichannel neutron emission profile monitor in JT-60U. Automatic analysis of neutron-γ pulse shape discrimination is key to reducing the processing time in the DSP system, and it has been implemented using a two-dimensional (2D) map. A linear discriminant function is used to determine the dividing line between neutron events and γ-ray events on the 2D map. In order to verify the validity of the dividing-line determination, the pulse shape discrimination quality is evaluated. As a result, the γ-ray contamination in most of the beam heating phase was negligible compared with the statistical error at 10 ms time resolution.
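
    A minimal sketch of the dividing-line idea, assuming synthetic pulse-shape features (total pulse area versus slow/tail component) rather than the JT-60U data; a fitted linear discriminant plays the role of the dividing line between neutron and γ-ray events.

    ```python
    # Minimal sketch of the idea: separate neutron and gamma events on a 2D pulse-shape
    # map (e.g., total vs. slow-component charge) with a linear discriminant. Synthetic
    # data and feature names are assumptions; the JT-60U analysis details differ.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    # columns: [total pulse area, slow/tail component] -- neutrons carry more tail light
    gammas = rng.normal(loc=[1.0, 0.10], scale=[0.2, 0.02], size=(500, 2))
    neutrons = rng.normal(loc=[1.0, 0.25], scale=[0.2, 0.03], size=(500, 2))
    X = np.vstack([gammas, neutrons])
    y = np.array([0] * 500 + [1] * 500)  # 0 = gamma, 1 = neutron

    lda = LinearDiscriminantAnalysis().fit(X, y)
    w, b = lda.coef_[0], lda.intercept_[0]
    print("dividing line: %.2f*x1 + %.2f*x2 + %.2f = 0" % (w[0], w[1], b))
    print("training accuracy: %.3f" % lda.score(X, y))
    ```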

  9. Fast neutron-gamma discrimination on neutron emission profile measurement on JT-60U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishii, K.; Okamoto, A.; Kitajima, S.

    2010-10-15

    A digital signal processing (DSP) system is applied to stilbene scintillation detectors of the multichannel neutron emission profile monitor in JT-60U. Automatic analysis of the neutron-γ pulse shape discrimination is a key issue to diminish the processing time in the DSP system, and it has been applied using the two-dimensional (2D) map. Linear discriminant function is used to determine the dividing line between neutron events and γ-ray events on a 2D map. In order to verify the validity of the dividing line determination, the pulse shape discrimination quality is evaluated. As a result, the γ-ray contamination in most of the beam heating phase was negligible compared with the statistical error with 10 ms time resolution.

  10. Seasonal rationalization of river water quality sampling locations: a comparative study of the modified Sanders and multivariate statistical approaches.

    PubMed

    Varekar, Vikas; Karmakar, Subhankar; Jha, Ramakar

    2016-02-01

    The design of surface water quality sampling locations is a crucial decision-making process for the rationalization of a monitoring network. The quantity, quality, and types of available dataset (watershed characteristics and water quality data) may affect the selection of an appropriate design methodology. The modified Sanders approach and multivariate statistical techniques [particularly factor analysis (FA)/principal component analysis (PCA)] are well-accepted and widely used techniques for the design of sampling locations. However, their performance may vary significantly with the quantity, quality, and types of available dataset. In this paper, an attempt has been made to evaluate the performance of these techniques by accounting for the effect of seasonal variation, under a situation of limited water quality data but extensive watershed characteristics information, as continuous and consistent river water quality data are usually difficult to obtain, whereas watershed information may be made available through the application of geospatial techniques. A case study of the Kali River, Western Uttar Pradesh, India, is selected for the analysis. The monitoring was carried out at 16 sampling locations. The discrete and diffuse pollution loads at different sampling sites were estimated and accounted for using the modified Sanders approach, whereas the monitored physical and chemical water quality parameters were utilized as inputs for FA/PCA. The designed optimum numbers of sampling locations for the monsoon and non-monsoon seasons by the modified Sanders approach are eight and seven, while those for FA/PCA are eleven and nine, respectively. Little variation in the number and locations of designed sampling sites was obtained by the two techniques, which indicates the stability of the results. A geospatial analysis has also been carried out to check the significance of the designed sampling locations with respect to the river basin characteristics and land use of the study area. Both methods are equally efficient; however, the modified Sanders approach outperforms FA/PCA when limited water quality but extensive watershed information is available. The available water quality dataset is limited, and the FA/PCA-based approach fails to identify monitoring locations with higher variation, as these multivariate statistical approaches are data-driven. The priority/hierarchy and number of sampling sites designed by the modified Sanders approach are well justified by the land use practices and observed river basin characteristics of the study area.

  11. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    PubMed

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs), the cervical spine (C-spine) and the mandible, were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with the three-degree-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively. Patient-specific control charts using NCC evaluated daily variation and identified statistically significant deviations. This study also showed that subjective evaluations of the images were not always consistent. Population control charts identified a patient whose tracking metrics were significantly lower than those of other patients. The patient-specific action limits identified registrations that warranted immediate evaluation by an expert. When effective displacements in the anterior-posterior direction were compared to 3DoF couch displacements, the agreement was ±1 mm for seven of 10 patients for both the C-spine and mandible RTVs. Qualitative review alone of IGRT images can result in inconsistent feedback to the IGRT process. Registration tracking using NCC objectively identifies statistically significant deviations. When used in conjunction with the current image review process, this tool can assist in improving the safety and consistency of the IGRT process. © 2018 American Association of Physicists in Medicine.
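
    The following hedged sketch illustrates the registration-tracking idea: an NCC similarity metric computed inside a tracking volume and a patient-specific lower limit set from the first five fractions. The volumes are synthetic and the three-sigma-style limit is an assumption, not the action limit used in the study.

    ```python
    # Hedged sketch of the registration-tracking idea: compute normalized cross-correlation
    # (NCC) inside a tracking volume and set a patient-specific lower limit from the first
    # five fractions. Arrays, shapes, and the 3-sigma-style limit are illustrative assumptions.
    import numpy as np

    def ncc(a, b):
        """Normalized cross-correlation of two equally shaped image volumes."""
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float(np.mean(a * b))

    rng = np.random.default_rng(1)
    reference = rng.normal(size=(32, 32, 32))          # planning CT inside the RTV
    daily_cbcts = [reference + rng.normal(scale=0.3, size=reference.shape) for _ in range(10)]

    scores = np.array([ncc(reference, cbct) for cbct in daily_cbcts])
    baseline = scores[:5]
    lower_limit = baseline.mean() - 3 * baseline.std(ddof=1)  # patient-specific limit

    for fraction, s in enumerate(scores, start=1):
        flag = "REVIEW" if s < lower_limit else "ok"
        print(f"fraction {fraction:2d}: NCC = {s:.3f} ({flag})")
    ```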

  12. An evaluation of the effectiveness of a risk-based monitoring approach implemented with clinical trials involving implantable cardiac medical devices.

    PubMed

    Diani, Christopher A; Rock, Angie; Moll, Phil

    2017-12-01

    Background Risk-based monitoring is a concept endorsed by the Food and Drug Administration to improve clinical trial data quality by focusing monitoring efforts on critical data elements and higher risk investigator sites. BIOTRONIK approached this by implementing a comprehensive strategy that assesses risk and data quality through a combination of operational controls and data surveillance. This publication demonstrates the effectiveness of a data-driven risk assessment methodology when used in conjunction with a tailored monitoring plan. Methods We developed a data-driven risk assessment system to rank 133 investigator sites comprising 3442 subjects and to identify those sites that pose a potential risk to the integrity of data collected in implantable cardiac device clinical trials. This included the identification of specific risk factors and a weighted scoring mechanism. We conducted trend analyses for risk assessment data collected over 1 year to assess the overall impact of our data surveillance process combined with other operational monitoring efforts. Results Trending analyses of key risk factors revealed an improvement in the quality of data collected during the observation period. The three risk factors (follow-up compliance rate, unavailability of critical data, and noncompliance rate) correspond closely with the Food and Drug Administration's risk-based monitoring guidance document. Among these three risk factors, 100% (12/12) of quantiles analyzed showed an increase in data quality. Of these, 67% (8/12) of the improving trends in the worst performing quantiles had p-values less than 0.05, and 17% (2/12) had p-values between 0.05 and 0.06. Among the poorest performing site quantiles, there was a statistically significant decrease in subject follow-up noncompliance rates, protocol noncompliance rates, and the incidence of missing critical data. Conclusion One year after implementation of a comprehensive strategy for risk-based monitoring, including a data-driven risk assessment methodology to target on-site monitoring visits, statistically significant improvement was seen in a majority of measurable risk factors at the worst performing site quantiles. For the three risk factors most critical to the overall compliance of cardiac rhythm management medical device studies (follow-up compliance rate, unavailability of critical data, and noncompliance rate), we measured significant improvement in data quality. Although the worst performing site quantiles improved, though not significantly, in some risk factors such as subject attrition, the data-driven risk assessment highlighted key areas on which to continue focusing both on-site and centralized monitoring efforts. Data-driven surveillance of clinical trial performance provides actionable observations that can improve site performance. Clinical trials utilizing risk-based monitoring, by leveraging a data-driven quality assessment combined with specific operational procedures, may achieve improvements in data quality and resource efficiency.
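
    As an illustration of a weighted, data-driven site risk score of the kind described, the sketch below ranks hypothetical sites; the risk factors, weights, and scoring direction are assumptions, not BIOTRONIK's actual methodology.

    ```python
    # Illustrative sketch of a weighted site risk score. Risk factors, weights, and the
    # scoring direction are assumptions for the example.
    sites = {
        "site_001": {"followup_compliance": 0.97, "missing_critical_data": 0.01, "noncompliance_rate": 0.02},
        "site_002": {"followup_compliance": 0.82, "missing_critical_data": 0.09, "noncompliance_rate": 0.11},
        "site_003": {"followup_compliance": 0.91, "missing_critical_data": 0.04, "noncompliance_rate": 0.05},
    }
    weights = {"followup_compliance": 3.0, "missing_critical_data": 2.0, "noncompliance_rate": 2.0}

    def risk_score(metrics):
        # higher score = higher risk; compliance is inverted so that low compliance raises risk
        return (weights["followup_compliance"] * (1.0 - metrics["followup_compliance"])
                + weights["missing_critical_data"] * metrics["missing_critical_data"]
                + weights["noncompliance_rate"] * metrics["noncompliance_rate"])

    ranked = sorted(sites, key=lambda s: risk_score(sites[s]), reverse=True)
    for site in ranked:
        print(f"{site}: risk score = {risk_score(sites[site]):.3f}")
    ```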

  13. Spectral variations of canopy reflectance in support of precision agriculture

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi; Borisova, Denitsa; Nikolov, Hristo

    2014-05-01

    Agricultural monitoring is an important and continuously expanding activity in remote sensing and applied Earth observation. It supplies precise, reliable and valuable information on current crop condition and growth processes. In agriculture, the timing of seasonal cycles of crop activity is important for species classification and for the evaluation of crop development, growing conditions and potential yield. The correct interpretation of remotely sensed data, however, and the increasing demand for data reliability require ground-truth knowledge of the seasonal spectral behavior of different species and its relation to crop vigor. For this reason, we performed a ground-based study of the seasonal response of winter wheat reflectance patterns to crop growth patterns. The goal was to quantify crop seasonality by establishing empirical relationships between plant biophysical and spectral properties in the main ontogenetic periods. Phenology- and crop-specific relationships allow assessing crop condition during different portions of the growth cycle, thereby effectively tracking plant development and, finally, making yield predictions. The applicability of a number of vegetation indices (VIs) for monitoring crop seasonal dynamics, health condition, and yield potential was examined. Special emphasis was put on narrow-band indices, as the availability of data from hyperspectral imagers is an unavoidable part of the near future. The temporal behavior of the vegetation indices revealed increased sensitivity to crop growth. The derived spectral-biophysical relationships allowed extraction of quantitative information about crop variables and yield at different stages of phenological development. Relating plant spectral and biophysical variables in a phenology-based manner allows crop monitoring, that is, crop diagnosis and prediction, to be performed multiple times during plant ontogenesis. During active vegetative periods, spectral data were highly indicative of plant growth trends and yield potential. The VI values contributed to reliable yield prediction and showed very good correspondence with the estimates from biophysical models. For dates before full maturity, most of the examined VIs proved to be meaningful statistical predictors of crop state-indicative biophysical variables. High correlations were obtained for canopy cover fraction, LAI, and biomass. Red, near-infrared, and green reflectance were sensitive to both vigorous and stressed plants. As crops attained advanced growth stages, decreased sensitivity of the VIs and weaker, yet still statistically significant, correlations with bioparameters were observed. The results highlight the capability of the presented approach to track the dynamics of crop growth from multitemporal spectral data, and illustrate the prediction accuracy of the spectral models. The results are useful in assessing the efficiency of various spectral band ratios and other vegetation indices often used in remote sensing studies of natural and agricultural vegetation. They suggest that the data processing algorithm used is particularly suitable for airborne cropland monitoring and could be expanded to sites at farm or municipality scale. The results reported are from a pilot study carried out on a plot located in one of the polygons established for experimental crop monitoring. In this research, a GIS database was established to support the experiments and the modelling process. 
Recommendations on good farming practices for medium-sized farms, for monitoring stress conditions such as drought and overfertilization, are also developed.

  14. Evaluating model accuracy for model-based reasoning

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Roden, Joseph

    1992-01-01

    Described here is an approach to automatically assessing the accuracy of various components of a model. In this approach, actual data from the operation of a target system is used to drive statistical measures to evaluate the prediction accuracy of various portions of the model. We describe how these statistical measures of model accuracy can be used in model-based reasoning for monitoring and design. We then describe the application of these techniques to the monitoring and design of the water recovery system of the Environmental Control and Life Support System (ECLSS) of Space Station Freedom.
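
    A minimal sketch, under assumed data, of scoring a model component against actual operating data with simple statistical accuracy measures (bias, RMSE, worst-case error); the water-recovery flow values below are invented for illustration.

    ```python
    # Hedged sketch: evaluate a model component's prediction accuracy against operating data.
    import numpy as np

    def accuracy_measures(observed, predicted):
        observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
        residuals = observed - predicted
        return {
            "mean_error": float(residuals.mean()),                 # bias
            "rmse": float(np.sqrt(np.mean(residuals ** 2))),       # overall accuracy
            "max_abs_error": float(np.max(np.abs(residuals))),     # worst-case deviation
        }

    # e.g., measured vs. predicted flow rates for one water-recovery model component
    observed = [10.2, 9.8, 10.5, 11.0, 10.1]
    predicted = [10.0, 10.0, 10.3, 10.8, 10.4]
    print(accuracy_measures(observed, predicted))
    ```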

  15. Quantitative Laser Biospeckle Method for the Evaluation of the Activity of Trypanosoma cruzi Using VDRL Plates and Digital Analysis.

    PubMed

    Grassi, Hilda Cristina; García, Lisbette C; Lobo-Sulbarán, María Lorena; Velásquez, Ana; Andrades-Grassi, Francisco A; Cabrera, Humberto; Andrades-Grassi, Jesús E; Andrades, Efrén D J

    2016-12-01

    In this paper we report a quantitative laser biospeckle method using VDRL plates to monitor the activity of Trypanosoma cruzi, together with the calibration conditions, including three image processing algorithms and three programs (ImageJ and two programs designed in this work). Benznidazole was used as a test drug. Variable volume (constant density) and variable density (constant volume) were used for the quantitative evaluation of parasite activity in calibrated wells of the VDRL plate. The desiccation process within the well was monitored as a function of volume and of the activity of the biospeckle pattern of the parasites, as well as the quantitative effect of the surface parasite quantity (proportion of the object plane). A statistical analysis was performed with ANOVA, Tukey post hoc tests, and descriptive statistics using R and R Commander. Conditions of volume (100 μL), parasite density (2-4 × 10⁴ parasites/well, in exponential growth phase), assay time (up to 204 min), frame number (11 frames), and algorithm and program (R Commander/SAGA) for image processing were selected to test the effect of variable concentrations of benznidazole (0.0195 to 20 μg/mL; 0.075 to 76.8 μM) at various times (1, 61, 128 and 204 min) on the activity of the biospeckle pattern. The flat wells of the VDRL plate were found to be suitable for the quantitative calibration of the activity of Trypanosoma cruzi using the appropriate algorithm and program. Under these conditions, benznidazole produces at 1 min an instantaneous effect on the activity of the biospeckle pattern of T. cruzi, which remains with a similar profile for up to 1 hour. A second effect, which is dependent on concentrations above 1.25 μg/mL and is statistically different from the effect at lower concentrations, causes a decrease in the activity of the biospeckle pattern. This effect is better detected after 1 hour of drug action. This behavior may be explained by an instantaneous effect on a membrane protein of Trypanosoma cruzi that could mediate the translocation of benznidazole. At longer times the effect may possibly be explained by the required transformation of the pro-drug into the active drug.

  16. Ensemble Statistical Post-Processing of the National Air Quality Forecast Capability: Enhancing Ozone Forecasts in Baltimore, Maryland

    NASA Technical Reports Server (NTRS)

    Garner, Gregory G.; Thompson, Anne M.

    2013-01-01

    An ensemble statistical post-processor (ESP) is developed for the National Air Quality Forecast Capability (NAQFC) to address the unique challenges of forecasting surface ozone in Baltimore, MD. Air quality and meteorological data were collected from the eight monitors that constitute the Baltimore forecast region. These data were used to build the ESP using a moving-block bootstrap, regression tree models, and extreme-value theory. The ESP was evaluated using a 10-fold cross-validation to avoid evaluation with the same data used in the development process. Results indicate that the ESP is conditionally biased, likely due to slight overfitting while training the regression tree models. When viewed from the perspective of a decision-maker, the ESP provides a wealth of additional information previously not available through the NAQFC alone. The user is provided the freedom to tailor the forecast to the decision at hand by using decision-specific probability thresholds that define a forecast for an ozone exceedance. Taking advantage of the ESP, the user not only receives an increase in value over the NAQFC, but also receives value for
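
    The moving-block bootstrap is one ingredient of the ESP named above. The sketch below shows the resampling idea on a synthetic daily ozone series; the block length and the series itself are assumptions, not the Baltimore data or the ESP's actual configuration.

    ```python
    # Hedged sketch of a moving-block bootstrap on a synthetic ozone series.
    import numpy as np

    def moving_block_bootstrap(series, block_length, rng):
        """Resample a time series by concatenating randomly chosen contiguous blocks,
        preserving short-range autocorrelation within each block."""
        series = np.asarray(series, float)
        n = series.size
        n_blocks = int(np.ceil(n / block_length))
        starts = rng.integers(0, n - block_length + 1, size=n_blocks)
        resampled = np.concatenate([series[s:s + block_length] for s in starts])
        return resampled[:n]

    rng = np.random.default_rng(42)
    ozone = 45 + 10 * np.sin(np.linspace(0, 6 * np.pi, 120)) + rng.normal(0, 4, 120)  # ppb
    replicates = np.array([moving_block_bootstrap(ozone, block_length=7, rng=rng).mean()
                           for _ in range(1000)])
    print("bootstrap 95% CI for the seasonal mean: %.1f to %.1f ppb"
          % tuple(np.percentile(replicates, [2.5, 97.5])))
    ```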

  17. Electrical engineering research support for FDOT Traffic Statistics Office

    DOT National Transportation Integrated Search

    2010-03-01

    The aim of this project was to provide electrical engineering support for the telemetered traffic monitoring sites (TTMSs) operated by the Statistics Office of the Florida Department of Transportation. This project was a continuation of project BD-54...

  18. Analysis of ERT data of geoelectrical permafrost monitoring on Hoher Sonnblick (Austrian Central Alps)

    NASA Astrophysics Data System (ADS)

    Pfeiler, Stefan; Schöner, Wolfgang; Reisenhofer, Stefan; Ottowitz, David; Jochum, Birgit; Kim, Jung-Ho; Hoyer, Stefan; Supper, Robert; Heinrich, Georg

    2016-04-01

    In the Alps, infrastructure such as roads, routes, or buildings is affected by changes in permafrost, which often cause enormous repair costs. Investigation of the degradation of Alpine permafrost has increased in the last decade; however, the understanding of permafrost changes and of the atmospheric forcing processes that induce them is still insufficient. Within the project ATMOperm, the application of the geoelectrical method to estimate thawing-layer thickness for mountain permafrost is investigated near the highest meteorological observatory of Austria, on the Hoher Sonnblick. It is therefore necessary to further optimize the transformation of ERT data into thermal changes in the subsurface. Based on an innovative time-lapse inversion routine for ERT data (Kim J.-H. et al. 2013), a data analysis software tool newly developed by Kim Jung-Ho (KIGAM) in cooperation with the Geophysics group of the Geological Survey of Austria allows the statistical analysis of the entire sample set of each and every data point measured by the geoelectrical monitoring instrument. On the one hand, this provides an enhanced opportunity to distinguish between "good" and "bad" data points in order to assess the quality of the measurements. On the other hand, the results of the statistical analysis define the impact of every single data point on the inversion routine. The interpretation of the inversion results will be supplemented by temperature logs from selected boreholes along the ERT profile as well as by climatic parameters. KIM J.-H., SUPPER R., TSOURLOS P. and YI M.-J.: Four-dimensional inversion of resistivity monitoring data through Lp norm minimizations. Geophysical Journal International, 195(3), 1640-1656, 2013. doi: 10.1093/gji/ggt324. Acknowledgments: The geoelectrical monitoring on Hoher Sonnblick has been installed and is operated in the frame of the project ATMOperm (Atmosphere - permafrost relationship in the Austrian Alps - atmospheric extreme events and their relevance for the mean state of the active layer), funded by the Austrian Academy of Sciences (ÖAW).

  19. SimExTargId: A comprehensive package for real-time LC-MS data acquisition and analysis.

    PubMed

    Edmands, William M B; Hayes, Josie; Rappaport, Stephen M

    2018-05-22

    Liquid chromatography mass spectrometry (LC-MS) is the favored method for untargeted metabolomic analysis of small molecules in biofluids. Here we present SimExTargId, an open-source R package for autonomous analysis of metabolomic data and real-time observation of experimental runs. This simultaneous, fully automated and multi-threaded (optional) package is a wrapper for vendor-independent format conversion (ProteoWizard), xcms- and CAMERA-based peak-picking, MetMSLine-based pre-processing and covariate-based statistical analysis. Users are notified of detrimental instrument drift or errors by email. Also included are two shiny applications, targetId for real-time MS2 target identification, and peakMonitor to monitor targeted metabolites. SimExTargId is publicly available under the GNU LGPL v3.0 license at https://github.com/JosieLHayes/simExTargId, which includes a vignette with example data. SimExTargId should be installed on a dedicated data-processing workstation or server that is networked to the LC-MS platform to facilitate MS1 profiling of metabolomic data. josie.hayes@berkeley.edu. Supplementary data are available at Bioinformatics online.

  20. Dynamic Statistical Characterization of Variation in Source Processes of Microseismic Events

    NASA Astrophysics Data System (ADS)

    Smith-Boughner, L.; Viegas, G. F.; Urbancic, T.; Baig, A. M.

    2015-12-01

    During a hydraulic fracture, water is pumped at high pressure into a formation. A proppant, typically sand, is later injected in the hope that it will make its way into a fracture, keep it open, and provide a path for the hydrocarbon to enter the well. This injection can create micro-earthquakes, generated by deformation within the reservoir during treatment. When these injections are monitored, thousands of microseismic events are recorded within several hundred cubic meters. For each well-located event, many source parameters are estimated, e.g. stress drop, Savage-Wood efficiency, and apparent stress. However, because we are evaluating outputs from a power-law process, the extent to which the failure is impacted by fluid injection or stress triggering is not immediately clear. To better detect differences in source processes, we use a set of dynamic statistical parameters which characterize various force balance assumptions using the average distance to the nearest event, event rate, volume enclosed by the events, and cumulative moment and energy from a group of events. One parameter, the Fracability index, approximates the ratio of viscous to elastic forcing and highlights differences in the response time of a rock to changes in stress. These dynamic parameters are applied to a database of more than 90 000 events in a shale-gas play in the Horn River Basin to characterize spatial-temporal variations in the source processes. In order to resolve these differences, a moving window, nearest neighbour approach was used. First, the center of mass of the local distribution was estimated for several source parameters. Then, a set of dynamic parameters, which characterize the response of the rock, were estimated. These techniques reveal changes in seismic efficiency and apparent stress that often coincide with marked changes in the Fracability index and other dynamic statistical parameters. Utilizing these approaches allowed for the characterization of fluid injection related processes.

  1. The Cloud2SM Project

    NASA Astrophysics Data System (ADS)

    Crinière, Antoine; Dumoulin, Jean; Mevel, Laurent; Andrade-Barosso, Guillermo; Simonin, Matthieu

    2015-04-01

    Over the past decades, the monitoring of civil engineering structures has become a major field of research and development in the domains of modelling and integrated instrumentation. This increase in interest can be attributed in part to the need to control the aging of such structures and, on the other hand, to the need to optimize maintenance costs. From this standpoint, the project Cloud2SM (Cloud architecture design for Structural Monitoring with in-line Sensors and Models tasking) has been launched to develop a robust information system able to support the long-term monitoring of civil engineering structures as well as to interface various sensors and data. The specificity of such an architecture is that it is based on the notion of data processing through physical or statistical models. Thus the data processing, whether material or mathematical, can be seen here as a resource of the main architecture. The project can be divided into various items: -The sensors and their measurement process: these items provide data to the main architecture and can embed storage or computational resources. Depending on onboard capacity and the amount of data generated, heavy and light sensors can be distinguished. -The storage resources: based on the cloud concept, this resource can store at least two types of data, raw data and processed data. -The computational resources: this item includes embedded "pseudo real time" resources such as the dedicated computer cluster or computational resources. -The models: used for the conversion of raw data into meaningful data. These types of resources inform the system of their needs; they can be seen as independent blocks of the system. -The user interface: this item can be divided into various HMIs to support maintenance operations on the sensors or to pop up information to the user. -The demonstrators: the structures themselves. This project follows previous research work initiated in the European project ISTIMES [1]. It includes the infrared thermal monitoring of civil engineering structures [2-3] and/or the vibration monitoring of such structures [4-5]. The chosen architecture is based on the OGC standard in order to ensure interoperability between the various measurement systems. This concept is extended to the notion of physical models. The last, but not least, main objective of this project is to explore the feasibility and reliability of deploying mathematical models and processing a large amount of data using the GPGPU capacity of a dedicated computational cluster, while studying how OGC standardization applies to those technical concepts. References [1] M. Proto et al., « Transport Infrastructure surveillance and Monitoring by Electromagnetic Sensing: the ISTIMES project », Journal Sensors, Sensors 2010, 10(12), 10620-10639; doi:10.3390/s101210620, December 2010. [2] J. Dumoulin, A. Crinière, R. Averty, "Detection and thermal characterization of the inner structure of the "Musmeci" bridge deck by infrared thermography monitoring", Journal of Geophysics and Engineering, Volume 10, Number 2, 17 pages, November 2013, IOP Science, doi:10.1088/1742-2132/10/6/064003. [3] J. Dumoulin and V. Boucher, "Infrared thermography system for transport infrastructures survey with inline local atmospheric parameter measurements and offline model for radiation attenuation evaluations", J. Appl. Remote Sens., 8(1), 084978 (2014), doi:10.1117/1.JRS.8.084978. [4] V. Le Cam, M. Doehler, M. Le Pen, L. Mevel, "Embedded modal analysis algorithms on the smart wireless sensor platform PEGASE", in Proc. 9th International Workshop on Structural Health Monitoring, Stanford, CA, USA, 2013. [5] M. Zghal, L. Mevel, P. Del Moral, "Modal parameter estimation using interacting Kalman filter", Mechanical Systems and Signal Processing, 2014.

  2. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2003

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Moran, Seth C.; Sanchez, John J.; McNutt, Stephen R.; Estes, Steve; Paskievitch, John

    2004-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988. The primary objectives of this program are the near-real-time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the calculated earthquake hypocenter and phase arrival data, and changes in the seismic monitoring program, for the period January 1 through December 31, 2003. The AVO seismograph network was used to monitor the seismic activity at twenty-seven volcanoes within Alaska in 2003. These include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, the Katmai volcanic cluster (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Okmok Caldera, Great Sitkin Volcano, Kanaga Volcano, Tanaga Volcano, and Mount Gareloi. Monitoring highlights in 2003 include: continuing elevated seismicity at Mount Veniaminof in January-April (volcanic unrest began in August 2002), volcanogenic seismic swarms at Shishaldin Volcano throughout the year, and low-level tremor at Okmok Caldera throughout the year. Instrumentation and data acquisition highlights in 2003 were the installation of subnetworks on Tanaga and Gareloi Islands, the installation of broadband stations on Akutan Volcano and Okmok Caldera, and the establishment of telemetry for the Okmok Caldera subnetwork. AVO located 3911 earthquakes in 2003. This catalog includes: (1) a description of instruments deployed in the field and their locations; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of velocity models used for earthquake locations; (4) a summary of earthquakes located in 2003; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2003.

  3. Identification of characteristic frequencies of damaged railway tracks using field hammer test measurements

    NASA Astrophysics Data System (ADS)

    Oregui, M.; Li, Z.; Dollevoet, R.

    2015-03-01

    In this paper, the feasibility of the Frequency Response Function (FRF)-based statistical method to identify the characteristic frequencies of railway track defects is studied. The method compares a damaged track state to a healthy state based on non-destructive field hammer test measurements. First, a study is carried out to investigate the repeatability of hammer tests in railway tracks. By changing the excitation and measurement locations it is shown that the variability introduced by the test process is negligible. Second, following the concepts of control charts employed in process monitoring, a method to define an approximate healthy state is introduced by using hammer test measurements at locations without visual damage. Then, the feasibility study includes an investigation into squats (i.e. a major type of rail surface defect) of varying severity. The identified frequency ranges related to squats agree with those found in an extensively validated vehicle-borne detection system. Therefore, the FRF-based statistical method in combination with the non-destructive hammer test measurements has the potential to be employed to identify the characteristic frequencies of damaged conditions in railway tracks in the frequency range of 300-3000 Hz.

  4. Knowledge Reasoning with Semantic Data for Real-Time Data Processing in Smart Factory

    PubMed Central

    Wang, Shiyong; Li, Di; Liu, Chengliang

    2018-01-01

    The application of high-bandwidth networks and cloud computing in manufacturing systems will generate massive amounts of data. Industrial data analysis plays an important role in the condition monitoring, performance optimization, flexibility, and transparency of the manufacturing system. However, currently existing architectures are designed mainly for offline data analysis and are not suitable for real-time data processing. In this paper, we first define the smart factory as a cloud-assisted and self-organized manufacturing system in which physical entities such as machines, conveyors, and products organize production through intelligent negotiation, and the cloud supervises this self-organized process for fault detection and troubleshooting based on data analysis. Then, we propose a scheme to integrate knowledge reasoning and semantic data, where the reasoning engine processes the ontology model with real-time semantic data coming from the production process. Based on these ideas, we build a benchmarking system for a smart candy-packing application that supports direct consumer customization and flexible hybrid production, and the data are collected and processed in real time for fault diagnosis and statistical analysis. PMID:29415444

  5. Weekend Commercial Children's Television, 1975. A Study of Programming and Advertising to Children on Five Boston Stations.

    ERIC Educational Resources Information Center

    Barcus, F. Earle

    Some 25-1/2 hours of Boston commercial television for children were monitored on a Saturday and Sunday in April 1975. The monitoring covered three network affiliated stations and two independent UHF stations. Monitoring, coding, and editing provided much statistical data, which was analyzed to yield findings in the areas of distribution of…

  6. Towards good practice for health statistics: lessons from the Millennium Development Goal health indicators.

    PubMed

    Murray, Christopher J L

    2007-03-10

    Health statistics are at the centre of an increasing number of worldwide health controversies. Several factors are sharpening the tension between the supply and demand for high quality health information, and the health-related Millennium Development Goals (MDGs) provide a high-profile example. With thousands of indicators recommended but few measured well, the worldwide health community needs to focus its efforts on improving measurement of a small set of priority areas. Priority indicators should be selected on the basis of public-health significance and several dimensions of measurability. Health statistics can be divided into three types: crude, corrected, and predicted. Health statistics are necessary inputs to planning and strategic decision making, programme implementation, monitoring progress towards targets, and assessment of what works and what does not. Crude statistics that are biased have no role in any of these steps; corrected statistics are preferred. For strategic decision making, when corrected statistics are unavailable, predicted statistics can play an important part. For monitoring progress towards agreed targets and assessment of what works and what does not, however, predicted statistics should not be used. Perhaps the most effective method to decrease controversy over health statistics and to encourage better primary data collection and the development of better analytical methods is a strong commitment to provision of an explicit data audit trail. This initiative would make available the primary data, all post-data collection adjustments, models including covariates used for farcasting and forecasting, and necessary documentation to the public.

  7. Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory, and the recent volcanic eruption of El Hierro

    NASA Astrophysics Data System (ADS)

    Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.

    2012-04-01

    The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 years, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterise the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of few data points for large eruptions, since these data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. Shortly after the publication of this method, an eruption on the island of El Hierro took place for the first time in historical times, supporting our method and contributing towards the validation of our results.
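
    For a non-homogeneous Poisson process, the probability of at least one qualifying event in a horizon T is 1 - exp(-Λ(T)), where Λ is the integrated intensity. The sketch below illustrates that calculation with an assumed power-law intensity and an assumed thinning fraction for events above the magnitude threshold; the parameter values are not the fitted Canary Islands model.

    ```python
    # Minimal sketch of P(at least one event of magnitude > M within T years) for a
    # non-homogeneous Poisson process. The power-law intensity and all parameter values
    # are illustrative assumptions.
    import numpy as np

    def prob_at_least_one(T, theta=60.0, beta=0.8, p_exceed=0.3):
        """theta, beta: power-law (Weibull-like) intensity parameters for events of any size;
        p_exceed: assumed fraction of events exceeding the magnitude threshold (thinning)."""
        integrated_intensity = p_exceed * (T / theta) ** beta   # Lambda(T) for the thinned process
        return 1.0 - np.exp(-integrated_intensity)

    for horizon in (10, 20, 50):
        print(f"P(>=1 event of magnitude > 1 within {horizon} yr) = "
              f"{prob_at_least_one(horizon):.2f}")
    ```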

  8. Statistical monitoring of data quality and consistency in the Stomach Cancer Adjuvant Multi-institutional Trial Group Trial.

    PubMed

    Timmermans, Catherine; Doffagne, Erik; Venet, David; Desmet, Lieven; Legrand, Catherine; Burzykowski, Tomasz; Buyse, Marc

    2016-01-01

    Data quality may impact the outcome of clinical trials; hence, there is a need to implement quality control strategies for the data collected. Traditional approaches to quality control have primarily used source data verification during on-site monitoring visits, but these approaches are hugely expensive as well as ineffective. There is growing interest in central statistical monitoring (CSM) as an effective way to ensure data quality and consistency in multicenter clinical trials. CSM with SMART™ uses advanced statistical tools that help identify centers with atypical data patterns which might be the sign of an underlying quality issue. This approach was used to assess the quality and consistency of the data collected in the Stomach Cancer Adjuvant Multi-institutional Trial Group Trial, involving 1495 patients across 232 centers in Japan. In the Stomach Cancer Adjuvant Multi-institutional Trial Group Trial, very few atypical data patterns were found among the participating centers, and none of these patterns were deemed to be related to a quality issue that could significantly affect the outcome of the trial. CSM can be used to provide a check of the quality of the data from completed multicenter clinical trials before analysis, publication, and submission of the results to regulatory agencies. It can also form the basis of a risk-based monitoring strategy in ongoing multicenter trials. CSM aims at improving data quality in clinical trials while also reducing monitoring costs.

  9. Biomanufacturing process analytical technology (PAT) application for downstream processing: Using dissolved oxygen as an indicator of product quality for a protein refolding reaction.

    PubMed

    Pizarro, Shelly A; Dinges, Rachel; Adams, Rachel; Sanchez, Ailen; Winter, Charles

    2009-10-01

    Process analytical technology (PAT) is an initiative from the US FDA combining analytical and statistical tools to improve manufacturing operations and ensure regulatory compliance. This work describes the use of a continuous monitoring system for a protein refolding reaction to provide consistency in product quality and process performance across batches. A small-scale bioreactor (3 L) is used to understand the impact of aeration on refolding recombinant human vascular endothelial growth factor (rhVEGF) in a reducing environment. A reverse-phase HPLC assay is used to assess product quality. The goal in understanding the oxygen needs of the reaction and its impact on quality is to make a product that is efficiently refolded to its native and active form with minimum oxidative degradation from batch to batch. Because this refolding process is heavily dependent on oxygen, the % dissolved oxygen (DO) profile is explored as a PAT tool to regulate process performance at commercial manufacturing scale. A dynamic gassing-out approach using constant mass transfer (kLa) is used for scale-up of the aeration parameters to manufacturing-scale tanks (2,000 L, 15,000 L). The resulting DO profiles of the refolding reaction show similar trends across scales, and these are analyzed using reverse-phase HPLC. The desired product quality attributes are then achieved through alternating air and nitrogen sparging triggered by changes in the monitored DO profile. This approach mitigates the impact of differences in equipment or feedstock components between runs, and is directly in line with the key goal of PAT to "actively manage process variability using a knowledge-based approach." (c) 2009 Wiley Periodicals, Inc.
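
    A hedged sketch of the dynamic gassing-out estimate of kLa that underlies this kind of aeration scale-up: after stripping oxygen, the vessel is re-aerated and ln(C* - C) is fitted against time. The DO trace below is synthetic and the units are assumptions.

    ```python
    # Hedged sketch of the dynamic gassing-out kLa estimate on a synthetic DO trace.
    import numpy as np

    def estimate_kla(time_min, do_percent, do_saturation=100.0):
        """Fit C(t) = C* - (C* - C0) * exp(-kLa * t); returns kLa in 1/min."""
        t = np.asarray(time_min, float)
        deficit = do_saturation - np.asarray(do_percent, float)
        slope, _ = np.polyfit(t, np.log(deficit), 1)  # ln(deficit) decays linearly with slope -kLa
        return -slope

    # synthetic re-aeration curve with kLa = 0.15 min^-1 starting from 5% DO
    t = np.linspace(0, 20, 21)
    do = 100.0 - 95.0 * np.exp(-0.15 * t)
    print(f"estimated kLa = {estimate_kla(t, do):.3f} 1/min")  # ~0.150
    ```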

  10. Defense and Development in Sub-Saharan Africa: Codebook.

    DTIC Science & Technology

    1988-03-01

    countries by presenting the different data sources and explaining how they were compiled. The statistics in the database cover 41 African countries for... Finally, in addition to the economic and military data, some statistics have been compiled that monitor social and... (IX. SOCIAL/POLITICAL STATISTICS; SOURCES AND NOTES ON COLLECTION OF DATA)

  11. Research on the optimization of air quality monitoring station layout based on spatial grid statistical analysis method.

    PubMed

    Li, Tianxin; Zhou, Xing Chen; Ikhumhen, Harrison Odion; Difei, An

    2018-05-01

    In recent years, with the significant increase in urban development, it has become necessary to optimize the current air monitoring stations so that they reflect the quality of air in the environment. To highlight the spatial representativeness of the air monitoring stations, Beijing's regional air monitoring station data from 2012 to 2014 were used: the monthly mean particulate matter (PM10) concentration in the region was calculated, and the spatial distribution of PM10 concentration over the whole region was derived through the IDW interpolation method and a spatial grid statistical method in GIS. The spatial distribution and its variation across Beijing's districts were analyzed with the gridding model (1.5 km × 1.5 km cell resolution), and the 3-year spatial analysis of PM10 concentration data, including variation and spatial overlay, showed where the frequency of total PM10 concentrations exceeding the standard varied across the region. It is very important to optimize the layout of the existing air monitoring stations by combining the concentration distribution of air pollutants with the spatial region using GIS.
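
    A minimal sketch of the inverse distance weighting (IDW) step mentioned above, interpolating station PM10 values onto a regular grid; the station coordinates, concentrations, power parameter, and grid spacing are illustrative assumptions.

    ```python
    # Illustrative IDW interpolation of station PM10 values onto a regular grid.
    import numpy as np

    def idw_grid(station_xy, values, grid_x, grid_y, power=2.0):
        """Interpolate station values onto a grid with inverse-distance weights."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        grid = np.zeros_like(gx, dtype=float)
        for i in range(gx.shape[0]):
            for j in range(gx.shape[1]):
                d = np.hypot(station_xy[:, 0] - gx[i, j], station_xy[:, 1] - gy[i, j])
                if np.any(d < 1e-9):                      # grid cell coincides with a station
                    grid[i, j] = values[np.argmin(d)]
                    continue
                w = 1.0 / d ** power
                grid[i, j] = np.sum(w * values) / np.sum(w)
        return grid

    stations = np.array([[2.0, 3.0], [7.5, 8.0], [12.0, 2.5]])     # km
    pm10 = np.array([88.0, 134.0, 102.0])                          # monthly mean, ug/m3
    grid = idw_grid(stations, pm10, np.arange(0, 15, 1.5), np.arange(0, 10, 1.5))
    print(grid.round(1))
    ```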

  12. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang

    2017-10-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation running in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by primary-side fluctuation from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the changes in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line evaluation of the metering performance of an EVT without a standard voltage transformer.

  13. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor.

    PubMed

    Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio

    2010-01-01

    In step with the need to develop statistical procedures to manage small environmental samples, in this work we used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, to assess the representativeness of the collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical for correctly retrieving information from the analysis of pollutants of sanitary interest. Therefore, according to current criteria for network planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, b) node triplets, to statistically associate data from air monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics has been exploited to test variability of the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness in site selection, which has not allowed a reliable generalization of the monitoring data to the entire selected territory, except for a single "forced" case (70%); most important, they suggest a possible procedure to optimize network design.
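
    A hedged sketch of the Kruskal-Wallis comparison applied to benzene concentrations from a pair of neighbouring stations; the data below are synthetic, not the study's measurements.

    ```python
    # Illustrative Kruskal-Wallis test on synthetic benzene concentrations from a node pair.
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(7)
    station_a = rng.lognormal(mean=1.0, sigma=0.4, size=200)   # C6H6, ug/m3
    station_b = rng.lognormal(mean=1.2, sigma=0.4, size=200)

    h_stat, p_value = kruskal(station_a, station_b)
    print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("distributions differ: the pollutant field is not stationary between the two sites")
    else:
        print("no significant difference: data from the pair can be pooled for the triangle side")
    ```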

  14. Using a portable sulfide monitor as a motivational tool: a clinical study.

    PubMed

    Uppal, Ranjit Singh; Malhotra, Ranjan; Grover, Vishakha; Grover, Deepak

    2012-01-01

    Bad breath has a significant impact on the daily life of those who suffer from it. Oral malodor may rank only behind dental caries and periodontal disease as a cause of patients' visits to the dentist. The aim of this study was to use a portable sulfide monitor as a motivational tool for encouraging patients towards better oral hygiene, by correlating plaque scores with sulfide monitor scores and by comparing sulfide monitor scores before and after complete prophylaxis and 3 months after patient motivation. Thirty patients with chronic periodontitis, with a chief complaint of oral malodor, participated in this study. At the first visit, the plaque scores (P1) and sulfide monitor scores before (BCR1) and after complete oral prophylaxis (BCR2) were taken. The patients were then motivated towards better oral hygiene. After 3 months, plaque scores (P2) and sulfide monitor scores (BCR3) were recorded again. Statistical analysis was done using SPSS (Statistical Package for the Social Sciences), and a paired sample test was performed. A statistically significant reduction in sulfide monitor scores was reported after complete oral prophylaxis and 3 months after patient motivation. Plaque scores were significantly reduced after a period of 3 months. Plaque scores and breath-checker scores were positively correlated. The intensity of the oral malodor was positively correlated with the plaque scores. The portable sulfide monitor was efficacious in motivating patients towards better oral hygiene.

  15. Racial/Ethnic Disparity in NICU Quality of Care Delivery.

    PubMed

    Profit, Jochen; Gould, Jeffrey B; Bennett, Mihoko; Goldstein, Benjamin A; Draper, David; Phibbs, Ciaran S; Lee, Henry C

    2017-09-01

    Differences in NICU quality of care provided to very low birth weight (<1500 g) infants may contribute to the persistence of racial and/or ethnic disparity. An examination of such disparities in a population-based sample across multiple dimensions of care and outcomes is lacking. Prospective observational analysis of 18 616 very low birth weight infants in 134 California NICUs between January 1, 2010, and December 31, 2014. We assessed quality of care via the Baby-MONITOR, a composite indicator consisting of 9 process and outcome measures of quality. For each NICU, we calculated a risk-adjusted composite and individual component quality score for each race and/or ethnicity. We standardized each score to the overall population to compare quality of care between and within NICUs. We found clinically and statistically significant racial and/or ethnic variation in quality of care between NICUs as well as within NICUs. Composite quality scores ranged by 5.26 standard units (range: -2.30 to 2.96). Adjustment of Baby-MONITOR scores by race and/or ethnicity had only minimal effect on comparative assessments of NICU performance. Among subcomponents of the Baby-MONITOR, non-Hispanic white infants scored higher on measures of process compared with African Americans and Hispanics. Compared with whites, African Americans scored higher on measures of outcome; Hispanics scored lower on 7 of the 9 Baby-MONITOR subcomponents. Significant racial and/or ethnic variation in quality of care exists between and within NICUs. Providing feedback of disparity scores to NICUs could serve as an important starting point for promoting improvement and reducing disparities. Copyright © 2017 by the American Academy of Pediatrics.

  16. Digitise This! A Quick and Easy Remote Sensing Method to Monitor the Daily Extent of Dredge Plumes

    PubMed Central

    Evans, Richard D.; Murray, Kathy L.; Field, Stuart N.; Moore, James A. Y.; Shedrawi, George; Huntley, Barton G.; Fearns, Peter; Broomhall, Mark; McKinna, Lachlan I. W.; Marrable, Daniel

    2012-01-01

    Technological advancements in remote sensing and GIS have improved natural resource managers’ abilities to monitor large-scale disturbances. In a time where many processes are heading towards automation, this study has regressed to simple techniques to bridge a gap found in the advancement of technology. The near-daily monitoring of dredge plume extent is common practice using Moderate Resolution Imaging Spectroradiometer (MODIS) imagery and associated algorithms to predict the total suspended solids (TSS) concentration in the surface waters originating from floods and dredge plumes. Unfortunately, these methods cannot determine the difference between dredge plume and benthic features in shallow, clear water. This case study at Barrow Island, Western Australia, uses hand digitising to demonstrate the ability of human interpretation to determine this difference with a level of confidence and compares the method to contemporary TSS methods. Hand digitising was quick, cheap and required very little training of staff to complete. Results of ANOSIM R statistics show remote sensing derived TSS provided similar spatial results if they were thresholded to at least 3 mg L−1. However, remote sensing derived TSS consistently provided false-positive readings of shallow benthic features as Plume with a threshold up to TSS of 6 mg L−1, and began providing false-negatives (excluding actual plume) at a threshold as low as 4 mg L−1. Semi-automated processes that estimate plume concentration and distinguish between plumes and shallow benthic features without the arbitrary nature of human interpretation would be preferred as a plume monitoring method. However, at this stage, the hand digitising method is very useful and is more accurate at determining plume boundaries over shallow benthic features and is accessible to all levels of management with basic training. PMID:23240055

  17. Oxidation management of white wines using cyclic voltammetry and multivariate process monitoring.

    PubMed

    Martins, Rui C; Oliveira, Raquel; Bento, Fatima; Geraldo, Dulce; Lopes, Vitor V; Guedes de Pinho, Paula; Oliveira, Carla M; Silva Ferreira, Antonio C

    2008-12-24

    The development of a fingerprinting strategy capable of evaluating the "oxidation status" of white wines based on cyclic voltammetry is proposed here. It is known that the levels of specific antioxidants and redox mechanisms may be evaluated by cyclic voltammetry. This electrochemical technique was applied to two sets of samples. One group was composed of normally aged white wines, and a second group was obtained from a white wine forced-aging protocol with different oxygen, SO2, pH, and temperature regimens. A study of antioxidant additions, namely ascorbic acid, was also made in order to establish a statistical link between voltammogram fingerprints and chemical antioxidant substances. It was observed that the oxidation curve presented typical features, which enables sample discrimination according to age, oxygen consumption, and antioxidant additions. In fact, it was possible to place the results into four significant orthogonal directions, compressing 99.8% of the nonrandom features. Attempts were made to make voltammogram fingerprinting a tool for monitoring oxidation management. For this purpose, a supervised multivariate control chart was developed using a control sample as reference. When white wines are plotted onto the chart, it is possible to monitor the oxidation status and to diagnose the effects of oxygen regimes and antioxidant activity. Finally, quantification of substances implicated in the oxidation process as reagents (antioxidants) and products (off-flavors) was attempted using a supervised algorithm, partial least squares (PLS) regression analysis. Good correlations (r > 0.93) were observed for ascorbic acid, the Folin-Ciocalteu index, total SO2, methional, and phenylacetaldehyde. These results show that cyclic voltammetry fingerprinting can be used to monitor and diagnose the effects of wine oxidation.
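
    An illustrative sketch of the supervised calibration step: partial least squares regression from voltammogram currents to an antioxidant concentration. The synthetic voltammograms and the component count are assumptions, not the wine dataset.

    ```python
    # Illustrative PLS calibration from synthetic voltammograms to ascorbic acid content.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(3)
    n_samples, n_potentials = 40, 200
    ascorbic_acid = rng.uniform(0, 100, n_samples)                   # mg/L, reference values
    peak = np.exp(-0.5 * ((np.arange(n_potentials) - 80) / 15.0) ** 2)
    X = (ascorbic_acid[:, None] * peak[None, :]                      # oxidation peak grows with analyte
         + rng.normal(0, 2.0, (n_samples, n_potentials)))            # baseline noise

    pls = PLSRegression(n_components=3).fit(X, ascorbic_acid)
    predicted = pls.predict(X).ravel()
    r = np.corrcoef(ascorbic_acid, predicted)[0, 1]
    print(f"calibration correlation r = {r:.3f}")
    ```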

  18. Design and evaluation of a mobile application to assist the self-monitoring of the chronic kidney disease in developing countries.

    PubMed

    Sobrinho, Alvaro; da Silva, Leandro Dias; Perkusich, Angelo; Pinheiro, Maria Eliete; Cunha, Paulo

    2018-01-12

    The chronic kidney disease (CKD) is a worldwide critical problem, especially in developing countries. CKD patients usually begin their treatment in advanced stages, which requires dialysis and kidney transplantation and, consequently, affects mortality rates. This issue is addressed by a mobile health (mHealth) application (app) that aims to assist the early diagnosis and self-monitoring of the disease progression. A user-centered design (UCD) approach involving health professionals (a nurse and nephrologists) and target users guided the development process of the app between 2012 and 2016. In-depth interviews and prototyping were conducted with healthcare professionals throughout the requirements elicitation process. Elicited requirements were translated into a native mHealth app targeting the Android platform. Afterward, Cohen's kappa coefficient was applied to evaluate the agreement between the app and three nephrologists who analyzed test results collected from 60 medical records. Finally, eight users tested the app and were interviewed about usability and user perceptions. An mHealth app was designed to assist CKD early diagnosis and self-monitoring, considering quality attributes such as safety, effectiveness, and usability. A global kappa value of 0.7119 showed a substantial degree of agreement between the app and the three nephrologists. Results of face-to-face interviews with target users indicated good user satisfaction. However, the task of CKD self-monitoring proved difficult because most of the users did not fully understand the meaning of specific biomarkers (e.g., creatinine). The UCD approach provided mechanisms to develop the app based on the real needs of users. Although the kappa degree of agreement was not perfect, the results are satisfactory because the app aims to refer patients to nephrologists in early stages, where they may confirm the CKD diagnosis.
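
    The agreement check is the one step above that maps directly onto a few lines of code. A minimal sketch follows, using scikit-learn's implementation of Cohen's kappa; the stage labels are hypothetical values, not the study's 60 medical records.

      from sklearn.metrics import cohen_kappa_score

      app_output   = ["stage3", "stage3", "stage2", "stage4", "stage3", "stage2"]   # app classifications
      nephrologist = ["stage3", "stage2", "stage2", "stage4", "stage3", "stage3"]   # reference classifications

      kappa = cohen_kappa_score(app_output, nephrologist)
      print(f"Cohen's kappa = {kappa:.4f}")   # values above ~0.6 are usually read as substantial agreement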

  19. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.
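
    As a toy illustration of the static measurements listed above (source lines and executable lines), the sketch below counts them for a Python file by simple text inspection; a real static analyzer would parse the syntax tree, and the function name here is an invented example.

      def line_metrics(path: str) -> dict:
          """Count total source lines and non-blank, non-comment lines in a file."""
          source_lines = 0
          executable_lines = 0
          with open(path, encoding="utf-8") as handle:
              for line in handle:
                  source_lines += 1
                  stripped = line.strip()
                  if stripped and not stripped.startswith("#"):
                      executable_lines += 1
          return {"source_lines": source_lines, "executable_lines": executable_lines}

      print(line_metrics(__file__))   # run the metric against this script itself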

  20. Estimating Unbiased Land Cover Change Areas In The Colombian Amazon Using Landsat Time Series And Statistical Inference Methods

    NASA Astrophysics Data System (ADS)

    Arevalo, P. A.; Olofsson, P.; Woodcock, C. E.

    2017-12-01

    Unbiased estimation of the areas of conversion between land categories ("activity data") and their uncertainty is crucial for providing more robust calculations of carbon emissions to the atmosphere, as well as their removals. This is particularly important for the REDD+ mechanism of the UNFCCC, where economic compensation is tied to the magnitude and direction of such fluxes. Dense time series of Landsat data and statistical protocols are becoming an integral part of forest monitoring efforts, but there are relatively few studies in the tropics focused on using these methods to advance operational MRV systems (Monitoring, Reporting and Verification). We present the results of a prototype methodology for continuous monitoring and unbiased estimation of activity data that is compliant with the IPCC Approach 3 for representation of land. We used a break detection algorithm (Continuous Change Detection and Classification, CCDC) to fit pixel-level temporal segments to time series of Landsat data in the Colombian Amazon. The segments were classified using a Random Forest classifier to obtain annual maps of land categories between 2001 and 2016. Using these maps, a biannual stratified sampling approach was implemented and unbiased stratified estimators constructed to calculate area estimates with confidence intervals for each of the stable and change classes. Our results provide evidence of a decrease in primary forest as a result of conversion to pastures, as well as an increase in secondary forest as pastures are abandoned and the forest is allowed to regenerate. Estimating areas of other land transitions proved challenging because of their very small mapped areas compared to stable classes like forest, which corresponds to almost 90% of the study area. Implications for remote sensing data processing, sample allocation, and uncertainty reduction are also discussed.
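
    A hedged sketch of a stratified (design-based) area estimator of the kind referred to above follows. The two-stratum confusion matrix and mapped areas are invented numbers for illustration; they are not the Colombian Amazon results.

      import numpy as np

      # rows = map strata, columns = reference classes: [stable forest, deforestation]
      counts = np.array([[97.0, 3.0],
                         [20.0, 80.0]])
      mapped_area_ha = np.array([900_000.0, 100_000.0])        # mapped area of each stratum

      weights = mapped_area_ha / mapped_area_ha.sum()          # stratum weights W_i
      n_i = counts.sum(axis=1)                                 # reference samples per stratum
      p_ik = counts / n_i[:, None]                             # sample proportions within strata
      proportions = (weights[:, None] * p_ik).sum(axis=0)      # estimated area proportions
      areas = proportions * mapped_area_ha.sum()

      variance = ((weights[:, None] ** 2) * p_ik * (1 - p_ik) / (n_i[:, None] - 1)).sum(axis=0)
      ci95 = 1.96 * np.sqrt(variance) * mapped_area_ha.sum()   # 95% confidence interval half-width

      for name, a, ci in zip(["stable forest", "deforestation"], areas, ci95):
          print(f"{name}: {a:,.0f} ha +/- {ci:,.0f} ha")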

  1. Visual Perception-Based Statistical Modeling of Complex Grain Image for Product Quality Monitoring and Supervision on Assembly Production Line

    PubMed Central

    Chen, Qing; Xu, Pengfei; Liu, Wenzhong

    2016-01-01

    Computer vision, as a fast, low-cost, noncontact, and online monitoring technology, has been an important tool for inspecting product quality, particularly on a large-scale assembly production line. However, the current industrial vision system is far from satisfactory in the intelligent perception of complex grain images, comprising a large number of local homogeneous fragmentations or patches without distinct foreground and background. We attempt to solve this problem based on the statistical modeling of spatial structures of grain images. We first present a physical explanation indicating that the spatial structures of complex grain images follow a representative Weibull distribution according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on a sparse multikernel least-squares support vector machine is proposed to solve the low-confidence classification problem of imbalanced data distribution. The proposed method is applied on the assembly line of a food-processing enterprise to automatically classify (or identify) the production quality of rice. Experiments on this real application case, compared with commonly used methods, illustrate the validity of our method. PMID:26986726
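
    The filtering and distribution-fitting steps described above can be sketched roughly as follows; the image is random noise standing in for a grain image, and the scales and the zero-location Weibull fit are assumptions for illustration.

      import numpy as np
      from scipy import ndimage, stats

      rng = np.random.default_rng(2)
      image = rng.normal(size=(256, 256))                     # stand-in for a grain image

      responses = []
      for sigma in (1, 2, 4):                                 # multiscale Gaussian derivative filtering
          gx = ndimage.gaussian_filter(image, sigma=sigma, order=(0, 1))   # derivative along x
          gy = ndimage.gaussian_filter(image, sigma=sigma, order=(1, 0))   # derivative along y
          responses.append(np.hypot(gx, gy).ravel())

      magnitudes = np.concatenate(responses)[::50]            # subsample to keep the fit quick
      shape, loc, scale = stats.weibull_min.fit(magnitudes, floc=0)        # location fixed at zero
      print(f"Weibull shape = {shape:.2f}, scale = {scale:.3f}")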

  2. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improves Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
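
    The external calibration curve method mentioned above amounts to a linear fit on a dilution series followed by inversion. A minimal sketch with made-up dilution points and peak areas (not Ariadne's code or data) follows.

      import numpy as np

      amount_fmol = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])        # spiked standard amounts
      peak_area   = np.array([1.1, 2.3, 5.2, 10.4, 20.9, 52.0, 103.8])    # integrated SRM responses

      slope, intercept = np.polyfit(amount_fmol, peak_area, deg=1)        # linear response model
      r2 = np.corrcoef(amount_fmol, peak_area)[0, 1] ** 2
      print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, R^2 = {r2:.4f}")

      unknown_area = 7.8                                                  # response of an unknown sample
      print(f"estimated abundance: {(unknown_area - intercept) / slope:.2f} fmol")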

  3. Waveform classification and statistical analysis of seismic precursors to the July 2008 Vulcanian Eruption of Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Rodgers, Mel; Smith, Patrick; Pyle, David; Mather, Tamsin

    2016-04-01

    Understanding the transition between quiescence and eruption at dome-forming volcanoes, such as Soufrière Hills Volcano (SHV), Montserrat, is important for monitoring volcanic activity during long-lived eruptions. Statistical analysis of seismic events (e.g. spectral analysis and identification of multiplets via cross-correlation) can be useful for characterising seismicity patterns and can be a powerful tool for analysing temporal changes in behaviour. Waveform classification is crucial for volcano monitoring, but consistent classification, both during real-time analysis and for retrospective analysis of previous volcanic activity, remains a challenge. Automated classification allows consistent re-classification of events. We present a machine learning (random forest) approach to rapidly classify waveforms that requires minimal training data. We analyse the seismic precursors to the July 2008 Vulcanian explosion at SHV and show systematic changes in frequency content and multiplet behaviour that had not previously been recognised. These precursory patterns of seismicity may be interpreted as changes in pressure conditions within the conduit during magma ascent and could be linked to magma flow rates. Frequency analysis of the different waveform classes supports the growing consensus that LP and Hybrid events should be considered end members of a continuum of low-frequency source processes. By using both supervised and unsupervised machine-learning methods we investigate the nature of waveform classification and assess current classification schemes.
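
    In the same spirit as the approach above, the sketch below trains a random forest on simple spectral features of synthetic waveforms. The feature set, class names, and simulated events are illustrative assumptions, not the SHV catalogue or the authors' feature engineering.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      FS = 100.0                                                   # sampling rate, Hz

      def synth_event(peak_hz, n=512):
          """Decaying sinusoid plus noise as a stand-in seismic waveform."""
          t = np.arange(n) / FS
          return np.sin(2 * np.pi * peak_hz * t) * np.exp(-3 * t) + 0.1 * rng.normal(size=n)

      def spectral_features(w):
          """Spectral centroid, peak spectral amplitude, and time-domain spread."""
          spectrum = np.abs(np.fft.rfft(w))
          freqs = np.fft.rfftfreq(w.size, d=1 / FS)
          centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
          return [centroid, spectrum.max(), w.std()]

      waveforms = [synth_event(2.0) for _ in range(50)] + [synth_event(8.0) for _ in range(50)]
      labels = ["low_frequency"] * 50 + ["hybrid"] * 50
      X = np.array([spectral_features(w) for w in waveforms])

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())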

  4. Electrical engineering support of telemetered traffic monitoring sites : final report, March 2009.

    DOT National Transportation Integrated Search

    2009-03-01

    "The aim of this project was to provide electrical engineering support for the telemetered traffic monitoring sites (TTMS) operated by the Statistics Office of the Florida Department of Transportation. This project was a companion to project BD-543-1...

  5. Civil engineering support for the traffic monitoring program : final report, January 2010.

    DOT National Transportation Integrated Search

    2010-01-01

    This project was aimed at providing various civil engineering support services for the telemetered traffic monitoring sites operated by the Statistics Office of the Florida Department of Transportation. This was a companion project to the one that pr...

  6. PROTOTYPING A VISION FOR INTER-AGENCY TERRESTRIAL INVENTORY AND MONITORING: A STATISTICAL PERSPECTIVE

    EPA Science Inventory

    A demonstration project in Oregon examined the feasibility of combining Federal environmental monitoring surveys. An integrated approach should remove duplication of effort and reduce the possibility of providing apparently conflicting information to policy makers and the public. ...

  7. 40 CFR 257.25 - Assessment monitoring program.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Title 40 (Protection of Environment), 2014-07-01: Assessment monitoring program. Section 257.25, ENVIRONMENTAL PROTECTION AGENCY (CONTINUED), SOLID WASTES CRITERIA... unit caused the contamination, or that the statistically significant increase resulted from error in...

  8. Quantifying biodiversity using digital cameras and automated image analysis.

    NASA Astrophysics Data System (ADS)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects of extensive grazing on biodiversity in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect: the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and the amount of information being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600 m. Rainfall is high, and in most areas the soil consists of deep peat (1 m to 3 m), populated by a mix of heather, mosses and sedges. The cameras were in continuous operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial-intelligence-based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected, and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets of the collected data. By converting digital image data into statistical composite data, it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered ways to simplify the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention only for those images containing rare animals or unusual (undecidable) conditions, and enabling automatic deletion of images generated by erroneous triggering (e.g. cloud movements). This is the first step towards a hierarchical image processing framework, where situation subclasses such as birds or climatic conditions can be fed into more appropriate automated or semi-automated data mining software.
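
    One way to sketch the unsupervised side of such a pipeline is to cluster per-frame summary statistics and flag the rare cluster for human review. The features and the two synthetic groups of frames below are placeholders, not the reserve's imagery.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(4)
      # each row summarises one frame: mean brightness, brightness std, edge density
      frames = np.vstack([
          rng.normal([0.5, 0.10, 0.02], 0.02, size=(200, 3)),   # "empty scene" frames
          rng.normal([0.5, 0.25, 0.08], 0.03, size=(20, 3)),    # frames with animal activity
      ])

      model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(frames)
      counts = np.bincount(model.labels_)
      rare = counts.argmin()                                    # smaller cluster = candidate frames of interest
      print(f"{counts[rare]} frames flagged for manual review (cluster {rare})")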

  9. Vegetation Monitoring with Gaussian Processes and Latent Force Models

    NASA Astrophysics Data System (ADS)

    Camps-Valls, Gustau; Svendsen, Daniel; Martino, Luca; Campos, Manuel; Luengo, David

    2017-04-01

    Monitoring vegetation by biophysical parameter retrieval from Earth observation data is a challenging problem, where machine learning is currently a key player. Neural networks, kernel methods, and Gaussian Process (GP) regression have excelled in parameter retrieval tasks at both local and global scales. GP regression is based on solid Bayesian statistics, yields efficient and accurate parameter estimates, and provides advantages over competing machine learning approaches, such as confidence intervals on its predictions. However, GP models are hampered by a lack of interpretability, which has prevented their widespread adoption by a larger community. In this presentation we will summarize some of our latest developments to address this issue. We will review the main characteristics of GPs and their advantages in standard vegetation monitoring applications. Then, three advanced GP models will be introduced. First, we will derive sensitivity maps for the GP predictive function, which allow us to obtain a feature ranking from the model and to assess the influence of individual examples on the solution. Second, we will introduce a Joint GP (JGP) model that combines in situ measurements and simulated radiative transfer data in a single GP model. The JGP regression provides more sensible confidence intervals for the predictions, respects the physics of the underlying processes, and allows for transferability across time and space. Finally, a latent force model (LFM) for GP modeling that encodes ordinary differential equations to blend data-driven modeling and physical models of the system is presented. The LFM performs multi-output regression, adapts to the signal characteristics, is able to cope with missing data in the time series, and provides explicit latent functions that allow system analysis and evaluation. Empirical evidence of the performance of these models will be presented through illustrative examples.
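
    A minimal GP regression sketch showing the kind of per-prediction confidence interval mentioned above; scikit-learn is used for convenience, and the inputs and sinusoidal target are synthetic, not an Earth-observation retrieval.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(5)
      X = np.sort(rng.uniform(0, 10, 30))[:, None]              # e.g. a scaled time axis
      y = np.sin(X).ravel() + 0.1 * rng.normal(size=30)         # e.g. a vegetation parameter

      kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

      X_new = np.linspace(0, 10, 5)[:, None]
      mean, std = gp.predict(X_new, return_std=True)
      for x, m, s in zip(X_new.ravel(), mean, std):
          print(f"x = {x:4.1f}: prediction {m:+.2f} +/- {1.96 * s:.2f} (95% interval)")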

  10. An introduction to structural health monitoring.

    PubMed

    Farrar, Charles R; Worden, Keith

    2007-02-15

    The process of implementing a damage identification strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). Here, damage is defined as changes to the material and/or geometric properties of these systems, including changes to the boundary conditions and system connectivity, which adversely affect the system's performance. A wide variety of highly effective local non-destructive evaluation tools are available for such monitoring. However, the majority of SHM research conducted over the last 30 years has attempted to identify damage in structures on a more global basis. The past 10 years have seen a rapid increase in the amount of research related to SHM, as quantified by the significant escalation in papers published on this subject. The increased interest in SHM and its associated potential for significant life-safety and economic benefits has motivated the need for this theme issue. This introduction begins with a brief history of SHM technology development. Recent research has begun to recognize that the SHM problem is fundamentally one of statistical pattern recognition (SPR), and a paradigm to address such a problem is described in detail herein, as it forms the basis for the organization of this theme issue. In the process of providing the historical overview and summarizing the SPR paradigm, the subsequent articles in this theme issue are cited in an effort to show how they fit into this overview of SHM. In conclusion, technical challenges that must be addressed if SHM is to gain wider application are discussed in a general manner.

  11. Mapping tobacco industry strategies in South East Asia for action planning and surveillance

    PubMed Central

    Stillman, F; Hoang, M; Linton, R; Ritthiphakdee, B; Trochim, W

    2008-01-01

    Objective: To develop a comprehensive conceptual framework of tobacco industry tactics in four countries in South East Asia for the purpose of: (1) generating consensus on key areas of importance and feasibility for regional and cross-country tobacco industry monitoring and surveillance; (2) developing measures to track and monitor the effects of the tobacco industry and to design counterstrategies; and (3) building capacity to improve tobacco control planning in the participating countries. Design: A structured conceptualisation methodology known as concept mapping was used. The process included brainstorming, sorting and rating of statements describing industry activities. Statistical analyses used multidimensional scaling and cluster analysis. Interpretation of the maps was participatory, involving regional tobacco control researchers, practitioners, and policy makers during a face-to-face meeting. Participants: The 31 participants in this study came from the four countries represented in the project, along with six people from the Johns Hopkins Bloomberg School of Public Health. Conclusions: The map shows eight clusters of industry activities within the four countries. These were arranged into four general sectors: economics, politics, public relations and deception. For project design purposes, the map indicates areas of importance and feasibility for monitoring tobacco industry activities and serves as a basis for an initial discussion about action planning. Furthermore, the development of the map used a consensus-building process across different stakeholders and stakeholder agencies, which is critical when developing regional, cross-border strategies for tracking and surveillance. PMID:18218787

  12. Environmental monitoring of the area surrounding oil wells in Val d'Agri (Italy): element accumulation in bovine and ovine organs.

    PubMed

    Miedico, Oto; Iammarino, Marco; Paglia, Giuseppe; Tarallo, Marina; Mangiacotti, Michele; Chiaravalle, A Eugenio

    2016-06-01

    In this work, environmental heavy metal contamination in the Val d'Agri area of Southern Italy was monitored by measuring the accumulation of 18 heavy metals (U, Hg, Pb, Cd, As, Sr, Sn, V, Ni, Cr, Mo, Co, Cu, Zn, Ca, Mn, Fe, and Al) in the organs of animals raised in the surrounding area (kidney, lung, and liver of bovine and ovine species). Val d'Agri hosts various oil processing centers which are potentially a significant source of environmental pollution, making it essential to perform studies that outline the state of the art on which any recovery plans and interventions may be developed. The analysis was carried out using official and accredited analytical methods based on inductively coupled plasma mass spectrometry, and the measurements were statistically processed in order to contribute to risk assessment. Even though five samples showed Pb and Cd concentrations above the limits defined in European Commission Regulation (EC) No 1881/2006, the mean concentrations of most elements suggest that contamination in this area is low. Consequently, these results also suggest that there is no particular risk for human exposure to toxic trace elements. Nevertheless, the findings of this work confirm that element accumulation in ovine species is correlated with the geographical livestock area. Therefore, ovine-specific organs might be used as bioindicators for monitoring contamination by specific toxic elements in exposed areas.

  13. Equivalent circuit models for interpreting impedance perturbation spectroscopy data

    NASA Astrophysics Data System (ADS)

    Smith, R. Lowell

    2004-07-01

    As in-situ structural integrity monitoring disciplines mature, there is a growing need to process sensor/actuator data efficiently in real time. Although smaller, faster embedded processors will contribute to this, it is also important to develop straightforward, robust methods to reduce the overall computational burden for practical applications of interest. This paper addresses the use of equivalent circuit modeling techniques for inferring structure attributes monitored using impedance perturbation spectroscopy. In pioneering work about ten years ago, significant progress was associated with the development of simple impedance models derived from the piezoelectric equations. Using mathematical modeling tools currently available from research in ultrasonics and impedance spectroscopy is expected to provide additional synergistic benefits. For purposes of structural health monitoring, the objective is to use impedance spectroscopy data to infer the physical condition of structures to which small piezoelectric actuators are bonded. Features of interest include stiffness changes, mass loading, and damping or mechanical losses. Equivalent circuit models are typically simple enough to facilitate the development of practical analytical models of the actuator-structure interaction. This type of parametric structure model allows raw impedance/admittance data to be interpreted optimally using standard multiple nonlinear regression analysis. One potential long-term outcome is the possibility of cataloging measured viscoelastic properties of the mechanical subsystems of interest as simple lists of attributes and their statistical uncertainties, whose evolution can be followed over time.
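
    A hedged example of the regression step described above: fitting a simple equivalent circuit (a series resistance plus one parallel RC branch) to impedance-magnitude data with nonlinear least squares. The circuit topology, values and noise level are illustrative assumptions, not the paper's model.

      import numpy as np
      from scipy.optimize import curve_fit

      def impedance_magnitude(freq_hz, r_series, r_parallel, capacitance):
          """Magnitude of Z = R_s + R_p / (1 + j*w*R_p*C)."""
          omega = 2 * np.pi * freq_hz
          z = r_series + r_parallel / (1 + 1j * omega * r_parallel * capacitance)
          return np.abs(z)

      freq = np.logspace(1, 5, 40)                                    # 10 Hz to 100 kHz
      true = impedance_magnitude(freq, 50.0, 500.0, 1e-6)
      measured = true * (1 + 0.02 * np.random.default_rng(6).normal(size=freq.size))

      params, cov = curve_fit(impedance_magnitude, freq, measured, p0=[10.0, 100.0, 1e-7])
      errors = np.sqrt(np.diag(cov))                                  # statistical uncertainties
      for name, value, err in zip(["R_series", "R_parallel", "C"], params, errors):
          print(f"{name}: {value:.3g} +/- {err:.2g}")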

  14. Management system of occupational diseases in Korea: statistics, report and monitoring system.

    PubMed

    Rhee, Kyung Yong; Choe, Seong Weon

    2010-12-01

    The management system of occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. Government regulations require employers to conduct health examinations and working-conditions measurements through contracted private agencies, in accordance with the Occupational Safety and Health Act. It is hoped that these institutions may be able to effectively detect and monitor occupational diseases and hazards in the workplace. In this respect, the occupational disease management system in Korea is well designed, except for the national survey system. In the future, national surveys for the detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational disease can be improved by providing more refined information through statistical analysis of surveillance data.

  15. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    NASA Astrophysics Data System (ADS)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions, et al.) is proposed for injection control. The primary task for ATLS is the detection of seismic events in a long-term, sustained time series record. Considering that the time-varying Signal to Noise Ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through Change Point detection (CPD) on the seismic record. In this method, an anomaly signal (seismic event) is treated as a change point in the time series (seismic record): because of the seismic event occurrence, the statistical model of the signal in the neighborhood of the event point changes. This means the SDAR aims to find the statistical irregularities of the record through CPD. There are three advantages of SDAR. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, polarization) for signal detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of present statistics on future data, which makes SDAR a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
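
    A greatly simplified sketch of the sequentially discounting idea follows: an online AR(1) model whose statistics are updated with a discounting factor, scoring each new sample by its negative log-likelihood so that change points stand out. This illustrates the concept only and is not the authors' implementation or parameterisation.

      import numpy as np

      def sdar_scores(x, discount=0.02):
          """Online AR(1) with discounted updates; returns a per-sample anomaly score."""
          mu, c0, c1, var, prev = 0.0, 1.0, 0.0, 1.0, 0.0
          scores = []
          for value in x:
              mu = (1 - discount) * mu + discount * value                 # discounted mean
              c0 = (1 - discount) * c0 + discount * (value - mu) ** 2     # lag-0 covariance
              c1 = (1 - discount) * c1 + discount * (value - mu) * (prev - mu)
              a = c1 / c0 if c0 > 0 else 0.0                              # AR(1) coefficient
              resid = value - (mu + a * (prev - mu))                      # one-step prediction error
              var = (1 - discount) * var + discount * resid ** 2
              scores.append(0.5 * (np.log(2 * np.pi * var) + resid ** 2 / var))
              prev = value
          return np.array(scores)

      rng = np.random.default_rng(7)
      record = np.concatenate([rng.normal(0, 1, 500),                     # background noise
                               rng.normal(0, 1, 50) + 5 * np.exp(-np.arange(50) / 10)])  # synthetic "event"
      scores = sdar_scores(record)
      print("highest anomaly score near sample", int(scores.argmax()))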

  16. Linked Micromaps: Statistical Summaries in a Spatial Context

    EPA Science Inventory

    Communicating summaries of spatial data to decision makers and the public is challenging. We present a graphical method that provides both a geographic context and a statistical summary for such spatial data. Monitoring programs have a need for such geographical summaries. For ...

  17. Sensor-based monitoring and inspection of surface morphology in ultraprecision manufacturing processes

    NASA Astrophysics Data System (ADS)

    Rao, Prahalad Krishna

    This research proposes approaches for monitoring and inspection of surface morphology with respect to two ultraprecision/nanomanufacturing processes, namely, ultraprecision machining (UPM) and chemical mechanical planarization (CMP). The methods illustrated in this dissertation are motivated from the compelling need for in situ process monitoring in nanomanufacturing and invoke concepts from diverse scientific backgrounds, such as artificial neural networks, Bayesian learning, and algebraic graph theory. From an engineering perspective, this work has the following contributions: 1. A combined neural network and Bayesian learning approach for early detection of UPM process anomalies by integrating data from multiple heterogeneous in situ sensors (force, vibration, and acoustic emission) is developed. The approach captures process drifts in UPM of aluminum 6061 discs within 15 milliseconds of their inception and is therefore valuable for minimizing yield losses. 2. CMP process dynamics are mathematically represented using a deterministic multi-scale hierarchical nonlinear differential equation model. This process-machine interaction (PMI) model is evocative of the various physio-mechanical aspects in CMP and closely emulates experimentally acquired vibration signal patterns, including complex nonlinear dynamics manifest in the process. By combining the PMI model predictions with features gathered from wirelessly acquired CMP vibration signal patterns, CMP process anomalies, such as pad wear, and drifts in polishing were identified in their nascent stage with high fidelity (R2 ~ 75%). 3. An algebraic graph theoretic approach for quantifying nano-surface morphology from optical micrograph images is developed. The approach enables a parsimonious representation of the topological relationships between heterogeneous nano-surface features, which are enshrined in graph theoretic entities, namely, the similarity, degree, and Laplacian matrices. Topological invariant measures (e.g., Fiedler number, Kirchoff index) extracted from these matrices are shown to be sensitive to evolving nano-surface morphology. For instance, we observed that prominent nanoscale morphological changes on CMP processed Cu wafers, although discernible visually, could not be tractably quantified using statistical metrology parameters, such as arithmetic average roughness (Sa), root mean square roughness (Sq), etc. In contrast, CMP induced nanoscale surface variations were captured on invoking graph theoretic topological invariants. Consequently, the graph theoretic approach can enable timely, non-contact, and in situ metrology of semiconductor wafers by obviating the need for reticent profile mapping techniques (e.g., AFM, SEM, etc.), and thereby prevent the propagation of yield losses over long production runs.
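
    The graph-theoretic surface descriptor summarised above can be sketched roughly as follows: build a Gaussian similarity graph over surface-patch feature vectors, form the degree matrix and graph Laplacian, and read off the second-smallest eigenvalue (the Fiedler number). The patch features and bandwidth choice are synthetic assumptions.

      import numpy as np

      rng = np.random.default_rng(8)
      patches = rng.normal(size=(64, 5))                        # feature vectors of surface patches

      dists = np.linalg.norm(patches[:, None, :] - patches[None, :, :], axis=-1)
      similarity = np.exp(-dists ** 2 / (2 * np.median(dists) ** 2))   # Gaussian similarity matrix
      np.fill_diagonal(similarity, 0.0)
      degree = np.diag(similarity.sum(axis=1))                  # degree matrix
      laplacian = degree - similarity                           # graph Laplacian

      eigenvalues = np.sort(np.linalg.eigvalsh(laplacian))
      print(f"Fiedler number (algebraic connectivity): {eigenvalues[1]:.4f}")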

  18. Statistical and engineering methods for model enhancement

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Jung

    Models which describe the performance of a physical process are essential for quality prediction, experimental planning, process control and optimization. Engineering models developed based on the underlying physics/mechanics of the process, such as analytic models or finite element models, are widely used to capture the deterministic trend of the process. However, there usually exists stochastic randomness in the system, which may introduce discrepancies between physics-based model predictions and observations in reality. Alternatively, statistical models can be used to obtain predictions purely based on the data generated from the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based models and statistical models to mitigate their individual drawbacks and provide models with better accuracy by combining the strengths of both. The proposed model enhancement methodologies include the following two streams: (1) data-driven enhancement approaches and (2) engineering-driven enhancement approaches. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring and decision optimization. Among the data-driven enhancement approaches, the Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we propose a novel enhancement procedure, named "Minimal Adjustment", which brings the physical model closer to the data by making minimal changes to it. This is achieved by approximating the GP model by a linear regression model and then applying a simultaneous variable selection of the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Rather than enhancing the model from a data-driven perspective, an alternative approach is to adjust the model by incorporating additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application, which focuses on polymer composite quality, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantification of its characteristics. In Chapter 3, we develop an engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize the nanoparticle dispersion status of nanocomposite polymer, which quantitatively represents the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of the limited amount of accessible data due to time-consuming sampling schemes. The second application statistically calibrates the engineering-driven force models of the laser-assisted micro milling (LAMM) process, which facilitates a systematic understanding and optimization of the targeted processes. In Chapter 4, the force prediction interval is derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval. To conclude, this dissertation draws attention to model enhancement, which has considerable impacts on the modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and further applied to various applications. These research activities developed engineering-compliant models for adequate system predictions based on observational data with complex variable relationships and uncertainty, which facilitate process planning, monitoring, and real-time control.

  19. [Prevention of gastrointestinal bleeding in patients with advanced burns].

    PubMed

    Vagner, D O; Krylov, K M; Verbitsky, V G; Shlyk, I V

    2018-01-01

    To reduce the incidence of gastrointestinal bleeding in patients with advanced burns by developing a prophylactic algorithm. The study consisted of a retrospective group of 488 patients with grade II-III thermal burns over 20% of body surface area and a prospective group of 135 patients with a similar thermal trauma. Standard clinical and laboratory examination was applied. Instrumental survey included fibrogastroduodenoscopy, endoscopic pH-metry and invasive volumetric monitoring (PiCCO plus). Statistical processing was carried out with Microsoft Office Excel 2007 and IBM SPSS 20.0. The new algorithm significantly decreased the incidence of gastrointestinal bleeding (p<0.001) and the mortality rate (p=0.006) in patients with advanced burns.

  20. Monitoring of vegetation dynamics on the former military training area Königsbrücker Heide using remote sensing time series

    NASA Astrophysics Data System (ADS)

    Wessollek, Christine; Karrasch, Pierre

    2016-10-01

    In 1989 about 1.5 million soldiers were stationed in Germany. With the political changes in the early 1990s, a substantial decline in personnel occurred, to currently 200,000 employees in the armed forces and fewer than 60,000 soldiers of foreign forces. These processes entailed the conversion of large areas no longer used for military purposes, especially in the new federal states in the eastern part of Germany. One of these conversion areas is the former military training area Königsbrück in Saxony. For the analysis of vegetation and its development over time, the Normalized Difference Vegetation Index (NDVI) has become established as one of the most important indicators. In this context, the questions arise whether MODIS NDVI products are suitable for detecting conversion processes on former military territories such as military training areas, and what development processes have occurred in the "Königsbrücker Heide" in the past 15 years. First, a decomposition of each time series into its trend component, seasonality, and the remaining residuals is performed. For the trend component, different regression models are tested. Statistical analysis of these trends can reveal different developments, for example in nature development zones (without human impact) and zones of controlled succession. The presented workflow is intended to show the opportunity to support high-temporal-resolution monitoring of conversion areas such as former military training areas.
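
    The decomposition step described above can be sketched with a standard additive decomposition of a 16-day NDVI series into trend, seasonality and residuals, followed by a linear fit to the trend. The series below is simulated and the period of 23 composites per year is an assumption; it is not the Königsbrücker Heide data.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.seasonal import seasonal_decompose

      rng = np.random.default_rng(9)
      dates = pd.date_range("2001-01-01", periods=23 * 15, freq="16D")   # ~15 years of 16-day composites
      t = np.arange(dates.size)
      ndvi = 0.45 + 0.0003 * t + 0.2 * np.sin(2 * np.pi * t / 23) + 0.03 * rng.normal(size=t.size)
      series = pd.Series(ndvi, index=dates)

      result = seasonal_decompose(series, period=23, model="additive")   # trend + seasonality + residuals
      trend = result.trend.interpolate().bfill().ffill()                 # fill the NaN edges of the trend
      slope, _ = np.polyfit(t, trend.to_numpy(), 1)
      print(f"estimated NDVI trend: {slope * 23:+.4f} per year")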
