Sample records for multiple process monitoring

  1. A model of human decision making in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1982-01-01

    Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
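
    The model above uses discriminant analysis to turn observed process behaviour into probability estimates that events have occurred. The sketch below illustrates that idea with scikit-learn's LinearDiscriminantAnalysis; the features, training data, and attention rule are illustrative assumptions, not the 1982 model itself.

```python
# Sketch: using discriminant analysis to estimate the probability that an
# event has occurred in each monitored process, in the spirit of the model
# above. Feature definitions and thresholds are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Simulated training data: features extracted from each process trace
# (e.g., recent mean shift and slope), labeled 1 if an event occurred.
X_train = rng.normal(size=(200, 2))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1]
           + rng.normal(scale=0.5, size=200) > 0.8).astype(int)

lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)

# For each of several monitored processes, estimate P(event occurred)
# and allocate attention to the process with the highest probability.
X_current = rng.normal(size=(4, 2))           # one feature vector per process
p_event = lda.predict_proba(X_current)[:, 1]  # probability of the "event" class
attend_to = int(np.argmax(p_event))
print(p_event, "-> attend to process", attend_to)
```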

  2. Multiple products monitoring as a robust approach for peptide quantification.

    PubMed

    Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee

    2009-07-01

    Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because the determination of quantifiable product ions using either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous process of ion selection. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarities between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
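
    The record describes a score that combines absolute product-ion intensities with the similarity between query and reference MS2 spectra, without giving the weighting. A minimal sketch of such a score, with an assumed 50/50 combination and cosine similarity, follows.

```python
# Hypothetical MpM-style score: combines total product-ion intensity with
# the cosine similarity between a query MS2 spectrum and a reference
# spectrum of the target peptide. The actual MpM weighting is not specified
# in the record above; the 50/50 combination here is an assumption.
import numpy as np

def mpm_score(query: np.ndarray, reference: np.ndarray) -> float:
    """query, reference: intensity vectors over the same product-ion bins."""
    similarity = np.dot(query, reference) / (
        np.linalg.norm(query) * np.linalg.norm(reference) + 1e-12
    )
    # Normalise absolute intensity to a 0-1 range for this toy example.
    intensity_term = np.log1p(query.sum()) / np.log1p(1e9)
    return 0.5 * similarity + 0.5 * intensity_term

reference = np.array([100.0, 80.0, 45.0, 20.0, 5.0])  # library spectrum
query = np.array([90.0, 70.0, 50.0, 15.0, 0.0])       # observed spectrum
print(round(mpm_score(query, reference), 3))
```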

  3. Multiple-User, Multitasking, Virtual-Memory Computer System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  4. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    PubMed

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    There have been problems in existing multiple physiological parameter real-time monitoring systems, such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for multiple physiological parameter monitoring based on cloud computing, with clustered background data storage and processing. Through our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The work included the resource virtualization of the IaaS layer of the cloud platform, the construction of the real-time computing platform of the PaaS layer, the reception and analysis of the data stream in the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result was real-time transmission, storage, and analysis of large amounts of physiological data. The simulation test results showed that the remote multiple physiological parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solves the problems of long turnaround time, poor real-time analysis performance, and lack of extensibility that exist in traditional remote medical services. It provides technical support for a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode of home health monitoring with wireless monitoring of multiple physiological parameters.

  5. System and process for pulsed multiple reaction monitoring

    DOEpatents

    Belov, Mikhail E

    2013-05-17

    A new pulsed multiple reaction monitoring process and system are disclosed that uses a pulsed ion injection mode for use in conjunction with triple-quadrupole instruments. The pulsed injection mode approach reduces background ion noise at the detector, increases amplitude of the ion signal, and includes a unity duty cycle that provides a significant sensitivity increase for reliable quantitation of proteins/peptides present at attomole levels in highly complex biological mixtures.

  6. Monitoring of the data processing and simulated production at CMS with a web-based service: the Production Monitoring Platform (pMp)

    NASA Astrophysics Data System (ADS)

    Franzoni, G.; Norkus, A.; Pol, A. A.; Srimanobhas, N.; Walker, J.

    2017-10-01

    Physics analysis at the Compact Muon Solenoid requires both the production of simulated events and processing of the data collected by the experiment. Since the end of the LHC Run-I in 2012, CMS has produced over 20 billion simulated events, from 75 thousand processing requests organised in one hundred different campaigns. These campaigns emulate different configurations of collision events, the detector, and LHC running conditions. In the same time span, sixteen data processing campaigns have taken place to reconstruct different portions of the Run-I and Run-II data with ever improving algorithms and calibrations. The scale and complexity of the event simulation and processing, and the requirement that multiple campaigns must proceed in parallel, demand that a comprehensive, frequently updated and easily accessible monitoring be made available. The monitoring must serve both the analysts, who want to know which datasets will become available and when, and the central production teams in charge of submitting, prioritizing, and running the requests across the distributed computing infrastructure. The Production Monitoring Platform (pMp), a web-based service, was developed in 2015 to address those needs. It aggregates information from multiple services used to define, organize, and run the processing requests. Information is updated hourly using a dedicated elastic database and the monitoring provides multiple configurable views to assess the status of single datasets as well as entire production campaigns. This contribution will describe the pMp development, the evolution of its functionalities, and one and a half years of operational experience.

  7. The design of an m-Health monitoring system based on a cloud computing platform

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  8. On the use of multi-agent systems for the monitoring of industrial systems

    NASA Astrophysics Data System (ADS)

    Rezki, Nafissa; Kazar, Okba; Mouss, Leila Hayet; Kahloul, Laid; Rezki, Djamil

    2016-03-01

    The objective of the current paper is to present an intelligent system for complex process monitoring, based on artificial intelligence technologies. This system aims to carry out all of the complex process monitoring tasks: detection, diagnosis, identification, and reconfiguration. For this purpose, the development of a multi-agent system that combines multiple intelligences, such as multivariate control charts, neural networks, Bayesian networks, and expert systems, has become a necessity. The proposed system is evaluated on the monitoring of the Tennessee Eastman process.
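
    Among the techniques combined by the multi-agent system, multivariate control charts are the most directly codifiable. Below is a minimal Hotelling T² chart for multivariate fault detection, one standard choice; the record does not state which chart the system actually uses, and the data here are simulated.

```python
# Minimal Hotelling T^2 control chart for multivariate fault detection,
# one common multivariate control chart (the record above does not state
# which chart the multi-agent system uses). All data are simulated.
import numpy as np
from scipy.stats import f as f_dist

def t2_limit(n_samples: int, n_vars: int, alpha: float = 0.01) -> float:
    """Upper control limit for T^2 of a future observation (F distribution)."""
    p, n = n_vars, n_samples
    return p * (n - 1) * (n + 1) / (n * (n - p)) * f_dist.ppf(1 - alpha, p, n - p)

rng = np.random.default_rng(1)
normal_data = rng.normal(size=(500, 4))        # in-control training data
mu = normal_data.mean(axis=0)
S_inv = np.linalg.inv(np.cov(normal_data, rowvar=False))

def t2(x: np.ndarray) -> float:
    d = x - mu
    return float(d @ S_inv @ d)

limit = t2_limit(500, 4)
new_sample = np.array([0.1, -0.2, 4.5, 0.4])   # simulated faulty observation
print("T2 =", round(t2(new_sample), 2), "limit =", round(limit, 2),
      "fault" if t2(new_sample) > limit else "in control")
```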

  9. Condition monitoring of turning process using infrared thermography technique - An experimental approach

    NASA Astrophysics Data System (ADS)

    Prasad, Balla Srinivasa; Prabha, K. Aruna; Kumar, P. V. S. Ganesh

    2017-03-01

    In metal cutting, the major factors that affect cutting tool life are machine tool vibrations, tool tip/chip temperature, and surface roughness, along with machining parameters such as cutting speed, feed rate, depth of cut, and tool geometry, so it is important for the manufacturing industry to find suitable levels of process parameters for obtaining and maintaining tool life. Heat generation has always been a main topic of study in machining. Recent advances in signal processing and information technology have led to the use of multiple sensors for the development of effective tool condition monitoring systems with improved accuracy. From a process improvement point of view, it is definitely more advantageous to proactively monitor quality directly in the process instead of in the product, so that the consequences of a defective part can be minimized or even eliminated. In the present work, a real-time process monitoring method using multiple sensors is explored. It focuses on the development of a test bed for monitoring tool condition in the turning of AISI 316L steel using both coated and uncoated carbide inserts. The proposed tool condition monitoring (TCM) approach is evaluated in high speed turning using multiple sensors, namely a laser Doppler vibrometer and infrared thermography. The results indicate the feasibility of using the dominant frequency of the vibration signals, along with temperature gradients, for the monitoring of high speed turning operations. A possible correlation is identified for both regular and irregular cutting tool wear. Cutting speed and feed rate proved to be influential parameters on the observed temperatures, while depth of cut was less influential. Generally, it is observed that lower heat and temperatures are generated when coated inserts are employed, and cutting temperatures gradually increase as edge wear and deformation develop.
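
    The study reports using the dominant frequency of the vibration signals for monitoring. A short sketch of extracting a dominant frequency from a (simulated) vibrometer signal with an FFT follows; the sampling rate and signal content are illustrative assumptions.

```python
# Sketch: extracting the dominant frequency of a vibration signal with an
# FFT, one way to obtain the spectral feature the study above reports using
# for tool condition monitoring. Signal parameters are illustrative.
import numpy as np

fs = 10_000                                    # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
# Simulated vibrometer signal: one strong tonal component plus broadband noise.
signal = (0.8 * np.sin(2 * np.pi * 1_250 * t)
          + 0.2 * np.random.default_rng(2).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(f"dominant frequency: {dominant:.0f} Hz")
```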

  10. Process tool monitoring and matching using interferometry technique

    NASA Astrophysics Data System (ADS)

    Anberg, Doug; Owen, David M.; Mileham, Jeffrey; Lee, Byoung-Ho; Bouche, Eric

    2016-03-01

    The semiconductor industry makes dramatic device technology changes over short time periods. As the semiconductor industry advances towards the 10 nm device node, more precise management and control of processing tools has become a significant manufacturing challenge. Some processes require multiple tool sets and some tools have multiple chambers for mass production. Tool and chamber matching has become a critical consideration for meeting today's manufacturing requirements. Additionally, process tools and chamber conditions have to be monitored to ensure uniform process performance across the tool and chamber fleet. There are many parameters for managing and monitoring tools and chambers. Particle defect monitoring is a well-known and established example where defect inspection tools can directly detect particles on the wafer surface. However, leading edge processes are driving the need to also monitor invisible defects, i.e. stress, contamination, etc., because some device failures cannot be directly correlated with traditional visualized defect maps or other known sources. Some failure maps show the same signatures as stress or contamination maps, which implies correlation to device performance or yield. In this paper we present process tool monitoring and matching using an interferometry technique. There are many types of interferometry techniques used for various process monitoring applications. We use a Coherent Gradient Sensing (CGS) interferometer which is self-referencing and enables high throughput measurements. Using this technique, we can quickly measure the topography of an entire wafer surface and obtain stress and displacement data from the topography measurement. For improved tool and chamber matching and reduced device failure, wafer stress measurements can be implemented as a regular tool or chamber monitoring test for either unpatterned or patterned wafers, as a useful criterion for improved process stability.
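
    The record derives stress from whole-wafer topography but does not give the stress model. One common route from measured curvature to film stress is Stoney's approximation, sketched below with placeholder silicon-wafer values; it is not necessarily the model used by the CGS system described.

```python
# Illustrative film-stress estimate from wafer curvature using Stoney's
# approximation, one standard route from topography to stress (the record
# above does not state the exact model used by the CGS system).
def stoney_stress(E_s: float, nu_s: float, h_s: float, h_f: float, R: float) -> float:
    """
    E_s  : substrate Young's modulus [Pa]
    nu_s : substrate Poisson ratio
    h_s  : substrate thickness [m]
    h_f  : film thickness [m]
    R    : radius of curvature from the topography map [m]
    Returns film stress [Pa].
    """
    return E_s * h_s**2 / (6.0 * (1.0 - nu_s) * h_f * R)

# Example: 775 um silicon wafer, 1 um film, 50 m radius of curvature (placeholders).
sigma = stoney_stress(E_s=130e9, nu_s=0.28, h_s=775e-6, h_f=1e-6, R=50.0)
print(f"film stress ~ {sigma / 1e6:.0f} MPa")
```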

  11. Children's comprehension monitoring of multiple situational dimensions of a narrative.

    PubMed

    Wassenburg, Stephanie I; Beker, Katinka; van den Broek, Paul; van der Schoot, Menno

    Narratives typically consist of information on multiple aspects of a situation. In order to successfully create a coherent representation of the described situation, readers are required to monitor all these situational dimensions during reading. However, little is known about whether these dimensions differ in the ease with which they can be monitored. In the present study, we examined whether children in Grades 4 and 6 monitor four different dimensions (i.e., emotion, causation, time, and space) during reading, using a self-paced reading task containing inconsistencies. Furthermore, to explore what causes failure in inconsistency detection, we differentiated between monitoring processes related to availability and validation of information by manipulating the distance between two pieces of conflicting information. The results indicated that the monitoring processes varied as a function of dimension. Children were able to validate emotional and causal information when it was still active in working memory, but this was not the case for temporal and spatial information. When context and target information were more distant from each other, only emotionally charged information remained available for further monitoring processes. These findings show that the influence of different situational dimensions should be taken into account when studying children's reading comprehension.

  12. 40 CFR 60.482-1a - Standards: General.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... time during the specified monitoring period (e.g., month, quarter, year), provided the monitoring is... Monthly Quarterly Semiannually. (2) Pumps and valves that are shared among two or more batch process units... be separated by at least 120 calendar days. (g) If the storage vessel is shared with multiple process...

  13. 40 CFR 60.482-1a - Standards: General.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... time during the specified monitoring period (e.g., month, quarter, year), provided the monitoring is... Monthly Quarterly Semiannually. (2) Pumps and valves that are shared among two or more batch process units... be separated by at least 120 calendar days. (g) If the storage vessel is shared with multiple process...

  14. 40 CFR 60.482-1a - Standards: General.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... time during the specified monitoring period (e.g., month, quarter, year), provided the monitoring is... Monthly Quarterly Semiannually. (2) Pumps and valves that are shared among two or more batch process units... be separated by at least 120 calendar days. (g) If the storage vessel is shared with multiple process...

  15. 40 CFR 60.482-1a - Standards: General.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... time during the specified monitoring period (e.g., month, quarter, year), provided the monitoring is... Monthly Quarterly Semiannually. (2) Pumps and valves that are shared among two or more batch process units... be separated by at least 120 calendar days. (g) If the storage vessel is shared with multiple process...

  16. 40 CFR 60.482-1a - Standards: General.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... time during the specified monitoring period (e.g., month, quarter, year), provided the monitoring is... Monthly Quarterly Semiannually. (2) Pumps and valves that are shared among two or more batch process units... be separated by at least 120 calendar days. (g) If the storage vessel is shared with multiple process...

  17. Use of Flowsheet Monitoring to Perform Environmental Evaluation of Chemical Process Flowsheets

    EPA Science Inventory

    Flowsheet monitoring interfaces have been proposed to the Cape-Open Laboratories Network to enable development of applications that access to multiple parts of the flowsheet or its thermodynamic models, without interfering with the flowsheet itself. These flowsheet monitoring app...

  18. Breaking down barriers in cooperative fault management: Temporal and functional information displays

    NASA Technical Reports Server (NTRS)

    Potter, Scott S.; Woods, David D.

    1994-01-01

    At the highest level, the fundamental question addressed by this research is how to aid human operators engaged in dynamic fault management. In dynamic fault management there is some underlying dynamic process (an engineered or physiological process referred to as the monitored process - MP) whose state changes over time and whose behavior must be monitored and controlled. In these types of applications (dynamic, real-time systems), a vast array of sensor data is available to provide information on the state of the MP. Faults disturb the MP and diagnosis must be performed in parallel with responses to maintain process integrity and to correct the underlying problem. These situations frequently involve time pressure, multiple interacting goals, high consequences of failure, and multiple interleaved tasks.

  19. Automated plasma control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma etching and desmear processes for printed wiring board (PWB) manufacture are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. These techniques are not real-time methods, however, and do not allow for immediate diagnosis and process correction. These tests often require scrapping some fraction of a batch to insure the integrity of the rest. Since these tests verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. These tests are a costly part of the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process anomalies should be detected and corrected before the parts being treated are damaged. Real time monitoring would allow for instantaneous corrections. Multiple site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored, along with applications of this technique to process control, failure analysis and endpoint determination in PWB manufacture.
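
    As an illustration of the real-time corrections argued for above, the sketch below flags an etch endpoint from a simulated optical emission intensity trace. The smoothing window and drop threshold are assumptions; a production system would track specific emission lines.

```python
# Sketch: simple endpoint detection from an optical emission trace, the kind
# of real-time check the paper argues for. A real system would track specific
# emission lines; the threshold logic here is an illustrative assumption.
import numpy as np

def detect_endpoint(intensity: np.ndarray, window: int = 10, drop: float = 0.2) -> int | None:
    """Return the index where the smoothed emission intensity has fallen by
    `drop` (fractional) relative to its initial plateau, else None."""
    smoothed = np.convolve(intensity, np.ones(window) / window, mode="valid")
    plateau = smoothed[:window].mean()
    below = np.nonzero(smoothed < (1.0 - drop) * plateau)[0]
    return int(below[0]) if below.size else None

# Simulated trace: stable emission during etch, then a fall-off at endpoint.
trace = np.concatenate([np.full(200, 1.0), np.linspace(1.0, 0.5, 50)])
trace += np.random.default_rng(3).normal(scale=0.02, size=trace.size)
print("endpoint near sample", detect_endpoint(trace))
```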

  20. Generic Raman-based calibration models enabling real-time monitoring of cell culture bioreactors.

    PubMed

    Mehdizadeh, Hamidreza; Lauri, David; Karry, Krizia M; Moshgbar, Mojgan; Procopio-Melino, Renee; Drapeau, Denis

    2015-01-01

    Raman-based multivariate calibration models have been developed for real-time in situ monitoring of multiple process parameters within cell culture bioreactors. Developed models are generic, in the sense that they are applicable to various products, media, and cell lines based on Chinese Hamster Ovarian (CHO) host cells, and are scalable to large pilot and manufacturing scales. Several batches using different CHO-based cell lines and corresponding proprietary media and process conditions have been used to generate calibration datasets, and models have been validated using independent datasets from separate batch runs. All models have been validated to be generic and capable of predicting process parameters with acceptable accuracy. The developed models allow monitoring multiple key bioprocess metabolic variables, and hence can be utilized as an important enabling tool for Quality by Design approaches which are strongly supported by the U.S. Food and Drug Administration. © 2015 American Institute of Chemical Engineers.
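
    The record does not name the chemometric method behind the calibration models; partial least squares (PLS) regression is a common choice for Raman-based multivariate calibration and is sketched below on simulated spectra and reference values.

```python
# Minimal partial least squares (PLS) calibration sketch, a common choice
# for Raman-based multivariate models (the record above does not state the
# exact chemometric method used). Spectra and reference values are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_batches, n_wavenumbers = 120, 600
spectra = rng.normal(size=(n_batches, n_wavenumbers))
# Simulated reference values (e.g., a metabolite concentration) from offline assays.
reference = spectra[:, :5].sum(axis=1, keepdims=True) + rng.normal(scale=0.1, size=(n_batches, 1))

X_train, X_test, y_train, y_test = train_test_split(spectra, reference, random_state=0)
model = PLSRegression(n_components=8)
model.fit(X_train, y_train)
print("R^2 on held-out batches:", round(model.score(X_test, y_test), 3))
```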

  1. COMPACT, CONTINUOUS MONITORING FOR VOLATILE ORGANIC COMPOUNDS - PHASE I

    EPA Science Inventory

    Improved methods for onsite measurement of multiple volatile organic compounds are needed for process control, monitoring, and remediation. This Phase I SBIR project sets forth an optical measurement method that meets these needs. The proposed approach provides an instantaneous m...

  2. Attending to Multiple Visual Streams: Interactions between Location-Based and Category-Based Attentional Selection

    ERIC Educational Resources Information Center

    Fagioli, Sabrina; Macaluso, Emiliano

    2009-01-01

    Behavioral studies indicate that subjects are able to divide attention between multiple streams of information at different locations. However, it is still unclear to what extent the observed costs reflect processes specifically associated with spatial attention, versus more general interference due the concurrent monitoring of multiple streams of…

  3. Plasma process control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma processes for cleaning, etching and desmear of electronic components and printed wiring boards (PWB) are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to insure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. Both of these methods are a costly part of the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated are damaged. Real time monitoring would allow for instantaneous corrections. Multiple site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored. A discussion of this technique as it applies towards process control, failure analysis and endpoint determination will be conducted. Methods for identifying process failures, progress and end of etch back and desmear processes will be discussed.

  4. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as: chip form, tool wear, and onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by application of chaotic characteristics.

  5. A model of human event detection in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1978-01-01

    It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Providing the event detection model with the information displayed to the experimental subjects allows comparison of the model's performance with the performance of the subjects.

  6. Localization of multiple defects using the compact phased array (CPA) method

    NASA Astrophysics Data System (ADS)

    Senyurek, Volkan Y.; Baghalian, Amin; Tashakori, Shervin; McDaniel, Dwayne; Tansel, Ibrahim N.

    2018-01-01

    Array systems of transducers have found numerous applications in detection and localization of defects in structural health monitoring (SHM) of plate-like structures. Different types of array configurations and analysis algorithms have been used to improve the process of localization of defects. For accurate and reliable monitoring of large structures by array systems, a high number of actuator and sensor elements are often required. In this study, a compact phased array system consisting of only three piezoelectric elements is used in conjunction with an updated total focusing method (TFM) for localization of single and multiple defects in an aluminum plate. The accuracy of the localization process was greatly improved by including wave propagation information in TFM. Results indicated that the proposed CPA approach can locate single and multiple defects with high accuracy while decreasing the processing costs and the number of required transducers. This method can be utilized in critical applications such as aerospace structures where the use of a large number of transducers is not desirable.
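
    The total focusing method synthesizes an image by delaying and summing every transmit-receive pair's signal at each pixel. The sketch below is a minimal delay-and-sum version with three elements, a constant wave speed, and simulated A-scans; it omits the wave-propagation refinements the paper adds.

```python
# Minimal total focusing method (TFM) sketch: delay-and-sum of every
# transmit-receive A-scan at each image pixel, assuming a constant wave
# speed and direct propagation paths. Geometry and signals are simulated.
import numpy as np

c = 5_900.0                                   # wave speed in aluminium, m/s (assumed)
fs = 25e6                                     # sampling rate, Hz
elements = np.array([[0.00, 0.0], [0.02, 0.0], [0.04, 0.0]])  # 3 transducers (x, y) in m

n_el, n_samples = len(elements), 2_000
t = np.arange(n_samples) / fs
scatterer = np.array([0.02, 0.03])            # true defect position (m)

# Simulated A-scans: a short pulse at each pair's round-trip delay.
ascans = np.zeros((n_el, n_el, n_samples))
for i in range(n_el):
    for j in range(n_el):
        delay = (np.linalg.norm(scatterer - elements[i]) +
                 np.linalg.norm(scatterer - elements[j])) / c
        ascans[i, j] = np.exp(-((t - delay) * 5e6) ** 2)

# TFM image: sum the A-scan values at the time of flight to/from each pixel.
xs, ys = np.linspace(0, 0.04, 81), np.linspace(0.01, 0.05, 81)
image = np.zeros((ys.size, xs.size))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        p = np.array([x, y])
        total = 0.0
        for i in range(n_el):
            for j in range(n_el):
                tof = (np.linalg.norm(p - elements[i]) + np.linalg.norm(p - elements[j])) / c
                total += np.interp(tof, t, ascans[i, j])
        image[iy, ix] = abs(total)

iy, ix = np.unravel_index(np.argmax(image), image.shape)
print("estimated defect position:", (round(xs[ix], 3), round(ys[iy], 3)))
```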

  7. An ultra low energy biomedical signal processing system operating at near-threshold.

    PubMed

    Hulzink, J; Konijnenburg, M; Ashouei, M; Breeschoten, A; Berset, T; Huisken, J; Stuyt, J; de Groot, H; Barat, F; David, J; Van Ginderdeuren, J

    2011-12-01

    This paper presents a voltage-scalable digital signal processing system designed for the use in a wireless sensor node (WSN) for ambulatory monitoring of biomedical signals. To fulfill the requirements of ambulatory monitoring, power consumption, which directly translates to the WSN battery lifetime and size, must be kept as low as possible. The proposed processing platform is an event-driven system with resources to run applications with different degrees of complexity in an energy-aware way. The architecture uses effective system partitioning to enable duty cycling, single instruction multiple data (SIMD) instructions, power gating, voltage scaling, multiple clock domains, multiple voltage domains, and extensive clock gating. It provides an alternative processing platform where the power and performance can be scaled to adapt to the application need. A case study on a continuous wavelet transform (CWT)-based heart-beat detection shows that the platform not only preserves the sensitivity and positive predictivity of the algorithm but also achieves the lowest energy/sample for ElectroCardioGram (ECG) heart-beat detection publicly reported today.

  8. A Robust and Resilient Network Design Paradigm for Region-Based Faults Inflicted by WMD Attack

    DTIC Science & Technology

    2016-04-01

    We investigated big data processing of PMU measurements for grid monitoring and control against possible WMD attacks. Big data processing and analytics of synchrophasor measurements, collected from multiple locations of power grids...

  9. Informing Drought Preparedness and Response with the South Asia Land Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Zaitchik, B. F.; Ghatak, D.; Matin, M. A.; Qamer, F. M.; Adhikary, B.; Bajracharya, B.; Nelson, J.; Pulla, S. T.; Ellenburg, W. L.

    2017-12-01

    Decision-relevant drought monitoring in South Asia is a challenge from both a scientific and an institutional perspective. Scientifically, climatic diversity, inconsistent in situ monitoring, complex hydrology, and incomplete knowledge of atmospheric processes mean that monitoring and prediction are fraught with uncertainty. Institutionally, drought monitoring efforts need to align with the information needs and decision-making processes of relevant agencies at national and subnational levels. Here we present first results from an emerging operational drought monitoring and forecast system developed and supported by the NASA SERVIR Hindu-Kush Himalaya hub. The system has been designed in consultation with end users from multiple sectors in South Asian countries to maximize decision-relevant information content in the monitoring and forecast products. Monitoring of meteorological, agricultural, and hydrological drought is accomplished using the South Asia Land Data Assimilation System, a platform that supports multiple land surface models and meteorological forcing datasets to characterize uncertainty, and subseasonal to seasonal hydrological forecasts are produced by driving South Asia LDAS with downscaled meteorological fields drawn from an ensemble of global dynamically-based forecast systems. Results are disseminated to end users through a Tethys online visualization platform and custom communications that provide user oriented, easily accessible, timely, and decision-relevant scientific information.

  10. Early Detection of Cancer by Affinity Mass Spectrometry-Set Aside funds — EDRN Public Portal

    Cancer.gov

    A.   RATIONALE The recent introduction of multiple reaction monitoring capabilities offers unprecedented capability to the research arsenal available to protein based biomarker discovery. Specific to the discovery process this technology offers an ability to monitor specific protein changes in concentration and/or post-translational modification. The ability to accurately confirm specific biomarkers in a sensitive and reproducible manner is critical to the confirmation and pre-validation process. We are proposing two collaborative studies that promise to develop Multiple Reaction Monitoring (MRM) work flows for the biomarker scientific community and specifically for EDRN. B.   GOALS The overall goal for this proposal is the identification of protein biomarkers that can be associated with prostate cancer detection. The underlying goal is the application of a novel technological approach aided by MRM toward biomarker discovery. An additional goal will be the dissemination of knowledge gained from these studies EDRN-wide.

  11. OpenLMD, multimodal monitoring and control of LMD processing

    NASA Astrophysics Data System (ADS)

    Rodríguez-Araújo, Jorge; García-Díaz, Antón

    2017-02-01

    This paper presents OpenLMD, a novel open-source solution for on-line multimodal monitoring of Laser Metal Deposition (LMD). The solution is also applicable to a wider range of laser-based applications that require on-line control (e.g. laser welding). OpenLMD is a middleware that enables the orchestration and virtualization of a LMD robot cell, using several open-source frameworks (e.g. ROS, OpenCV, PCL). The solution also allows reconfiguration by easy integration of multiple sensors and processing equipment. As a result, OpenLMD delivers significant advantages over existing monitoring and control approaches, such as improved scalability, and multimodal monitoring and data sharing capabilities.

  12. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, support from a NoSQL database, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework computes reliability indices for critical bridge components and individual bridge spans. In addition, it includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.

  13. Developing inventory and monitoring programs based on multiple objectives

    NASA Astrophysics Data System (ADS)

    Schmoldt, Daniel L.; Peterson, David L.; Silsbee, David G.

    1994-09-01

    Resource inventory and monitoring (I&M) programs in national parks combine multiple objectives in order to create a plan of action over a finite time horizon. Because all program activities are constrained by time and money, it is critical to plan I&M activities that make the best use of available agency resources. However, multiple objectives complicate a relatively straightforward allocation process. The analytic hierarchy process (AHP) offers a structure for multiobjective decision making so that decision-makers’ preferences can be formally incorporated in seeking potential solutions. Within the AHP, inventory and monitoring program objectives and decision criteria are organized into a hierarchy. Pairwise comparisons among decision elements at any level of the hierarchy provide a ratio scale ranking of those elements. The resulting priority values for all projects are used as each project’s contribution to the value of an overall I&M program. These priorities, along with budget and personnel constraints, are formulated as a zero/one integer programming problem that can be solved to select those projects that produce the best program. An extensive example illustrates how this approach is being applied to I&M projects in national parks in the Pacific Northwest region of the United States. The proposed planning process provides an analytical framework for multicriteria decisionmaking that is rational, consistent, explicit, and defensible.
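
    The workflow described above derives project priorities from AHP pairwise comparisons and then selects projects under constraints with zero/one integer programming. The sketch below mirrors that idea with a principal-eigenvector priority calculation and a brute-force subset search; the comparison values, costs, and budget are illustrative assumptions, and the selection step stands in for a dedicated integer programming solver.

```python
# Sketch of the AHP-plus-0/1-selection idea described above: derive project
# priorities from a pairwise comparison matrix (principal eigenvector), then
# pick the best subset under a budget. Comparison values, costs, and budget
# are illustrative; brute force replaces a real integer programming solver.
import itertools
import numpy as np

# Pairwise comparisons of four candidate I&M projects (Saaty 1-9 scale).
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
priorities = w / w.sum()                       # each project's contribution to the program

costs = np.array([40, 25, 20, 15])             # e.g., thousands of dollars (assumed)
budget = 60

best_value, best_subset = -1.0, ()
for r in range(len(costs) + 1):
    for subset in itertools.combinations(range(len(costs)), r):
        idx = list(subset)
        if costs[idx].sum() <= budget and priorities[idx].sum() > best_value:
            best_value, best_subset = priorities[idx].sum(), subset
print("priorities:", np.round(priorities, 3), "select projects:", best_subset)
```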

  14. Course Modules on Structural Health Monitoring with Smart Materials

    ERIC Educational Resources Information Center

    Shih, Hui-Ru; Walters, Wilbur L.; Zheng, Wei; Everett, Jessica

    2009-01-01

    Structural Health Monitoring (SHM) is an emerging technology that has multiple applications. SHM emerged from the wide field of smart structures, and it also encompasses disciplines such as structural dynamics, materials and structures, nondestructive testing, sensors and actuators, data acquisition, signal processing, and possibly much more. To…

  15. Coral Reef Monitoring Needs Assessment Workshop U.S. Virgin Islands

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) and U.S. Virgin Island Department of Planning and Natural Resources (DPNR) held a workshop September 11-13, 2007 in St. Croix, U.S. Virgin Islands (USVI) to begin the process of designing a monitoring program that meets multiple mana...

  16. Exact Performance Analysis of Two Distributed Processes with Multiple Synchronization Points.

    DTIC Science & Technology

    1987-05-01

    ...number of processes with straight-line sequences of semaphore operations. We use the geometric model for performance analysis, in contrast to proving...

  17. Processing deficits in monitoring analog and digital displays: Implications for attentional theory and mental-state estimation research

    NASA Technical Reports Server (NTRS)

    Payne, David G.; Gunther, Virginia A. L.

    1988-01-01

    Subjects performed short term memory tasks, involving both spatial and verbal components, and a visual monitoring task involving either analog or digital display formats. These two tasks (memory vs. monitoring) were performed both singly and in conjunction. Contrary to expectations derived from multiple resource theories of attentional processes, there was no evidence that when the two tasks involved the same cognitive codes (i.e., either both spatial or both verbal/linguistic) there was more of a dual-task performance decrement than when the two tasks employed different cognitive codes/processes. These results are discussed in terms of their implications for theories of attentional processes and also for research in mental state estimation.

  18. Groundwater data for selected wells within the Eastern San Joaquin Groundwater Subbasin, California, 2003-8

    USGS Publications Warehouse

    Clark, Dennis A.; Izbicki, John A.; Metzger, Loren F.; Everett, Rhett; Smith, Gregory A.; O'Leary, David R.; Teague, Nicholas F.; Burgess, Matthew K.

    2012-01-01

    Data were collected by the U.S. Geological Survey from 2003 through 2008 in the Eastern San Joaquin Groundwater Subbasin, 80 miles east of San Francisco, California, as part of a study of the processes contributing to increasing chloride concentrations in groundwater. Data collected include geologic, geophysical, chemical, and hydrologic data collected during and after the installation of five multiple-well monitoring sites, from three existing multiple-well sites, and from 79 selected public-supply, irrigation, and domestic wells. Each multiple-well monitoring site installed as part of this study contained three to five 2-inch diameter polyvinyl chloride (PVC)-cased wells ranging in depth from 68 to 880 feet below land surface. Continuous water-level data were collected from the 19 wells installed at these 5 sites and from 10 existing monitoring wells at 3 additional multiple-well sites in the study area. Thirty-one electromagnetic logs were collected seasonally from the deepest PVC-cased monitoring well at seven multiple-well sites. About 200 water samples were collected from 79 wells in the study area. Coupled well-bore flow data and depth-dependent water-quality data were collected from 12 production wells under pumped conditions, and well-bore flow data were collected from 10 additional wells under unpumped conditions.

  19. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments

    PubMed Central

    Chung, Lisa M.; Colangelo, Christopher M.; Zhao, Hongyu

    2014-01-01

    Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre‑processing where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre‑processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets. PMID:24905083

  20. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments.

    PubMed

    Chung, Lisa M; Colangelo, Christopher M; Zhao, Hongyu

    2014-06-05

    Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre‑processing where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre‑processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets.
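
    The two records above describe an automated pre-processing pipeline for MRM peak areas. A toy sketch of the same kinds of steps (replicate-level quality assessment, flagging of inaccurate transitions, normalization) follows; the column names, thresholds, and normalization choice are assumptions, not the pipeline's actual parameters.

```python
# Sketch of the kind of automated pre-processing the records above describe:
# per-transition quality checks across replicates, simple outlier flagging,
# and normalization. Column names and thresholds are illustrative assumptions.
import numpy as np
import pandas as pd

# Toy MRM peak-area table: one row per (peptide, transition, replicate).
data = pd.DataFrame({
    "peptide":    ["PEP1"] * 6 + ["PEP2"] * 6,
    "transition": ["y4", "y5", "y6"] * 4,
    "replicate":  [1, 1, 1, 2, 2, 2] * 2,
    "area":       [1e5, 8e4, 6e4, 1.1e5, 7.9e4, 5e3,   # 5e3: suspect transition
                   2e5, 1.5e5, 9e4, 2.1e5, 1.6e5, 9.5e4],
})

# 1) Quality assessment: coefficient of variation across replicates.
cv = (data.groupby(["peptide", "transition"])["area"]
          .agg(lambda a: a.std(ddof=1) / a.mean()))

# 2) Flag transitions whose CV exceeds a (hypothetical) 20% threshold.
bad = cv[cv > 0.20].index
clean = data.set_index(["peptide", "transition"]).drop(index=bad).reset_index()

# 3) Normalize each replicate by its total signal (one simple choice).
totals = clean.groupby("replicate")["area"].transform("sum")
clean["norm_area"] = clean["area"] / totals

print("flagged transitions:", list(bad))
print(clean.head())
```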

  1. Children's Ability to Distinguish between Memories from Multiple Sources: Implications for the Quality and Accuracy of Eyewitness Statements.

    ERIC Educational Resources Information Center

    Roberts, Kim P.

    2002-01-01

    Outlines five perspectives addressing alternate aspects of the development of children's source monitoring: source-monitoring theory, fuzzy-trace theory, schema theory, person-based perspective, and mental-state reasoning model. Discusses research areas with relation to forensic developmental psychology: agent identity, prospective processing,…

  2. Intelligent composting assisted by a wireless sensing network.

    PubMed

    López, Marga; Martinez-Farre, Xavier; Casas, Oscar; Quilez, Marcos; Polo, Jose; Lopez, Oscar; Hornero, Gemma; Pinilla, Mirta R; Rovira, Carlos; Ramos, Pedro M; Borges, Beatriz; Marques, Hugo; Girão, Pedro Silva

    2014-04-01

    Monitoring the moisture and temperature of the composting process is a key factor in obtaining a quality product, beyond the quality of the raw materials. Current methodologies for monitoring these two parameters are time consuming for workers, sometimes not sufficiently reliable to help decision-making, and are thus ignored in some cases. This article describes an advance in monitoring of the composting process through a Wireless Sensor Network (WSN), the Compo-ball system, that allows real-time measurement of temperature and moisture at multiple points in the composting material. To implement such measurement capabilities on-line, a WSN composed of multiple sensor nodes was designed and implemented to provide staff with an efficient composting management and monitoring tool. After framing the problem, the objectives and characteristics of the WSN are briefly discussed and a short description of the hardware and software of the network's components is presented. Presentation and discussion of practical issues and results obtained with the WSN during a demonstration stage that took place at several composting sites concludes the paper. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Event-Driven Messaging for Offline Data Quality Monitoring at ATLAS

    NASA Astrophysics Data System (ADS)

    Onyisi, Peter

    2015-12-01

    During LHC Run 1, the information flow through the offline data quality monitoring in ATLAS relied heavily on chains of processes polling each other's outputs for handshaking purposes. This resulted in a fragile architecture with many possible points of failure and an inability to monitor the overall state of the distributed system. We report on the status of a project undertaken during the LHC shutdown to replace the ad hoc synchronization methods with a uniform message queue system. This enables the use of standard protocols to connect processes on multiple hosts; reliable transmission of messages between possibly unreliable programs; easy monitoring of the information flow; and the removal of inefficient polling-based communication.
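
    A minimal in-process illustration of the polling-to-messaging shift described above is given below, using the Python standard library queue as a stand-in for a message broker spanning multiple hosts; the record does not name the actual queueing technology.

```python
# Minimal in-process illustration of replacing polling with message-driven
# handoff, using the standard library queue as a stand-in for a real message
# broker spanning multiple hosts (the record does not name the broker used).
import queue
import threading

events = queue.Queue()

def producer() -> None:
    # e.g., a merging job announcing that runs are ready for quality checks
    for run in (355_001, 355_002, 355_003):
        events.put({"run": run, "status": "merged"})
    events.put(None)  # sentinel: no more work

def consumer() -> None:
    # Blocks until a message arrives -- no polling loop, no handshake files.
    while (msg := events.get()) is not None:
        print(f"run {msg['run']}: launching data-quality checks")

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for th in threads:
    th.start()
for th in threads:
    th.join()
```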

  4. In house validation of a high resolution mass spectrometry Orbitrap-based method for multiple allergen detection in a processed model food.

    PubMed

    Pilolli, Rosa; De Angelis, Elisabetta; Monaci, Linda

    2018-02-13

    In recent years, mass spectrometry (MS) has been establishing its role in the development of analytical methods for multiple allergen detection, but most analyses are being carried out on low-resolution mass spectrometers such as triple quadrupole or ion traps. In this investigation, the performance provided by a high resolution (HR) hybrid quadrupole-Orbitrap™ MS platform for multiple allergen detection in a processed food matrix is presented. In particular, three different acquisition modes were compared: full-MS, targeted-selected ion monitoring with data-dependent fragmentation (t-SIM/dd2), and parallel reaction monitoring. In order to challenge the HR-MS platform, the sample preparation was kept as simple as possible, limited to a 30-min ultrasound-aided protein extraction followed by clean-up with disposable size exclusion cartridges. Selected peptide markers tracing for five allergenic ingredients, namely skim milk, whole egg, soy flour, ground hazelnut, and ground peanut, were monitored in home-made cookies chosen as the model processed matrix. Timed t-SIM/dd2 was found to be the best choice, offering a good compromise between sensitivity and accuracy and accomplishing the detection of 17 peptides originating from the five allergens in the same run. The optimized method was validated in-house through the evaluation of matrix and processing effects, recoveries, and precision. The selected quantitative markers provided quantification of 60-100 μg of each allergenic ingredient per g of matrix in incurred cookies.

  5. Multiple-Diode-Laser Gas-Detection Spectrometer

    NASA Technical Reports Server (NTRS)

    Webster, Christopher R.; Beer, Reinhard; Sander, Stanley P.

    1988-01-01

    Small concentrations of selected gases measured automatically. Proposed multiple-laser-diode spectrometer part of system for measuring automatically concentrations of selected gases at part-per-billion level. Array of laser/photodetector pairs measures infrared absorption spectrum of atmosphere along probing laser beams. Adaptable to terrestrial uses such as pollution monitoring or control of industrial processes.
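
    The concentration retrieval behind such an absorption measurement is commonly a Beer-Lambert calculation; a sketch with placeholder cross-section and path-length values follows (these are not parameters of the instrument described above).

```python
# Illustrative Beer-Lambert retrieval: converting the measured attenuation of
# a probe laser beam into a gas number density. The absorption cross-section
# and path length are placeholder values, not parameters from the record above.
import math

def number_density(I: float, I0: float, sigma: float, L: float) -> float:
    """
    I, I0 : transmitted and incident intensities (same units)
    sigma : absorption cross-section at the laser wavelength [cm^2]
    L     : optical path length [cm]
    Returns molecules per cm^3.
    """
    return -math.log(I / I0) / (sigma * L)

n = number_density(I=0.999, I0=1.000, sigma=1e-19, L=1_000.0)
print(f"{n:.2e} molecules/cm^3")
```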

  6. Time takes space: selective effects of multitasking on concurrent spatial processing.

    PubMed

    Mäntylä, Timo; Coni, Valentina; Kubik, Veit; Todorov, Ivo; Del Missier, Fabio

    2017-08-01

    Many everyday activities require coordination and monitoring of complex relations of future goals and deadlines. Cognitive offloading may provide an efficient strategy for reducing control demands by representing future goals and deadlines as a pattern of spatial relations. We tested the hypothesis that multiple-task monitoring involves time-to-space transformational processes, and that these spatial effects are selective with greater demands on coordinate (metric) than categorical (nonmetric) spatial relation processing. Participants completed a multitasking session in which they monitored four series of deadlines, running on different time scales, while making concurrent coordinate or categorical spatial judgments. We expected and found that multitasking taxes concurrent coordinate, but not categorical, spatial processing. Furthermore, males showed a better multitasking performance than females. These findings provide novel experimental evidence for the hypothesis that efficient multitasking involves metric relational processing.

  7. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets are assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), visualization (ViSiT, Paraview, D3, QGIS) as well as numerous tools written in python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected using sensors and analytical tools developed by multiple manufacturers which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands) of sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular vertically integrated cloud based software framework which was designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of data ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user one in which capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automated ingestion and processing of electrical geophysical data and for co-analysis and visualization of the raw and processed data with other data of interest (e.g. soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  8. Deadlines in space: Selective effects of coordinate spatial processing in multitasking.

    PubMed

    Todorov, Ivo; Del Missier, Fabio; Konke, Linn Andersson; Mäntylä, Timo

    2015-11-01

    Many everyday activities require coordination and monitoring of multiple deadlines. One way to handle these temporal demands might be to represent future goals and deadlines as a pattern of spatial relations. We examined the hypothesis that spatial ability, in addition to executive functioning, contributes to individual differences in multitasking. In two studies, participants completed a multitasking session in which they monitored four digital clocks running at different rates. In Study 1, we found that individual differences in spatial ability and executive functions were independent predictors of multiple-task performance. In Study 2, we found that individual differences in specific spatial abilities were selectively related to multiple-task performance, as only coordinate spatial processing, but not categorical, predicted multitasking, even beyond executive functioning and numeracy. In both studies, males outperformed females in spatial ability and multitasking and in Study 2 these sex differences generalized to a simulation of everyday multitasking. Menstrual changes moderated the effects on multitasking, in that sex differences in coordinate spatial processing and multitasking were observed between males and females in the luteal phase of the menstrual cycle, but not between males and females at menses. Overall, these findings suggest that multiple-task performance reflects independent contributions of spatial ability and executive functioning. Furthermore, our results support the distinction of categorical versus coordinate spatial processing, and suggest that these two basic relational processes are selectively affected by female sex hormones and differentially effective in transforming and handling temporal patterns as spatial relations in the context of multitasking.

  9. Monitoring system of multiple fire fighting based on computer vision

    NASA Astrophysics Data System (ADS)

    Li, Jinlong; Wang, Li; Gao, Xiaorong; Wang, Zeyong; Zhao, Quanke

    2010-10-01

    With the high demand for fire control in spacious buildings, computer vision is playing an increasingly important role. This paper presents a new monitoring system for multiple fire fighting based on computer vision and color detection. The system can aim at the fire position and then extinguish the fire by itself. In this paper, the system structure, working principle, fire orientation, hydrant angle adjustment and system calibration are described in detail, and the design of the relevant hardware and software is introduced. The principle and process of color detection and image processing are given as well. The system ran well in testing and offers high reliability, low cost and easy node expansion, giving it a bright prospect for application and popularization.
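    A minimal sketch of the color-detection idea behind such a system, using an OpenCV-style HSV threshold; the color band and blob-size cutoff are illustrative values, not the authors' calibration.

        import cv2
        import numpy as np

        def detect_fire_regions(frame_bgr, min_area=200):
            """Return centroids of flame-colored blobs in a BGR video frame (OpenCV 4)."""
            hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
            # Rough flame color band: reddish-yellow hues with high saturation and value.
            mask = cv2.inRange(hsv, (0, 120, 180), (35, 255, 255))
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            centroids = []
            for c in contours:
                if cv2.contourArea(c) >= min_area:
                    m = cv2.moments(c)
                    centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
            return centroids   # pixel coordinates to convert into hydrant aiming angles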

  10. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    PubMed Central

    Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated microscope stage. Images are captured at multiple spots on the device during the operations to monitor samples in the microchambers in parallel; yet the device position may vary at different time points throughout the operations as the device moves back and forth on the motorized microscope stage. Here, we report an image-based positioning strategy to realign the chamber position before each microscopic image is recorded. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248
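    The realignment step can be illustrated with simple template matching against an alignment-mark image; this mirrors the idea only, not the authors' algorithms, and the function and parameter names are assumptions.

        import cv2

        def chamber_offset(frame_gray, mark_template, preset_xy):
            """Estimate (dx, dy) between the detected alignment mark and its preset position."""
            result = cv2.matchTemplate(frame_gray, mark_template, cv2.TM_CCOEFF_NORMED)
            _, _, _, max_loc = cv2.minMaxLoc(result)      # top-left corner of the best match
            return (preset_xy[0] - max_loc[0], preset_xy[1] - max_loc[1])

    The returned offset can then be used to shift or crop the captured image (or to command the motorized stage) so that each chamber appears at its preset pixel coordinates before the image is stored.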

  11. Pervasive technology in Neonatal Intensive Care Unit: a prototype for newborns unobtrusive monitoring.

    PubMed

    Ciani, Oriana; Piccini, Luca; Parini, Sergio; Rullo, Alessia; Bagnoli, Franco; Marti, Patrizia; Andreoni, Giuseppe

    2008-01-01

    Pervasive computing research is introducing new perspectives in a wide range of applications, including the healthcare domain. In this study we explore the possibility of realizing a prototype system for unobtrusive recording and monitoring of multiple biological parameters in premature newborns hospitalized in the Neonatal Intensive Care Unit (NICU). It consists of three different units: a sensitized belt for electrocardiogram (ECG) and chest dilatation monitoring, augmented with extrinsic transducers for temperature and respiratory activity measurement; a device for signal pre-processing, sampling and transmission through Bluetooth(R) (BT) technology to a remote PC station; and software for data capture and post-processing. Preliminary results obtained by monitoring babies just discharged from the ward demonstrated the feasibility of unobtrusive monitoring in this kind of subject and opened a new scenario for premature newborn monitoring and developmental care practice in the NICU.

  12. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework, which utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision making can benefit not only from known time-series relationships among measured signals but also from known event-sequence relationships among generated events. This available knowledge at both the time-series and discrete-event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The proposed approach is then applied to an illustrative monitored system based on pyroprocessing, and the results are discussed.
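    A toy sketch of the hybrid idea, combining a time-driven band check on a measured signal with an event-driven check against an assumed nominal event-sequence model; both models below are placeholders, not the paper's architecture.

        ALLOWED_TRANSITIONS = {          # assumed nominal operation-sequence model
            "idle": {"load"},
            "load": {"process"},
            "process": {"unload"},
            "unload": {"idle"},
        }

        def time_layer_ok(signal_window, low, high):
            """Time-driven layer: all recent samples lie inside the nominal band."""
            return all(low <= x <= high for x in signal_window)

        def event_layer_ok(prev_event, event):
            """Event-driven layer: the observed transition is allowed by the model."""
            return event in ALLOWED_TRANSITIONS.get(prev_event, set())

        def anomaly(signal_window, prev_event, event, low=0.0, high=1.0):
            return not (time_layer_ok(signal_window, low, high)
                        and event_layer_ok(prev_event, event))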

  13. Task-dependent response conflict monitoring and cognitive control in anterior cingulate and dorsolateral prefrontal cortices.

    PubMed

    Kim, Chobok; Chung, Chongwook; Kim, Jeounghoon

    2013-11-06

    Previous experience affects our behavior in terms of adjustments. It has been suggested that the conflict monitor-controller system implemented in the prefrontal cortex plays a critical role in such adjustments. Previous studies suggested that there exist multiple conflict monitor-controller systems associated with the level of information (i.e., stimulus and response levels). In this study, we sought to test whether different types of conflicts occurring at the same information-processing level (i.e., the response level) are processed independently. For this purpose, we designed a task paradigm to measure two different types of response conflicts using color-based and location-based conflict stimuli and measured the conflict adaptation effects associated with the two types of conflicts either independently (i.e., single-conflict conditions) or simultaneously (i.e., a double-conflict condition). The behavioral results demonstrated that performance on current incongruent trials was faster only when the preceding trial involved the same type of response conflict, regardless of whether the trials included a single or a double conflict. Imaging data also showed that anterior cingulate and dorsolateral prefrontal cortices operate in a task-specific manner. These findings suggest that there may be multiple monitor-controller loops for color-based and location-based conflicts even at the same response level. Importantly, our results suggest that double-conflict processing is qualitatively different from single-conflict processing, although a double conflict shares the same sources of conflict with the two single-conflict conditions. © 2013 Published by Elsevier B.V.

  14. External scintigraphy in monitoring the behavior of pharmaceutical formulations in vivo I: technique for acquiring high-resolution images of tablets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theodorakis, M.C.; Simpson, D.R.; Leung, D.M.

    1983-02-01

    A new method for monitoring tablet disintegration in vivo was developed. In this method, the tablets were labeled with a short-lived radionuclide, technetium 99m, and monitored by a gamma camera. Several innovations were introduced with this method. First, computer reconstruction algorithms were used to enhance the scintigraphic images of the disintegrating tablet in vivo. Second, the use of a four-pinhole collimator to acquire multiple views of the tablet resulted in high count rates and reduced acquisition times of the scintigraphic images. Third, the magnification of the scintigraphic images achieved by pinhole collimation led to significant improvement in resolution. Fourth, the radionuclide was incorporated into the granulation so that the whole mass of the tablet was uniformly labeled with high levels of activity. This technique allowed the continuous monitoring of the disintegration process of tablets in vivo in experimental animals. Multiple pinhole collimation and the labeling process permitted the acquisition of quality scintigraphic images of the labeled tablet every 30 sec. The resolution of the method was tested in vitro and in vivo.

  15. Measurement of company effectiveness using analytic network process method

    NASA Astrophysics Data System (ADS)

    Goran, Janjić; Zorana, Tanasić; Borut, Kosec

    2017-07-01

    The sustainable development of an organisation is monitored through its performance, with all stakeholders' requirements incorporated into the strategy in advance. The strategic management concept enables organisations to monitor and evaluate their effectiveness and efficiency by tracking the implementation of the strategic goals they have set. In the process of monitoring and measuring effectiveness, an organisation can draw on multiple-criteria decision-making methods. This study uses the analytic network process (ANP) method to define the weight factors of the mutual influences among all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on these weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions change the relative importance of certain elements of an organisation's business for competitive advantage, and increasing emphasis is placed on non-material resources when selecting the organisation's most important measures.
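    The final aggregation step can be illustrated as a weighted sum of degree-of-fulfilment scores, with the weights coming from the ANP prioritization; the perspective names and numbers below are illustrative, not taken from the study.

        weights = {"customer": 0.35, "finance": 0.30, "process": 0.20, "learning": 0.15}
        fulfilment = {"customer": 0.80, "finance": 0.65, "process": 0.90, "learning": 0.70}

        effectiveness = sum(weights[k] * fulfilment[k] for k in weights)
        print(f"Organisational effectiveness: {effectiveness:.2f}")   # 0.76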

  16. Assessment of curing behavior of light-activated dental composites using intensity correlation based multiple reference optical coherence tomography.

    PubMed

    Dsouza, Roshan; Subhash, Hrebesh; Neuhaus, Kai; Kantamneni, Ramakrishna; McNamara, Paul M; Hogan, Josh; Wilson, Carol; Leahy, Martin

    2016-01-01

    Monitoring the curing kinetics of light-activated resin is a key area of research. These resins are used in restorative applications and particularly in dental applications. They can undergo volumetric shrinkage due to poor control of the depth-dependent curing process, modulated by the intensity and duration of the curing light source. This often results in the formation of marginal gaps, causing pain and damage to the restoration site. In this study, we demonstrate the capabilities of a correlation method applied using a multiple reference optical coherence tomography (MR-OCT) architecture to monitor the curing of the resin. The system operates at a center wavelength of 1310 nm with an A-scan rate of 1200 A-scans per second. The axial and lateral resolutions of the system are ∼13 μm and ∼27 μm, respectively. The method to determine the intensity correlation between adjacent B-frames is based on the Pearson correlation coefficient for a region of interest. Calculating the correlation coefficient between the first B-frame and subsequent B-frames at regularly spaced time points shows, for a non-cured resin, a reduction of the correlation coefficient over time due to Brownian motion. The time constant of the reduction of the correlation value is a measure of the progress of the polymerization during LED light irradiation of the resin. The proposed approach is potentially a low-cost, powerful and unique optical imaging modality for measuring the curing behavior of dental resin and other resins, coatings, and adhesives in medical and industrial applications. To demonstrate the proposed method to monitor the curing process, a light-activated resin composite from GRADIA DIRECT ANTERIOR (GC Corporation, Japan) is studied. The curing time of the resin was measured and monitored as a function of depth. The correlation coefficient method is highly sensitive to Brownian motion. The process of curing results in a change in intensity as measured by the MR-OCT signal and hence can be monitored using this method. These results show that MR-OCT has the potential to measure the curing time and monitor the curing process as a function of depth. Moreover, MR-OCT as a product has potential to be compact, low-cost and to fit into a smartphone. Using such a device for monitoring the curing of the resin will be suitable for dentists in stationary and mobile clinical settings. © 2015 Wiley Periodicals, Inc.
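    The correlation measure itself is straightforward to sketch: the Pearson coefficient between a region of interest in the first B-frame and the same region in later frames, assuming the B-frames are available as 2-D intensity arrays.

        import numpy as np

        def roi_correlation(frames, roi):
            """Correlation of each frame's ROI with the first frame's ROI; roi = (r0, r1, c0, c1)."""
            r0, r1, c0, c1 = roi
            ref = frames[0][r0:r1, c0:c1].ravel()
            return [np.corrcoef(ref, f[r0:r1, c0:c1].ravel())[0, 1] for f in frames]

    Fitting an exponential decay to the returned sequence yields the time constant that the record above uses as the measure of curing progress.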

  17. Diagnosis and treatment of latent tuberculosis in patients with multiple sclerosis, expert consensus. On behalf of the Colombian Association of Neurology, Committee of Multiple Sclerosis.

    PubMed

    Navas, Carlos; Torres-Duque, Carlos A; Munoz-Ceron, Joe; Álvarez, Carlos; García, Juan R; Zarco, Luis; Vélez, Lázaro A; Awad, Carlos; Castro, Carlos Alberto

    2018-01-01

    Multiple sclerosis is an inflammatory and neurodegenerative demyelinating disease. Current treatment of multiple sclerosis focuses on the use of immunomodulatory, immunosuppressant, and selective immunosuppressant agents. Some of these medications may result in a high risk of opportunistic infections, including tuberculosis. The purpose of this study was to obtain consensus from a panel of neurologists, pulmonologists, infectious disease specialists, and epidemiology experts regarding the diagnosis, treatment, and monitoring of latent tuberculosis in patients with multiple sclerosis. A panel of experts in multiple sclerosis and tuberculosis was established. The methodological process was performed in three phases: definition of questions, answers using the Delphi methodology, and discussion of the questions without agreement. Tuberculosis screening is suggested when multiple sclerosis drugs are prescribed. The recommended tests for latent tuberculosis are the tuberculin test and the interferon-gamma release test. When anti-tuberculosis treatment is indicated, monitoring of liver enzyme values should be performed, with consideration of age as well as comorbid conditions such as a history of alcoholism, obesity, concomitant hepatotoxic drugs, and a history of liver disease. Latent tuberculosis should be considered in patients with multiple sclerosis who are going to be treated with immunomodulatory and immunosuppressant medications. Transaminase level monitoring is required on a periodic basis depending on clinical and laboratory characteristics. In addition to liver impairment, other side effects should be considered when isoniazid is prescribed.

  18. Universal SaaS platform of internet of things for real-time monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tongke; Wu, Gang

    2018-04-01

    Real-time monitoring, as one class of IoT (Internet of Things) service, has a wide range of application scenarios. To support rapid construction and deployment of applications and to avoid repetitive development work in these processes, this paper designs and develops a universal IoT SaaS platform for real-time monitoring. Evaluation shows that this platform can provide SaaS service to multiple tenants and achieve high real-time performance under a large volume of device access.

  19. Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.
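    The rule-based idea can be sketched generically as conditions evaluated over a telemetry snapshot that fire notifications; the real agent uses the Jess rule engine, and the rules, thresholds and signal names below are purely illustrative.

        rules = [
            {"name": "hydraulic_pressure_low",
             "when": lambda t: t.get("hyd_pressure_psi", float("inf")) < 2800,
             "notify": "hydraulics engineer"},
            {"name": "purge_temp_high",
             "when": lambda t: t.get("purge_temp_f", float("-inf")) > 95,
             "notify": "ECS engineer"},
        ]

        def evaluate(telemetry):
            """Return the notifications fired by the current telemetry snapshot."""
            return [(r["name"], r["notify"]) for r in rules if r["when"](telemetry)]

        print(evaluate({"hyd_pressure_psi": 2650, "purge_temp_f": 80}))
        # [('hydraulic_pressure_low', 'hydraulics engineer')]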

  20. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, David E.; Coble, Jamie B.; Jordan, David V.

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
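    A conceptual sketch of the multivariate step, with PCA standing in for the multivariate analysis: fit the model to spectra recorded under normal conditions and flag new spectra whose reconstruction error exceeds a baseline threshold. This is not the MIP Monitor's actual implementation, and the spectra are assumed to be channel-count vectors.

        import numpy as np
        from sklearn.decomposition import PCA

        def fit_normal_model(normal_spectra, n_components=5):
            """Fit PCA to spectra recorded under normal process conditions."""
            pca = PCA(n_components=n_components).fit(normal_spectra)
            recon = pca.inverse_transform(pca.transform(normal_spectra))
            q_normal = np.sum((normal_spectra - recon) ** 2, axis=1)   # residual (Q) statistic
            return pca, q_normal.mean() + 3 * q_normal.std()           # model and threshold

        def is_anomalous(pca, threshold, spectrum):
            recon = pca.inverse_transform(pca.transform(spectrum[None, :]))[0]
            return np.sum((spectrum - recon) ** 2) > threshold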

  1. Acoustical Detection Of Leakage In A Combustor

    NASA Technical Reports Server (NTRS)

    Puster, Richard L.; Petty, Jeffrey L.

    1993-01-01

    Abnormal combustion excites characteristic standing wave. Acoustical leak-detection system gives early warning of failure, enabling operating personnel to stop combustion process and repair spray bar before leak grows large enough to cause damage. Applicable to engines, gas turbines, furnaces, and other machines in which acoustic emissions at known frequencies signify onset of damage. Bearings in rotating machines monitored for emergence of characteristic frequencies shown in previous tests associated with incipient failure. Also possible to monitor for signs of trouble at multiple frequencies by feeding output of transducer simultaneously to multiple band-pass filters and associated circuitry, including separate trigger circuit set to appropriate level for each frequency.
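    A software sketch of the multiple-frequency monitoring idea: one band-pass filter and one RMS trigger level per characteristic frequency; the bands and levels are placeholders, not values from the tests described above.

        import numpy as np
        from scipy.signal import butter, sosfilt

        def band_rms(x, fs, f_lo, f_hi):
            """RMS level of signal x inside one frequency band."""
            sos = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
            return np.sqrt(np.mean(sosfilt(sos, x) ** 2))

        def tripped_bands(x, fs, bands):
            """bands: list of (f_lo, f_hi, trigger_level); returns the bands above threshold."""
            return [(lo, hi) for lo, hi, level in bands if band_rms(x, fs, lo, hi) > level]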

  2. Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.

    PubMed

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang

    2016-11-01

    Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types. Both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification and a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently and with easily accessible tools prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.

  3. Smart Phase Tuning in Microwave Photonic Integrated Circuits Toward Automated Frequency Multiplication by Design

    NASA Astrophysics Data System (ADS)

    Nabavi, N.

    2018-07-01

    The author investigates monitoring methods for fine adjustment of a previously proposed on-chip architecture for frequency multiplication and translation of harmonics by design. Digital signal processing (DSP) algorithms are utilized to create an optimized microwave photonic integrated circuit functionality toward automated frequency multiplication. The implemented DSP algorithms are based on the discrete Fourier transform and on optimization algorithms (greedy and gradient-based), which are analytically derived and numerically compared with respect to accuracy and speed-of-convergence criteria.

  4. Monitoring multiple components in vinegar fermentation using Raman spectroscopy.

    PubMed

    Uysal, Reyhan Selin; Soykut, Esra Acar; Boyaci, Ismail Hakki; Topcu, Ali

    2013-12-15

    In this study, the utility of Raman spectroscopy (RS) with chemometric methods for quantification of multiple components in the fermentation process was investigated. Vinegar, the product of a two-stage fermentation, was used as a model, and glucose and fructose consumption, ethanol production and consumption, and acetic acid production were followed using RS and the partial least squares (PLS) method. Calibration of the PLS method was performed using model solutions. The prediction capability of the method was then investigated with both model and real samples. HPLC was used as a reference method. Comparison of the RS-PLS results with HPLC showed good correlations between predicted and actual sample values for glucose (R(2)=0.973), fructose (R(2)=0.988), ethanol (R(2)=0.996) and acetic acid (R(2)=0.983). In conclusion, a combination of RS with chemometric methods can be applied to monitor multiple components of the fermentation process from start to finish with a single measurement in a short time. Copyright © 2013 Elsevier Ltd. All rights reserved.
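    The RS-PLS calibration and prediction step can be sketched with a standard PLS regression; the array shapes, component count and random calibration data below are assumptions used only to make the example self-contained.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # X_cal: (n_samples, n_wavenumbers) Raman spectra of calibration (model) solutions
        # Y_cal: (n_samples, 4) known glucose, fructose, ethanol and acetic acid levels
        rng = np.random.default_rng(0)
        X_cal = rng.normal(size=(40, 600))        # stand-in spectra for a runnable example
        Y_cal = rng.normal(size=(40, 4))          # stand-in concentrations

        pls = PLSRegression(n_components=8).fit(X_cal, Y_cal)
        concentrations = pls.predict(X_cal[:1])   # predicts all four analytes from one spectrum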

  5. Application of Multiregressive Linear Models, Dynamic Kriging Models and Neural Network Models to Predictive Maintenance of Hydroelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Lucifredi, A.; Mazzieri, C.; Rossi, M.

    2000-05-01

    Since the operating conditions of a hydroelectric unit can vary within a wide range, the monitoring system must be able to distinguish between variations of the monitored variable caused by changes in operating conditions and those due to the onset and progression of failures and misoperation. The paper aims to identify the best technique to adopt for the monitoring system. Three different methods have been implemented and compared. Two of them use statistical techniques: the first, multiple linear regression, expresses the monitored variable as a linear function of the process parameters (independent variables), while the second, the dynamic kriging technique, is a modified form of multiple linear regression that represents the monitored variable as a linear combination of the process variables chosen so as to minimize the variance of the estimation error. The third is based on neural networks. Tests have shown that the monitoring system based on the kriging technique is not affected by several problems common to the other two models, such as the requirement for a large amount of data for tuning (both for training the neural network and for defining the optimum plane for the multiple regression), not only in the start-up phase but also after a trivial maintenance operation involving the substitution of machinery components with a direct impact on the observed variable, or the need for different models to describe satisfactorily the different operating ranges of the plant. The monitoring system based on the kriging statistical technique overcomes these difficulties: it does not require a large amount of data for tuning and is immediately operational (given two points, a third can be estimated immediately); in addition, the model follows the system without adapting itself to it. The results of the experiments performed indicate that a model based on a neural network or on multiple linear regression is not optimal, and that a different approach is necessary to reduce the amount of work during the learning phase, using, when available, all the information stored during the initial phase of the plant to build the reference baseline and, where appropriate, elaborating the raw information available. A mixed approach using the kriging statistical technique together with neural network techniques could optimise the result.
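    A hedged sketch of the kriging-style estimator, with a Gaussian-process regressor standing in for the paper's dynamic kriging: the monitored variable is modeled from the process parameters, and the residual between measurement and estimate is what the monitor would track. The data below are synthetic stand-ins.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        X = rng.uniform(size=(50, 3))                       # process parameters (stand-in data)
        y = X @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.normal(size=50)   # monitored variable

        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X, y)
        estimate, sigma = gp.predict(X[:1], return_std=True)
        residual = y[0] - estimate[0]                       # a large residual suggests a fault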

  6. Self-regulated learning processes of medical students during an academic learning task.

    PubMed

    Gandomkar, Roghayeh; Mirzazadeh, Azim; Jalili, Mohammad; Yazdani, Kamran; Fata, Ladan; Sandars, John

    2016-10-01

    This study was designed to identify the self-regulated learning (SRL) processes of medical students during a biomedical science learning task and to examine the associations of the SRL processes with previous performance in biomedical science examinations and subsequent performance on a learning task. A sample of 76 Year 1 medical students were recruited based on their performance in biomedical science examinations and stratified into previous high and low performers. Participants were asked to complete a biomedical science learning task. Participants' SRL processes were assessed before (self-efficacy, goal setting and strategic planning), during (metacognitive monitoring) and after (causal attributions and adaptive inferences) their completion of the task using an SRL microanalytic interview. Descriptive statistics were used to analyse the means and frequencies of SRL processes. Univariate and multiple logistic regression analyses were conducted to examine the associations of SRL processes with previous examination performance and the learning task performance. Most participants (from 88.2% to 43.4%) reported task-specific processes for SRL measures. Students who exhibited higher self-efficacy (odds ratio [OR] 1.44, 95% confidence interval [CI] 1.09-1.90) and reported task-specific processes for metacognitive monitoring (OR 6.61, 95% CI 1.68-25.93) and causal attributions (OR 6.75, 95% CI 2.05-22.25) measures were more likely to be high previous performers. Multiple analysis revealed that similar SRL measures were associated with previous performance. The use of task-specific processes for causal attributions (OR 23.00, 95% CI 4.57-115.76) and adaptive inferences (OR 27.00, 95% CI 3.39-214.95) measures were associated with being a high learning task performer. In multiple analysis, only the causal attributions measure was associated with high learning task performance. Self-efficacy, metacognitive monitoring and causal attributions measures were associated positively with previous performance. Causal attributions and adaptive inferences measures were associated positively with learning task performance. These findings may inform remediation interventions in the early years of medical school training. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  7. Square2 - A Web Application for Data Monitoring in Epidemiological and Clinical Studies

    PubMed

    Schmidt, Carsten Oliver; Krabbe, Christine; Schössow, Janka; Albers, Martin; Radke, Dörte; Henke, Jörg

    2017-01-01

    Valid scientific inferences from epidemiological and clinical studies require high data quality. Data generating departments therefore aim to detect data irregularities as early as possible in order to guide quality management processes. In addition, after the completion of data collections the obtained data quality must be evaluated. This can be challenging in complex studies due to a wide scope of examinations, numerous study variables, multiple examiners, devices, and examination centers. This paper describes a Java EE web application used to monitor and evaluate data quality in institutions with complex and multiple studies, named Square2. It uses the Java libraries Apache MyFaces 2, extended by BootsFaces for layout and style. RServe and REngine manage calls to R server processes. All study data and metadata are stored in PostgreSQL. R is the statistics backend and LaTeX is used for the generation of print ready PDF reports. A GUI manages the entire workflow. Square2 covers all steps in the data monitoring workflow, including the setup of studies and their structure, the handling of metadata for data monitoring purposes, selection of variables, upload of data, statistical analyses, and the generation as well as inspection of quality reports. To take into account data protection issues, Square2 comprises an extensive user rights and roles concept.

  8. Automated monitoring of medical protocols: a secure and distributed architecture.

    PubMed

    Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F

    2003-03-01

    The control of the right application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully, or semi, autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, and so specialized domain agents are independent of negotiation processes and autonomous system agents perform monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, that is responsible of obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.

  9. An integrated condition-monitoring method for a milling process using reduced decomposition features

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Wu, Bo; Wang, Yan; Hu, Youmin

    2017-08-01

    Complex and non-stationary cutting chatter affects productivity and quality in the milling process. Developing an effective condition-monitoring approach is critical to accurately identify cutting chatter. In this paper, an integrated condition-monitoring method is proposed, where reduced features are used to efficiently recognize and classify machine states in the milling process. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition, and Shannon power spectral entropy is calculated to extract features from the decomposed signals. Principal component analysis is adopted to reduce feature size and computational cost. With the extracted feature information, the probabilistic neural network model is used to recognize and classify the machine states, including stable, transition, and chatter states. Experimental studies are conducted, and results show that the proposed method can effectively detect cutting chatter during different milling operation conditions. This monitoring method is also efficient enough to satisfy fast machine state recognition and classification.
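    A sketch of the feature pipeline with the variational-mode-decomposition step assumed to be done elsewhere (modes passed in as arrays): Shannon power-spectral entropy per mode, then PCA to reduce the feature vector before classification. Names and shapes are assumptions, not the authors' code.

        import numpy as np
        from scipy.signal import welch
        from sklearn.decomposition import PCA

        def spectral_entropy(mode, fs):
            """Shannon entropy of the normalized power spectral density of one mode."""
            _, psd = welch(mode, fs=fs)
            p = psd / psd.sum()
            return -np.sum(p * np.log(p + 1e-12))

        def feature_matrix(signals_modes, fs):
            """signals_modes: list of per-signal lists of 1-D decomposed modes."""
            return np.array([[spectral_entropy(m, fs) for m in modes]
                             for modes in signals_modes])

        # reduced = PCA(n_components=3).fit_transform(feature_matrix(modes_list, fs))
        # The reduced features then feed the probabilistic neural network classifier.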

  10. Using Knowledge Base for Event-Driven Scheduling of Web Monitoring Systems

    NASA Astrophysics Data System (ADS)

    Kim, Yang Sok; Kang, Sung Won; Kang, Byeong Ho; Compton, Paul

    Web monitoring systems report any changes to their target web pages by revisiting them frequently. As they operate under significant resource constraints, it is essential to minimize revisits while ensuring minimal delay and maximum coverage. Various statistical scheduling methods have been proposed to resolve this problem; however, they are static and cannot easily cope with events in the real world. This paper proposes a new scheduling method that manages unpredictable events. An MCRDR (Multiple Classification Ripple-Down Rules) document classification knowledge base was reused to detect events and to initiate a prompt web monitoring process independent of a static monitoring schedule. Our experiment demonstrates that the approach improves monitoring efficiency significantly.

  11. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. ... 2.5.1 Multiple Reference Processes ... 2.5.2 Adding Monitor Processes ... model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite

  12. FTIR MONITORING OF THE VENTILATION AIR OF CRITICAL BUILDINGS

    EPA Science Inventory

    Fourier transform infrared (FTIR) spectroscopy has been used for detailed analysis of environmental and industrial process samples for many years. FTIR spectrometers have the capability of measuring multiple compounds simultaneously, thus providing an advantage over most other me...

  13. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-ray fluorescence (XRF) analysis as a Process Analytical Technology (PAT) to evaluate a small-particle coating process. XRF analysis was used to monitor the coating level in the small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study was conducted with simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores. Particles with this two-layered coating are sufficient to demonstrate the small-particle coating process. From the results, it was found that the XRF signals played different roles: the signals from the first coating (layering) and the second coating (mask coating) reflected the extent of coating through different mechanisms. Furthermore, coating of particles of different sizes was also investigated to evaluate the size effect in these coating processes. From these results, it was concluded that XRF can be used as a PAT tool for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  14. Final technical report. In-situ FT-IR monitoring of a black liquor recovery boiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James Markham; Joseph Cosgrove; David Marran

    1999-05-31

    This project developed and tested advanced Fourier transform infrared (FT-IR) instruments for process monitoring of black liquor recovery boilers. The state-of-the-art FT-IR instruments successfully operated in the harsh environment of a black liquor recovery boiler and provided a wealth of real-time process information. Concentrations of multiple gas species were simultaneously monitored in-situ across the combustion flow of the boiler and extractively at the stack. Sensitivity to changes of particulate fume and carryover levels in the process flow were also demonstrated. Boiler set-up and operation is a complex balance of conditions that influence the chemical and physical processes in the combustion flow. Operating parameters include black liquor flow rate, liquor temperature, nozzle pressure, primary air, secondary air, tertiary air, boiler excess oxygen and others. The in-process information provided by the FT-IR monitors can be used as a boiler control tool since species indicative of combustion efficiency (carbon monoxide, methane) and pollutant emissions (sulfur dioxide, hydrochloric acid and fume) were monitored in real-time and observed to fluctuate as operating conditions were varied. A high priority need of the U.S. industrial boiler market is improved measurement and control technology. The sensor technology demonstrated in this project is applicable to the need of industry.

  15. Multiple Two-Way Time Message Exchange (TTME) Time Synchronization for Bridge Monitoring Wireless Sensor Networks

    PubMed Central

    Shi, Fanrong; Tuo, Xianguo; Yang, Simon X.; Li, Huailiang; Shi, Rui

    2017-01-01

    Wireless sensor networks (WSNs) have been widely used to collect valuable information in Structural Health Monitoring (SHM) of bridges, using various sensors, such as temperature, vibration and strain sensors. Since multiple sensors are distributed on the bridge, accurate time synchronization is very important for multi-sensor data fusion and information processing. Based on the shape of the bridge, a spanning tree is employed in this paper to build linear-topology WSNs and achieve time synchronization. Two-way time message exchange (TTME) and maximum likelihood estimation (MLE) are employed for clock offset estimation. Multiple TTMEs are proposed to obtain a subset of TTME observations. A timeout restriction and retry mechanism are employed to avoid the estimation errors that are caused by continuous clock offset and software latencies. The simulation results show that the proposed algorithm could avoid the estimation errors caused by clock drift and minimize the estimation error due to large random delay jitter. The proposed algorithm is an accurate, low-complexity time synchronization algorithm for bridge health monitoring. PMID:28471418

  16. Multiple Two-Way Time Message Exchange (TTME) Time Synchronization for Bridge Monitoring Wireless Sensor Networks.

    PubMed

    Shi, Fanrong; Tuo, Xianguo; Yang, Simon X; Li, Huailiang; Shi, Rui

    2017-05-04

    Wireless sensor networks (WSNs) have been widely used to collect valuable information in Structural Health Monitoring (SHM) of bridges, using various sensors, such as temperature, vibration and strain sensors. Since multiple sensors are distributed on the bridge, accurate time synchronization is very important for multi-sensor data fusion and information processing. Based on the shape of the bridge, a spanning tree is employed in this paper to build linear-topology WSNs and achieve time synchronization. Two-way time message exchange (TTME) and maximum likelihood estimation (MLE) are employed for clock offset estimation. Multiple TTMEs are proposed to obtain a subset of TTME observations. A timeout restriction and retry mechanism are employed to avoid the estimation errors that are caused by continuous clock offset and software latencies. The simulation results show that the proposed algorithm could avoid the estimation errors caused by clock drift and minimize the estimation error due to large random delay jitter. The proposed algorithm is an accurate, low-complexity time synchronization algorithm for bridge health monitoring.
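    The clock-offset estimation at the heart of TTME can be sketched as follows; averaging the per-exchange offsets is the maximum-likelihood estimate when delay jitter is Gaussian, and the timeout/retry logic described above is omitted.

        def offset_estimate(exchanges):
            """exchanges: (t1, t2, t3, t4) tuples; t1/t4 on the local clock, t2/t3 on the reference."""
            per_exchange = [((t2 - t1) + (t3 - t4)) / 2.0 for t1, t2, t3, t4 in exchanges]
            return sum(per_exchange) / len(per_exchange)

        # True offset of +0.5 time units with symmetric 0.1-unit link delays:
        obs = [(10.0, 10.6, 10.7, 10.3), (20.0, 20.6, 20.7, 20.3)]
        print(offset_estimate(obs))   # ~0.5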

  17. On-line, continuous monitoring in solar cell and fuel cell manufacturing using spectral reflectance imaging

    DOEpatents

    Sopori, Bhushan; Rupnowski, Przemyslaw; Ulsh, Michael

    2016-01-12

    A monitoring system 100 comprising a material transport system 104 providing for the transportation of a substantially planar material 102, 107 through the monitoring zone 103 of the monitoring system 100. The system 100 also includes a line camera 106 positioned to obtain multiple line images across a width of the material 102, 107 as it is transported through the monitoring zone 103. The system 100 further includes an illumination source 108 providing for the illumination of the material 102, 107 transported through the monitoring zone 103 such that light reflected in a direction normal to the substantially planar surface of the material 102, 107 is detected by the line camera 106. A data processing system 110 is also provided in digital communication with the line camera 106. The data processing system 110 is configured to receive data output from the line camera 106 and further configured to calculate and provide substantially contemporaneous information relating to a quality parameter of the material 102, 107. Also disclosed are methods of monitoring a quality parameter of a material.

  18. MRI in the assessment and monitoring of multiple sclerosis: an update on best practice

    PubMed Central

    Kaunzner, Ulrike W.; Gauthier, Susan A.

    2017-01-01

    Magnetic resonance imaging (MRI) has developed into the most important tool for the diagnosis and monitoring of multiple sclerosis (MS). Its high sensitivity for the evaluation of inflammatory and neurodegenerative processes in the brain and spinal cord has made it the most commonly used technique for the evaluation of patients with MS. Moreover, MRI has become a powerful tool for treatment monitoring, safety assessment as well as for the prognostication of disease progression. Clinically, the use of MRI has increased in the past couple decades as a result of improved technology and increased availability that now extends well beyond academic centers. Consequently, there are numerous studies supporting the role of MRI in the management of patients with MS. The aim of this review is to summarize the latest insights into the utility of MRI in MS. PMID:28607577

  19. Dynamic electrical impedance imaging with the interacting multiple model scheme.

    PubMed

    Kim, Kyung Youn; Kim, Bong Seok; Kim, Min Chan; Kim, Sin; Isaacson, David; Newell, Jonathan C

    2005-04-01

    In this paper, an effective dynamical EIT imaging scheme is presented for on-line monitoring of the abruptly changing resistivity distribution inside the object, based on the interacting multiple model (IMM) algorithm. The inverse problem is treated as a stochastic nonlinear state estimation problem with the time-varying resistivity (state) being estimated on-line with the aid of the IMM algorithm. In the design of the IMM algorithm multiple models with different process noise covariance are incorporated to reduce the modeling uncertainty. Simulations and phantom experiments are provided to illustrate the proposed algorithm.
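    A compact, illustrative IMM sketch for a scalar state with two Kalman-filter models that differ only in process-noise level; the scalar setting, the noise values and the transition matrix are simplifications for illustration, not the authors' EIT formulation.

        import numpy as np

        def imm_step(z, x, P, mu, Q=(1e-4, 1e-1), R=1e-2,
                     T=np.array([[0.95, 0.05], [0.05, 0.95]])):
            """One IMM cycle: mixing, per-model Kalman update, probability update, fusion."""
            x, P, mu = np.asarray(x, float), np.asarray(P, float), np.asarray(mu, float)
            c = T.T @ mu                                   # predicted model probabilities
            w = (T * mu[:, None]) / c[None, :]             # mixing weights w[i, j]
            x0 = w.T @ x                                   # mixed means per model
            P0 = np.array([np.sum(w[:, j] * (P + (x - x0[j]) ** 2)) for j in range(2)])
            xn, Pn, lik = np.zeros(2), np.zeros(2), np.zeros(2)
            for j, q in enumerate(Q):                      # Kalman filter per model (random walk)
                xp, Pp = x0[j], P0[j] + q
                S = Pp + R                                 # innovation variance
                K = Pp / S
                xn[j], Pn[j] = xp + K * (z - xp), (1 - K) * Pp
                lik[j] = np.exp(-0.5 * (z - xp) ** 2 / S) / np.sqrt(2 * np.pi * S)
            mu_new = c * lik
            mu_new /= mu_new.sum()
            return xn, Pn, mu_new, mu_new @ xn             # per-model states and fused estimate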

  20. Microseismic Monitoring of the Mounds Drill Cuttings Injection Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branagan, P.T.; Mahrer, K.D.; Moschovidis, Z.A.

    This paper describes the microseismic mapping of repeated injections of drill cuttings into two separate formations at a test site near Mounds, OK. Injections were performed in sandstone and shale formations at depths of 830 and 595 m, respectively. Typical injection disposal was simulated using multiple small-volume injections over a three-day period, with long shut-in periods interspersed between the injections. Microseismic monitoring was achieved using a 5-level array of wireline-run, triaxial-accelerometer receivers in a monitor well 76 m from the disposal well. Results of the mapped microseismic locations showed that the disposal domain generally aligns with the major horizontal stress, with some variations in azimuth, and that wide variations in height and length growth occurred with continued injections. These experiments show that the cuttings injection process can be adequately monitored from a downhole, wireline-run receiver array, thus providing process control and environmental assurance.

  1. Multiple sensor fault diagnosis for dynamic processes.

    PubMed

    Li, Cheng-Chih; Jeng, Jyh-Cheng

    2010-10-01

    Modern industrial plants are usually large scale and contain a great number of sensors. Sensor fault diagnosis is crucial and necessary for process safety and optimal operation. This paper proposes a systematic approach to detect, isolate and identify multiple sensor faults for multivariate dynamic systems. The current work first defines deviation vectors for sensor observations, and further defines and derives the basic sensor fault matrix (BSFM), consisting of the normalized basic fault vectors, by several different methods. By projecting a process deviation vector onto the space spanned by the BSFM, this research uses a vector with the resulting weights in each direction for multiple sensor fault diagnosis. This study also proposes a novel monitoring index and derives the corresponding sensor fault detectability. The study further utilizes that vector to isolate and identify multiple sensor faults, and discusses the isolatability and identifiability. Simulation examples and comparison with two conventional PCA-based contribution plots are presented to demonstrate the effectiveness of the proposed methodology. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
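    The projection step can be sketched as a least-squares decomposition of the observed deviation vector onto the basic fault directions, with dominant weights pointing to the faulty sensors; the fault matrix below is a trivial placeholder, not one derived by the paper's methods.

        import numpy as np

        def fault_weights(bsfm, deviation):
            """bsfm: (n_sensors, n_faults) matrix whose columns are unit fault directions."""
            w, *_ = np.linalg.lstsq(bsfm, deviation, rcond=None)
            return w

        bsfm = np.eye(4)                            # trivial example: one direction per sensor
        deviation = np.array([0.02, 1.35, -0.01, 0.90])
        print(fault_weights(bsfm, deviation))       # large weights flag sensors 2 and 4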

  2. Comparison of pharmacokinetic behavior of two iridoid glycosides in rat plasma after oral administration of crude Cornus officinalis and its jiuzhipin by high performance liquid chromatography triple quadrupole mass spectrometry combined with multiple reactions monitoring mode

    PubMed Central

    Chen, Xiaocheng; Cao, Gang; Jiang, Jianping

    2014-01-01

    Objective: The present study examined the pharmacokinetic profiles of two iridoid glycosides, morroniside and loganin, in rat plasma after oral administration of crude and processed Cornus officinalis. Materials and Methods: A rapid, selective and specific high-performance liquid chromatography/electrospray ionization tandem mass spectrometry method with multiple reaction monitoring mode was developed to simultaneously investigate the pharmacokinetic profiles of morroniside and loganin in rat plasma after oral administration of crude C. officinalis and its jiuzhipin. Results: The morroniside and loganin in crude and processed C. officinalis could be simultaneously determined within 7.4 min. Linear calibration curves were obtained over the concentration ranges of 45.45-4800 ng/mL for all the analytes. The intra- and inter-day precisions (relative standard deviation) were less than 2.84% and 4.12%, respectively. Conclusion: The pharmacokinetic parameters of the two iridoid glucosides were also compared systematically between crude and processed C. officinalis. This paper provides theoretical support for further explaining the processing mechanism of Traditional Chinese Medicines. PMID:24914290

  3. Fatal overdoses involving hydromorphone and morphine among inpatients: a case series

    PubMed Central

    Lowe, Amanda; Hamilton, Michael; Greenall BScPhm MHSc, Julie; Ma, Jessica; Dhalla, Irfan; Persaud, Nav

    2017-01-01

    Background: Opioids have narrow therapeutic windows, and errors in ordering or administration can be fatal. The purpose of this study was to describe deaths involving hydromorphone and morphine, which have similar-sounding names, but different potencies. Methods: In this case series, we describe deaths of patients admitted to hospital or residents of long-term care facilities that involved hydromorphone and morphine. We searched for deaths referred to the Patient Safety Review Committee of the Office of the Chief Coroner for Ontario between 2007 and 2012, and subsequently reviewed by 2014. We reviewed each case to identify intervention points where errors could have been prevented. Results: We identified 8 cases involving decedents aged 19 to 91 years. The cases involved errors in prescribing, order processing and transcription, dispensing, administration and monitoring. For 7 of the 8 cases, there were multiple (2 or more) possible intervention points. Six cases may have been prevented by additional patient monitoring, and 5 cases involved dispensing errors. Interpretation: Opioid toxicity deaths in patients living in institutions can be prevented at multiple points in the prescribing and dispensing processes. Interventions aimed at preventing errors in hydromorphone and morphine prescribing, administration and patient monitoring should be implemented and rigorously evaluated. PMID:28401133

  4. Fatal overdoses involving hydromorphone and morphine among inpatients: a case series.

    PubMed

    Lowe, Amanda; Hamilton, Michael; Greenall BScPhm MHSc, Julie; Ma, Jessica; Dhalla, Irfan; Persaud, Nav

    2017-01-01

    Opioids have narrow therapeutic windows, and errors in ordering or administration can be fatal. The purpose of this study was to describe deaths involving hydromorphone and morphine, which have similar-sounding names, but different potencies. In this case series, we describe deaths of patients admitted to hospital or residents of long-term care facilities that involved hydromorphone and morphine. We searched for deaths referred to the Patient Safety Review Committee of the Office of the Chief Coroner for Ontario between 2007 and 2012, and subsequently reviewed by 2014. We reviewed each case to identify intervention points where errors could have been prevented. We identified 8 cases involving decedents aged 19 to 91 years. The cases involved errors in prescribing, order processing and transcription, dispensing, administration and monitoring. For 7 of the 8 cases, there were multiple (2 or more) possible intervention points. Six cases may have been prevented by additional patient monitoring, and 5 cases involved dispensing errors. Opioid toxicity deaths in patients living in institutions can be prevented at multiple points in the prescribing and dispensing processes. Interventions aimed at preventing errors in hydromorphone and morphine prescribing, administration and patient monitoring should be implemented and rigorously evaluated.

  5. When goals collide: monitoring the goals of multiple characters.

    PubMed

    Magliano, Joseph P; Taylor, Holly A; Kim, Hyun-Jeong Joyce

    2005-12-01

    Most story plots contain multiple characters who are independent, interact, and often have conflicting goals. One would expect that narrative understanding would require monitoring of the goals, concerns, and situations of multiple agents. There is considerable evidence that understanders monitor the primary protagonist's goal plans (e.g., Suh & Trabasso, 1993). However, there is relatively little research on the extent to which understanders monitor the goals of multiple agents. We investigated the impact of characters' roles and prominence on the extent to which understanders monitor the goal plans of multiple characters in a feature length film. In Experiment 1, participants made situation change judgments, and in Experiment 2, they verbally described scenes. Both types of judgments indicated that viewers monitor the goals and plans of multiple agents but do so to a greater extent for characters more prominent to the plotline.

  6. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During manufacturing and storage process, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability or pharmacokinetics and pharmacodynamics profile and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which could pose difficulty when implementing in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides comparable results to the traditional assays. To ensure future application in the QC environment, this method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates facile understanding of process impact on multiple quality attributes, while being QC friendly and cost-effective.

  7. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, the change in AE signal characteristics can be better interpreted as the result of the change in only the states of the process. Punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of AE signals generated during the stretching became more distinctive and can be more effectively used as tools for process monitoring.
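    The deconvolution itself can be sketched as division of the measured AE spectrum by the instrumentation transfer function, with a small regularization term to avoid amplification where the response is weak; the function below is an illustrative sketch, not the authors' implementation.

        import numpy as np

        def deconvolve(measured, system_impulse_response, eps=1e-3):
            """Wiener-style regularized inverse filtering of an AE record."""
            n = len(measured)
            M = np.fft.rfft(measured, n)
            H = np.fft.rfft(system_impulse_response, n)
            S = M * np.conj(H) / (np.abs(H) ** 2 + eps)   # regularized spectral division
            return np.fft.irfft(S, n)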

  8. Monitoring technique for multiple power splitter-passive optical networks using a tunable OTDR and FBGs

    NASA Astrophysics Data System (ADS)

    Hann, Swook; Kim, Dong-Hwan; Park, Chang-Soo

    2006-04-01

    A monitoring technique for multiple power splitter-passive optical networks (PS-PON) is presented. The technique is based on the remote sensing of fiber Bragg gratings (FBGs) using a tunable OTDR. To monitor the multiple PS-PON, an FBG is used as a wavelength-dependent reflective reference on each branch end of the PS. The FBGs help discern individual events in the multiple PS-PON during monitoring, in combination with information from the Rayleigh backscattered power. The multiple PS-PON can be analyzed with this monitoring method at the central office during 10-Gbit/s in-service operation.

  9. Reactor Operations Monitoring System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M.M.

    1989-01-01

    The Reactor Operations Monitoring System (ROMS) is a VME based, parallel processor data acquisition and safety action system designed by the Equipment Engineering Section and Reactor Engineering Department of the Savannah River Site. The ROMS will be analyzing over 8 million signal samples per minute. Sixty-eight microprocessors are used in the ROMS in order to achieve a real-time data analysis. The ROMS is composed of multiple computer subsystems. Four redundant computer subsystems monitor 600 temperatures with 2400 thermocouples. Two computer subsystems share the monitoring of 600 reactor coolant flows. Additional computer subsystems are dedicated to monitoring 400 signals from assorted process sensors. Data from these computer subsystems are transferred to two redundant process display computer subsystems which present process information to reactor operators and to reactor control computers. The ROMS is also designed to carry out safety functions based on its analysis of process data. The safety functions include initiating a reactor scram (shutdown), the injection of neutron poison, and the loadshed of selected equipment. A complete development Reactor Operations Monitoring System has been built. It is located in the Program Development Center at the Savannah River Site and is currently being used by the Reactor Engineering Department in software development. The Equipment Engineering Section is designing and fabricating the process interface hardware. Upon proof of hardware and design concept, orders will be placed for the final five systems located in the three reactor areas, the reactor training simulator, and the hardware maintenance center.

  10. Multimodal Wireless Sensor Network-Based Ambient Assisted Living in Real Homes with Multiple Residents

    PubMed Central

    Tunca, Can; Alemdar, Hande; Ertan, Halil; Incel, Ozlem Durmaz; Ersoy, Cem

    2014-01-01

    Human activity recognition and behavior monitoring in a home setting using wireless sensor networks (WSNs) provide a great potential for ambient assisted living (AAL) applications, ranging from health and wellbeing monitoring to resource consumption monitoring. However, due to the limitations of the sensor devices, challenges in wireless communication and the challenges in processing large amounts of sensor data in order to recognize complex human activities, WSN-based AAL systems are not effectively integrated in the home environment. Additionally, given the variety of sensor types and activities, selecting the most suitable set of sensors in the deployment is an important task. In order to investigate and propose solutions to such challenges, we introduce a WSN-based multimodal AAL system compatible for homes with multiple residents. Particularly, we focus on the details of the system architecture, including the challenges of sensor selection, deployment, networking and data collection and provide guidelines for the design and deployment of an effective AAL system. We also present the details of the field study we conducted, using the systems deployed in two different real home environments with multiple residents. With these systems, we are able to collect ambient sensor data from multiple homes. This data can be used to assess the wellbeing of the residents and identify deviations from everyday routines, which may be indicators of health problems. Finally, in order to elaborate on the possible applications of the proposed AAL system and to exemplify directions for processing the collected data, we provide the results of several human activity inference experiments, along with examples on how such results could be interpreted. We believe that the experiences shared in this work will contribute towards accelerating the acceptance of WSN-based AAL systems in the home setting. PMID:24887044

  11. Multimodal wireless sensor network-based ambient assisted living in real homes with multiple residents.

    PubMed

    Tunca, Can; Alemdar, Hande; Ertan, Halil; Incel, Ozlem Durmaz; Ersoy, Cem

    2014-05-30

    Human activity recognition and behavior monitoring in a home setting using wireless sensor networks (WSNs) provide a great potential for ambient assisted living (AAL) applications, ranging from health and wellbeing monitoring to resource consumption monitoring. However, due to the limitations of the sensor devices, challenges in wireless communication and the challenges in processing large amounts of sensor data in order to recognize complex human activities, WSN-based AAL systems are not effectively integrated in the home environment. Additionally, given the variety of sensor types and activities, selecting the most suitable set of sensors in the deployment is an important task. In order to investigate and propose solutions to such challenges, we introduce a WSN-based multimodal AAL system compatible for homes with multiple residents. Particularly, we focus on the details of the system architecture, including the challenges of sensor selection, deployment, networking and data collection and provide guidelines for the design and deployment of an effective AAL system. We also present the details of the field study we conducted, using the systems deployed in two different real home environments with multiple residents. With these systems, we are able to collect ambient sensor data from multiple homes. This data can be used to assess the wellbeing of the residents and identify deviations from everyday routines, which may be indicators of health problems. Finally, in order to elaborate on the possible applications of the proposed AAL system and to exemplify directions for processing the collected data, we provide the results of several human activity inference experiments, along with examples on how such results could be interpreted. We believe that the experiences shared in this work will contribute towards accelerating the acceptance of WSN-based AAL systems in the home setting.

  12. Use of Low-Cost Acquisition Systems with an Embedded Linux Device for Volcanic Monitoring

    PubMed Central

    Moure, David; Torres, Pedro; Casas, Benito; Toma, Daniel; Blanco, María José; Del Río, Joaquín; Manuel, Antoni

    2015-01-01

    This paper describes the development of a low-cost multiparameter acquisition system for volcanic monitoring that is applicable to gravimetry and geodesy, as well as to the visual monitoring of volcanic activity. The acquisition system was developed on a System on a Chip (SoC), the Broadcom BCM2835, running a Linux operating system (based on Debian) that allows for the construction of a complete monitoring system offering multiple possibilities for storage, data-processing, configuration, and the real-time monitoring of volcanic activity. This multiparametric acquisition system was developed with a software environment, as well as with different hardware modules designed for each parameter to be monitored. The device presented here has been used and validated under different scenarios for monitoring ocean tides, ground deformation, and gravity, as well as for image-based monitoring of the island of Tenerife and of ground deformation on the island of El Hierro. PMID:26295394

  13. Use of Low-Cost Acquisition Systems with an Embedded Linux Device for Volcanic Monitoring.

    PubMed

    Moure, David; Torres, Pedro; Casas, Benito; Toma, Daniel; Blanco, María José; Del Río, Joaquín; Manuel, Antoni

    2015-08-19

    This paper describes the development of a low-cost multiparameter acquisition system for volcanic monitoring that is applicable to gravimetry and geodesy, as well as to the visual monitoring of volcanic activity. The acquisition system was developed on a System on a Chip (SoC), the Broadcom BCM2835, running a Linux operating system (based on Debian) that allows for the construction of a complete monitoring system offering multiple possibilities for storage, data-processing, configuration, and the real-time monitoring of volcanic activity. This multiparametric acquisition system was developed with a software environment, as well as with different hardware modules designed for each parameter to be monitored. The device presented here has been used and validated under different scenarios for monitoring ocean tides, ground deformation, and gravity, as well as for image-based monitoring of the island of Tenerife and of ground deformation on the island of El Hierro.

  14. Channel Access in Erlang

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicklaus, Dennis J.

    2013-10-13

    We have developed an Erlang language implementation of the Channel Access protocol. Included are low-level functions for encoding and decoding Channel Access protocol network packets as well as higher level functions for monitoring or setting EPICS process variables. This provides access to EPICS process variables for the Fermilab Acnet control system via our Erlang-based front-end architecture without having to interface to C/C++ programs and libraries. Erlang is a functional programming language originally developed for real-time telecommunications applications. Its network programming features and list management functions make it particularly well-suited for the task of managing multiple Channel Access circuits and PV monitors.

  15. Intelligent robotic tracker

    NASA Technical Reports Server (NTRS)

    Otaguro, W. S.; Kesler, L. O.; Land, K. C.; Rhoades, D. E.

    1987-01-01

    An intelligent tracker capable of robotic applications requiring guidance and control of platforms, robotic arms, and end effectors has been developed. This packaged system, capable of supervised autonomous robotic functions, is partitioned into a multiple processor/parallel processing configuration. The system currently interfaces to cameras but can also use three-dimensional inputs from scanning laser rangers. The inputs are fed into an image processing and tracking section where the camera inputs are conditioned for the multiple tracker algorithms. An executive section monitors the image processing and tracker outputs and performs all the control and decision processes. The present architecture of the system is presented, with discussion of its evolutionary growth for space applications. An autonomous rendezvous demonstration of this system was performed last year. More realistic demonstrations now being planned are also discussed.

  16. Proactive Problem Avoidance and Quality of Service (QOS) Guarantees for Large Heterogeneous Networks

    DTIC Science & Technology

    2002-03-01

    host, and can be used to monitor and provide problem response data to multiple network elements. A blowup of the components of an RA is shown in... developed based on statistical signal processing and learning. There are two salient features of the intelligent agents developed: (1) an... For multiple routers, the physical connections between interfaces, along with the respective health of each interface, are represented. In addition to

  17. Laser vaporization of cirrus-like ice particles with secondary ice multiplication

    PubMed Central

    Matthews, Mary; Pomel, François; Wender, Christiane; Kiselev, Alexei; Duft, Denis; Kasparian, Jérôme; Wolf, Jean-Pierre; Leisner, Thomas

    2016-01-01

    We investigate the interaction of ultrashort laser filaments with individual 90-μm ice particles, representative of cirrus particles. The ice particles fragment under laser illumination. By monitoring the evolution of the corresponding ice/vapor system at up to 140,000 frames per second over 30 ms, we conclude that a shockwave vaporization supersaturates the neighboring region relative to ice, allowing the nucleation and growth of new ice particles, supported by laser-induced plasma photochemistry. This process constitutes the first direct observation of filament-induced secondary ice multiplication, a process that strongly modifies the particle size distribution and, thus, the albedo of typical cirrus clouds. PMID:27386537

  18. Laser vaporization of cirrus-like ice particles with secondary ice multiplication.

    PubMed

    Matthews, Mary; Pomel, François; Wender, Christiane; Kiselev, Alexei; Duft, Denis; Kasparian, Jérôme; Wolf, Jean-Pierre; Leisner, Thomas

    2016-05-01

    We investigate the interaction of ultrashort laser filaments with individual 90-μm ice particles, representative of cirrus particles. The ice particles fragment under laser illumination. By monitoring the evolution of the corresponding ice/vapor system at up to 140,000 frames per second over 30 ms, we conclude that a shockwave vaporization supersaturates the neighboring region relative to ice, allowing the nucleation and growth of new ice particles, supported by laser-induced plasma photochemistry. This process constitutes the first direct observation of filament-induced secondary ice multiplication, a process that strongly modifies the particle size distribution and, thus, the albedo of typical cirrus clouds.

  19. Simulating Mercury And Methyl Mercury Stream Concentrations At Multiple Scales in a Wetland Influenced Coastal Plain Watershed (McTier Creek, SC, USA)

    EPA Science Inventory

    Use of mechanistic models to improve understanding: differential, mass-balance, process-based models; spatial and temporal resolution; necessary simplifications of system complexity; combining field monitoring and modeling efforts; balance between capturing complexity and maintaining...

  20. Multiple Target Tracking in a Wide-Field-of-View Camera System

    DTIC Science & Technology

    1990-01-01

    assembly is mounted on a Contraves alt-azimuth axis table with a pointing accuracy of < 2 µrad. * Work performed under the auspices of the U.S. Department of... [Block-diagram residue listing the hardware: Contraves table, SUN 3 workstations, CCD camera, DR11W and VME interfaces, Ethernet, RS170 video, video amplifier, WWV clock, VCR, Datacube image processor, and monitors] ...displaying processed images with overlay from the Datacube. We control the Contraves table using a GPIB interface on the SUN. GPIB also interfaces a

  1. A Real-Time Cardiac Arrhythmia Classification System with Wearable Sensor Networks

    PubMed Central

    Hu, Sheng; Wei, Hongxing; Chen, Youdong; Tan, Jindong

    2012-01-01

    Long term continuous monitoring of the electrocardiogram (ECG) in a free living environment provides valuable information for the prevention of heart attacks and other high risk diseases. This paper presents the design of a real-time wearable ECG monitoring system with associated cardiac arrhythmia classification algorithms. One of the striking advantages is that the ECG analog front-end and on-node digital processing are designed to remove most of the noise and bias. In addition, the wearable sensor node is able to monitor the patient's ECG and motion signal in an unobtrusive way. To realize real-time medical analysis, the ECG is digitized and transmitted to a smart phone via Bluetooth. On the smart phone, the ECG waveform is visualized and a novel layered hidden Markov model is seamlessly integrated to classify multiple cardiac arrhythmias in real time. Experimental results demonstrate that a clean and reliable ECG waveform can be captured in multiple stressed conditions and that the real-time cardiac arrhythmia classification is comparable to that of other workbenches. PMID:23112746

  2. In line monitoring of the preparation of water-in-oil-in-water (W/O/W) type multiple emulsions via dielectric spectroscopy.

    PubMed

    Beer, Sebastian; Dobler, Dorota; Gross, Alexander; Ost, Martin; Elseberg, Christiane; Maeder, Ulf; Schmidts, Thomas Michael; Keusgen, Michael; Fiebich, Martin; Runkel, Frank

    2013-01-30

    Multiple emulsions offer various applications in a wide range of fields such as pharmaceutical, cosmetics and food technology. Two features, encapsulation efficiency and prolonged stability, are known to have a great influence on multiple emulsion quality and utility. To achieve prolonged stability, the production of the emulsions has to be observed and controlled, preferably in line. In line measurements provide the relevant parameters in a short time frame without the need for the sample to be removed from the process stream, thereby enabling continuous process control. In this study, information about the physical state of multiple emulsions obtained from dielectric spectroscopy (DS) is evaluated for this purpose. Results from dielectric measurements performed in line during the production cycle are compared to theoretically expected results and to well established off line measurements. Thus, a first step toward including the production of multiple emulsions in the process analytical technology (PAT) guidelines of the Food and Drug Administration (FDA) is achieved. DS proved to be beneficial in determining the crucial stopping criterion, which is essential in the production of multiple emulsions. Stopping the process at a less-than-ideal point can severely lower the encapsulation efficiency and the stability, thereby lowering the quality of the emulsion. DS is also expected to provide further information about the multiple emulsion, such as encapsulation efficiency. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Some new approaches in hail suppression experiments

    NASA Technical Reports Server (NTRS)

    Browning, K. A.; Atlas, D.

    1977-01-01

    It is suggested that progress in hail suppression research requires simultaneous improvements in methods of evaluating seeding effects and in monitoring the physical structure of the hailstorm and the hail growth processes. On this basis a case is made for the extensive use of multiple Doppler radar and chemical tracer techniques.

  4. Application of SAR remote sensing and crop modeling for operational rice crop monitoring in South and South East Asian Countries

    NASA Astrophysics Data System (ADS)

    Setiyono, T. D.; Holecz, F.; Khan, N. I.; Barbieri, M.; Maunahan, A. A.; Gatti, L.; Quicho, E. D.; Pazhanivelan, S.; Campos-Taberner, M.; Collivignarelli, F.; Haro, J. G.; Intrman, A.; Phuong, D.; Boschetti, M.; Prasadini, P.; Busetto, L.; Minh, V. Q.; Tuan, V. Q.

    2017-12-01

    This study uses multi-temporal SAR imagery, automated image processing, rule-based classification and field observations to classify rice in multiple locations in South and South East Asian countries and assimilates the information into the ORYZA crop growth simulation model (CGSM) to monitor rice yield. The study demonstrates examples of operational application of this rice monitoring system in: (1) detecting drought impact on rice planting in Central Thailand and Tamil Nadu, India, (2) mapping heat stress impact on rice yield in Andhra Pradesh, India, and (3) generating historical rice yield data for districts in the Red River Delta, Vietnam.

  5. Automated soil gas monitoring chamber

    DOEpatents

    Edwards, Nelson T.; Riggs, Jeffery S.

    2003-07-29

    A chamber for trapping soil gases as they evolve from the soil without disturbance to the soil and to the natural microclimate within the chamber has been invented. The chamber opens between measurements and therefore does not alter the metabolic processes that influence soil gas efflux rates. A multiple chamber system provides for repetitive multi-point sampling, undisturbed metabolic soil processes between sampling, and an essentially airtight sampling chamber operating at ambient pressure.

  6. Signal Processing for Determining Water Height in Steam Pipes with Dynamic Surface Conditions

    NASA Technical Reports Server (NTRS)

    Lih, Shyh-Shiuh; Lee, Hyeong Jae; Bar-Cohen, Yoseph

    2015-01-01

    An enhanced signal processing method based on the filtered Hilbert envelope of the auto-correlation function of the wave signal has been developed to monitor the height of condensed water through the steel wall of steam pipes with dynamic surface conditions. The developed signal processing algorithm can also be used to estimate the thickness of the pipe in order to determine the cut-off frequency of the low-pass filter applied to the Hilbert envelope. Testing and analysis results obtained with the developed technique for dynamic surface conditions are presented. A multiple-transducer array setup and methodology are proposed for both pulse-echo and pitch-catch signals to monitor the fluctuation of the water height due to disturbances, water flow, and other anomalous conditions.
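
    The core of the described method, taking the Hilbert envelope of the auto-correlation of the received ultrasonic waveform and low-pass filtering it before picking echo peaks, can be sketched as follows. The waveform, sampling rate, and filter cut-off are placeholder assumptions rather than the parameters used in the study.

    ```python
    # Sketch: filtered Hilbert envelope of the auto-correlation of a received waveform.
    # Waveform, sampling rate, and cut-off frequency are placeholder assumptions.
    import numpy as np
    from scipy.signal import hilbert, correlate, butter, filtfilt

    fs = 10e6                                  # sampling rate, Hz (assumed)
    t = np.arange(4096) / fs
    # Synthetic received signal: two echoes plus noise as a stand-in for pipe data
    echo = np.sin(2 * np.pi * 2e6 * t) * np.exp(-((t - 5e-6) / 1e-6) ** 2)
    signal = echo + 0.5 * np.roll(echo, 800) + 0.05 * np.random.randn(t.size)

    acf = correlate(signal, signal, mode="full")[signal.size - 1:]   # one-sided ACF
    envelope = np.abs(hilbert(acf))                                   # Hilbert envelope

    b, a = butter(4, 1e6 / (fs / 2), btype="low")                     # assumed 1 MHz cut-off
    smooth_env = filtfilt(b, a, envelope)

    lag = np.argmax(smooth_env[100:]) + 100                           # skip the zero-lag peak
    print(f"Dominant ACF peak (echo spacing): {lag / fs * 1e6:.2f} microseconds")
    ```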

  7. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC's worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC's 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC's monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC's quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  8. Classifying multiple types of hand motions using electrocorticography during intraoperative awake craniotomy and seizure monitoring processes—case studies

    PubMed Central

    Xie, Tao; Zhang, Dingguo; Wu, Zehan; Chen, Liang; Zhu, Xiangyang

    2015-01-01

    In this work, case studies were conducted to classify several kinds of hand motions from electrocorticography (ECoG) signals during intraoperative awake craniotomy and extraoperative seizure monitoring. Four subjects (P1 and P2 with intractable epilepsy during seizure monitoring, and P3 and P4 with brain tumors during awake craniotomy) participated in the experiments. Subjects performed three types of hand motions (Grasp, Thumb-finger motion and Index-finger motion) contralateral to the motor cortex covered with ECoG electrodes. Two methods were used for signal processing. Method I: an autoregressive (AR) model with the Burg method was applied to extract features, an additional waveform length (WL) feature was considered, and linear discriminant analysis (LDA) was used as the classifier. Method II: stationary subspace analysis (SSA) was applied for data preprocessing, and the common spatial pattern (CSP) was used for feature extraction before the LDA decoding process. Applying Method I, the three-class accuracies of P1-P4 were 90.17, 96.00, 91.77, and 92.95%, respectively. For Method II, the three-class accuracies of P1-P4 were 72.00, 93.17, 95.22, and 90.36%, respectively. This study verified the possibility of decoding multiple hand motion types during an awake craniotomy, which is the first step toward dexterous neuroprosthetic control during surgical implantation, in order to verify the optimal placement of electrodes. The accuracy during awake craniotomy was comparable to results during seizure monitoring. This study also indicated that ECoG is a promising approach for precise identification of eloquent cortex during awake craniotomy, and might form a promising BCI system that could benefit both patients and neurosurgeons. PMID:26483627
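
    Method I combines autoregressive coefficients and a waveform-length feature with an LDA classifier. A minimal sketch of that pipeline is shown below; for simplicity it estimates the AR coefficients with a Yule-Walker fit rather than the Burg method used in the paper, and the ECoG trials are random placeholders.

    ```python
    # Sketch of Method I: AR coefficients + waveform length (WL) features, LDA classifier.
    # Uses a Yule-Walker AR fit for simplicity (the paper used the Burg method);
    # the ECoG trials below are random placeholders.
    import numpy as np
    from scipy.linalg import solve_toeplitz
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def ar_coeffs(x, order=6):
        """Yule-Walker estimate of AR coefficients for one channel."""
        x = x - x.mean()
        r = np.correlate(x, x, mode="full")[x.size - 1: x.size + order] / x.size
        return solve_toeplitz(r[:order], r[1:order + 1])

    def waveform_length(x):
        return np.sum(np.abs(np.diff(x)))

    def features(trial):
        """trial: (n_channels, n_samples) -> concatenated AR + WL features."""
        return np.hstack([np.r_[ar_coeffs(ch), waveform_length(ch)] for ch in trial])

    rng = np.random.default_rng(0)
    n_trials, n_channels, n_samples = 60, 8, 1000
    X = np.array([features(rng.standard_normal((n_channels, n_samples)))
                  for _ in range(n_trials)])
    y = rng.integers(0, 3, size=n_trials)     # three hand-motion classes

    clf = LinearDiscriminantAnalysis().fit(X[:40], y[:40])
    print("Held-out accuracy (placeholder data):", clf.score(X[40:], y[40:]))
    ```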

  9. Heterogeneous recurrence monitoring and control of nonlinear stochastic processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Hui, E-mail: huiyang@usf.edu; Chen, Yun

    Recurrence is one of the most common phenomena in natural and engineering systems. Process monitoring of dynamic transitions in nonlinear and nonstationary systems is more concerned with aperiodic recurrences and recurrence variations. However, little has been done to investigate the heterogeneous recurrence variations and link with the objectives of process monitoring and anomaly detection. Notably, nonlinear recurrence methodologies are based on homogeneous recurrences, which treat all recurrence states in the same way as black dots, and non-recurrence is white in recurrence plots. Heterogeneous recurrences are more concerned about the variations of recurrence states in terms of state properties (e.g., values and relative locations) and the evolving dynamics (e.g., sequential state transitions). This paper presents a novel approach of heterogeneous recurrence analysis that utilizes a new fractal representation to delineate heterogeneous recurrence states in multiple scales, including the recurrences of both single states and multi-state sequences. Further, we developed a new set of heterogeneous recurrence quantifiers that are extracted from fractal representation in the transformed space. To that end, we integrated multivariate statistical control charts with heterogeneous recurrence analysis to simultaneously monitor two or more related quantifiers. Experimental results on nonlinear stochastic processes show that the proposed approach not only captures heterogeneous recurrence patterns in the fractal representation but also effectively monitors the changes in the dynamics of a complex system.
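
    For background, the basic homogeneous recurrence plot marks every pair of embedded states that lie within a distance threshold of each other; the heterogeneous extension proposed in the paper further distinguishes the values and locations of those recurrent states. A minimal sketch of the basic recurrence matrix, with a synthetic signal and assumed embedding parameters and threshold, follows; the fractal representation and control-chart integration are beyond this illustration.

    ```python
    # Sketch: basic (homogeneous) recurrence matrix for a delay-embedded time series.
    # Signal, embedding parameters, and threshold are assumed for illustration only.
    import numpy as np

    def embed(x, dim=3, tau=5):
        """Time-delay embedding of a 1-D series into dim-dimensional state vectors."""
        n = x.size - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    rng = np.random.default_rng(1)
    t = np.linspace(0, 20 * np.pi, 600)
    x = np.sin(t) + 0.1 * rng.standard_normal(t.size)     # noisy periodic process

    states = embed(x, dim=3, tau=5)
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    threshold = 0.2 * dists.std()                          # assumed recurrence threshold
    R = (dists <= threshold).astype(int)                   # recurrence matrix

    print(f"Recurrence rate: {R.mean():.3f}")
    ```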

  10. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    PubMed

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.

  11. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions

    PubMed Central

    Hahn, Beth A.; Dreitz, Victoria J.; George, T. Luke

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer’s sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer’s sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales. PMID:29065128

  12. Key design elements of a data utility for national biosurveillance: event-driven architecture, caching, and Web service model.

    PubMed

    Tsui, Fu-Chiang; Espino, Jeremy U; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M

    2005-01-01

    The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include an event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks, and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance, that is, systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions.
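
    The caching element described above can be illustrated with a small in-memory cache that stores pre-aggregated daily counts so that repeated queries avoid re-scanning raw sales records. This is only a schematic of the idea, with a made-up data source and keys, not the NRDM implementation.

    ```python
    # Schematic of a query cache for pre-aggregated time-series counts.
    # Data model and keys are illustrative; this is not the NRDM implementation.
    from datetime import date

    class DailyCountCache:
        def __init__(self, fetch_raw_records):
            self._fetch = fetch_raw_records        # callable returning raw sale records
            self._cache = {}                       # (region, category, day) -> count

        def daily_count(self, region, category, day):
            key = (region, category, day)
            if key not in self._cache:             # cache miss: aggregate once, then reuse
                records = self._fetch(region, category, day)
                self._cache[key] = sum(r["quantity"] for r in records)
            return self._cache[key]

    # Usage with a stand-in data source
    def fake_source(region, category, day):
        return [{"quantity": 2}, {"quantity": 5}]

    cache = DailyCountCache(fake_source)
    print(cache.daily_count("PA", "cough_cold", date(2005, 1, 10)))   # computed
    print(cache.daily_count("PA", "cough_cold", date(2005, 1, 10)))   # served from cache
    ```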

  13. Groundwater monitoring of hydraulic fracturing in California: Recommendations for permit-required monitoring

    NASA Astrophysics Data System (ADS)

    Esser, B. K.; Beller, H. R.; Carroll, S.; Cherry, J. A.; Jackson, R. B.; Jordan, P. D.; Madrid, V.; Morris, J.; Parker, B. L.; Stringfellow, W. T.; Varadharajan, C.; Vengosh, A.

    2015-12-01

    California recently passed legislation mandating dedicated groundwater quality monitoring for new well stimulation operations. The authors provided the State with expert advice on the design of such monitoring networks. Factors that must be considered in designing a new and unique groundwater monitoring program include: Program design: The design of a monitoring program is contingent on its purpose, which can range from detection of individual well leakage to demonstration of regional impact. The regulatory goals for permit-required monitoring conducted by operators on a well-by-well basis will differ from the scientific goals of a regional monitoring program conducted by the State. Vulnerability assessment: Identifying factors that increase the probability of transport of fluids from the hydrocarbon target zone to a protected groundwater zone enables the intensity of permit-required monitoring to be tiered by risk and also enables prioritization of regional monitoring of groundwater basins based on vulnerability. Risk factors include well integrity; proximity to existing wellbores and geologic features; wastewater disposal; vertical separation between the hydrocarbon and groundwater zones; and site-specific hydrogeology. Analyte choice: The choice of chemical analytes in a regulatory monitoring program is guided by the goals of detecting impact, assuring public safety, preventing resource degradation, and minimizing cost. Balancing these goals may be best served by tiered approach in which targeted analysis of specific chemical additives is triggered by significant changes in relevant but more easily analyzed constituents. Such an approach requires characterization of baseline conditions, especially in areas with long histories of oil and gas development. Monitoring technology: Monitoring a deep subsurface process or a long wellbore is more challenging than monitoring a surface industrial source. The requirement for monitoring multiple groundwater aquifers across a range of depths and of monitoring at deeper depths than is typical for regulatory monitoring programs requires consideration of monitoring technology, which can range from clusters of wells to multiple wells in a single wellbore to multi-level systems in a single cased wellbore.

  14. A radar data processing and enhancement system

    NASA Technical Reports Server (NTRS)

    Anderson, K. F.; Wrin, J. W.; James, R.

    1986-01-01

    This report describes the space position data processing system of the NASA Western Aeronautical Test Range. The system is installed at the Dryden Flight Research Facility of NASA Ames Research Center. This operational radar data system (RADATS) provides simultaneous data processing for multiple data inputs and tracking and antenna pointing outputs while performing real-time monitoring, control, and data enhancement functions. Experience in support of the space shuttle and aeronautical flight research missions is described, as well as the automated calibration and configuration functions of the system.

  15. Advanced Image Processing for NASA Applications

    NASA Technical Reports Server (NTRS)

    LeMoign, Jacqueline

    2007-01-01

    The future of space exploration will involve cooperating fleets of spacecraft or sensor webs geared towards coordinated and optimal observation of Earth Science phenomena. The main advantage of such systems is to utilize multiple viewing angles as well as multiple spatial and spectral resolutions of sensors carried on multiple spacecraft but acting collaboratively as a single system. Within this framework, our research focuses on all areas related to sensing in collaborative environments, which means systems utilizing intracommunicating spatially distributed sensor pods or crafts being deployed to monitor or explore different environments. This talk will describe the general concept of sensing in collaborative environments, will give a brief overview of several technologies developed at NASA Goddard Space Flight Center in this area, and then will concentrate on specific image processing research related to that domain, specifically image registration and image fusion.

  16. Bacteriophage removal efficiency as a validation and operational monitoring tool for virus reduction in wastewater reclamation: Review.

    PubMed

    Amarasiri, Mohan; Kitajima, Masaaki; Nguyen, Thanh H; Okabe, Satoshi; Sano, Daisuke

    2017-09-15

    The multiple-barrier concept is widely employed in international and domestic guidelines for wastewater reclamation and reuse for microbiological risk management, in which a wastewater reclamation system is designed to achieve guideline values of the performance target of microbe reduction. Enteric viruses are one of the pathogens for which the target reduction values are stipulated in guidelines, but frequent monitoring to validate human virus removal efficacy is challenging in a daily operation due to the cumbersome procedures for virus quantification in wastewater. Bacteriophages have been the first choice surrogate for this task, because of the well-characterized nature of strains and the presence of established protocols for quantification. Here, we performed a meta-analysis to calculate the average log10 reduction values (LRVs) of somatic coliphages, F-specific phages, MS2 coliphage and T4 phage by membrane bioreactor, activated sludge, constructed wetlands, pond systems, microfiltration and ultrafiltration. The calculated LRVs of bacteriophages were then compared with reported human enteric virus LRVs. MS2 coliphage LRVs in MBR processes were shown to be lower than those of norovirus GII and enterovirus, suggesting it as a possible validation and operational monitoring tool. The other bacteriophages provided higher LRVs compared to human viruses. The data sets on LRVs of human viruses and bacteriophages are scarce except for MBR and conventional activated sludge processes, which highlights the necessity of investigating LRVs of human viruses and bacteriophages in multiple treatment unit processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
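
    The log10 reduction value (LRV) underlying the meta-analysis is simply the difference between the log10 concentrations entering and leaving a treatment unit. A small sketch with hypothetical influent and effluent concentrations follows.

    ```python
    # Sketch: log10 reduction value (LRV) across a treatment unit.
    # Concentrations are hypothetical (e.g., plaque-forming units per mL).
    import math

    def lrv(influent: float, effluent: float) -> float:
        """LRV = log10(influent) - log10(effluent)."""
        return math.log10(influent) - math.log10(effluent)

    print(f"Example MBR LRV: {lrv(1.0e5, 3.2e1):.2f}")   # roughly 3.5 log10 removal
    ```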

  17. How much can a single webcam tell to the operation of a water system?

    NASA Astrophysics Data System (ADS)

    Giuliani, Matteo; Castelletti, Andrea; Fedorov, Roman; Fraternali, Piero

    2017-04-01

    Recent advances in environmental monitoring are making a wide range of hydro-meteorological data available with a great potential to enhance understanding, modelling and management of environmental processes. Despite this progress, continuous monitoring of highly spatiotemporal heterogeneous processes is not well established yet, especially in inaccessible sites. In this context, the unprecedented availability of user-generated data on the web might open new opportunities for enhancing real-time monitoring and modeling of environmental systems based on data that are public, low-cost, and spatiotemporally dense. In this work, we focus on snow and contribute a novel crowdsourcing procedure for extracting snow-related information from public web images, either produced by users or generated by touristic webcams. A fully automated process fetches mountain images from multiple sources, identifies the peaks present therein, and estimates virtual snow indexes representing a proxy of the snow-covered area. The operational value of the obtained virtual snow indexes is then assessed for a real-world water-management problem, where we use these indexes for informing the daily control of a regulated lake supplying water for multiple purposes. Numerical results show that such information is effective in extending the anticipation capacity of the lake operations, ultimately improving the system performance. Our procedure has the potential for complementing traditional snow-related information, minimizing costs and efforts for obtaining the virtual snow indexes and, at the same time, maximizing the portability of the procedure to several locations where such public images are available.
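
    The virtual snow index acts as a proxy of the snow-covered fraction of the mountain area visible in a public image. A crude illustration of the idea, thresholding bright pixels inside an assumed mountain mask, is sketched below; the actual pipeline also identifies and aligns the peaks in each image, which is omitted here.

    ```python
    # Crude sketch of a "virtual snow index": fraction of bright pixels inside a
    # mountain mask. Image, mask, and threshold are placeholder assumptions; the
    # real pipeline also identifies and aligns the peaks before this step.
    import numpy as np

    def virtual_snow_index(gray_image: np.ndarray, mountain_mask: np.ndarray,
                           brightness_threshold: float = 0.75) -> float:
        """Fraction of masked pixels whose normalized brightness exceeds the threshold."""
        pixels = gray_image[mountain_mask]
        if pixels.size == 0:
            return float("nan")
        return float(np.mean(pixels >= brightness_threshold))

    rng = np.random.default_rng(2)
    image = rng.random((480, 640))                 # stand-in for a normalized webcam frame
    mask = np.zeros_like(image, dtype=bool)
    mask[100:300, 200:500] = True                  # assumed mountain region

    print(f"Virtual snow index: {virtual_snow_index(image, mask):.2f}")
    ```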

  18. Assessment of laboratory test utilization for HIV/AIDS care in urban ART clinics of Lilongwe, Malawi.

    PubMed

    Palchaudhuri, Sonali; Tweya, Hannock; Hosseinipour, Mina

    2014-06-01

    The 2011 Malawi HIV guidelines promote CD4 monitoring for pre-ART assessment and consideration of HIVRNA monitoring for ART response assessment, while some clinics used CD4 for both. We assessed clinical ordering practices as compared to the guidelines, and determined whether the samples were successfully and promptly processed. We conducted a retrospective review of all patients seen from August 2010 through July 2011 in two urban HIV-care clinics that utilized 6-monthly CD4 monitoring regardless of ART status. We calculated the percentage of patients on whom clinicians ordered CD4 or HIVRNA analysis. For all samples sent, we determined rates of successful lab processing and mean time to returned results. Of 20581 patients seen, 8029 (39%) had at least one blood draw for CD4 count. Among pre-ART patients, 2668/2844 (93.8%) had CD4 counts performed for eligibility. Of all CD4 samples sent, 8082/9207 (89%) were successfully processed. Of those, mean time to processing was 1.6 days (s.d. 1.5) but mean time to results being available to the clinician was 9.3 days (s.d. 3.7). Regarding HIVRNA, 172 patients of 17737 on ART had a blood draw and only 118/213 (55%) samples were successfully processed. Mean processing time was 39.5 days (s.d. 21.7); mean time to results being available to the clinician was 43.1 days (s.d. 25.1). During the one year evaluated, there were multiple lapses in processing HIVRNA samples for up to 2 months. Clinicians underutilize CD4 and HIVRNA as monitoring tools in HIV care. Laboratory processing failures and turnaround times are unacceptably high for viral load analysis. Alternative strategies need to be considered in order to meet laboratory monitoring needs.

  19. A Fully Redundant On-Line Mass Spectrometer System Used to Monitor Cryogenic Fuel Leaks on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Griffin, Timothy P.; Naylor, Guy R.; Haskell, William D.; Breznik, Greg S.; Mizell, Carolyn A.; Helms, William R.; Voska, N. (Technical Monitor)

    2002-01-01

    An on-line gas monitoring system was developed to replace the older systems used to monitor for cryogenic leaks on the Space Shuttles before launch. The system uses a mass spectrometer to monitor multiple locations in the process, which allows the system to monitor all gas constituents of interest in a nearly simultaneous manner. The system is fully redundant and meets all requirements for ground support equipment (GSE). This includes ruggedness to withstand launch on the Mobile Launcher Platform (MLP), ease of operation, and minimal operator intervention. The system can be fully automated so that an operator is notified when an unusual situation or fault is detected. User inputs are made through a personal computer using mouse and keyboard commands via a graphical user interface. In addition to detecting cryogenic leaks, many other gas constituents could be monitored using the Hazardous Gas Detection System (HGDS) 2000.

  20. New strategies for SHM based on a multichannel wireless AE node

    NASA Astrophysics Data System (ADS)

    Godinez-Azcuaga, Valery; Ley, Obdulia

    2014-03-01

    This paper discusses the development of an Acoustic Emission (AE) wireless node and its application to Structural Health Monitoring (SHM). The instrument development was planned for applications monitoring steel and concrete bridge components. The final product, now commercially available, is a sensor node that includes multiple sensing elements, on-board signal processing and analysis capabilities, signal conditioning electronics, power management circuits, a wireless data transmission element and an energy harvesting unit. The sensing elements are capable of functioning in both passive and active modes, while multiple parametric inputs are available for connecting various sensor types to measure external characteristics affecting the performance of the structure under monitoring. The outputs of all these sensors are combined and analyzed at the node in order to minimize the data transmission rate, which consumes a significant amount of power. Power management circuits are used to reduce the data collection intervals through selective data acquisition strategies and to minimize the sensor node power consumption. This instrument, known as the 1284, is an excellent platform for deploying SHM in the original bridge applications, and initial prototypes have shown significant potential in monitoring composite wind turbine blades and composite mockups of Unmanned Autonomous Vehicle (UAV) components; currently we are working to extend the use of this system to fields such as coal flow, power transformer, and off-shore platform monitoring.

  1. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in the development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding will pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on the topic of QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.

  2. Immersion and dry lithography monitoring for flash memories (after develop inspection and photo cell monitor) using a darkfield imaging inspector with advanced binning technology

    NASA Astrophysics Data System (ADS)

    Parisi, P.; Mani, A.; Perry-Sullivan, C.; Kopp, J.; Simpson, G.; Renis, M.; Padovani, M.; Severgnini, C.; Piacentini, P.; Piazza, P.; Beccalli, A.

    2009-12-01

    After-develop inspection (ADI) and photo-cell monitoring (PM) are part of a comprehensive lithography process monitoring strategy. Capturing defects of interest (DOI) in the lithography cell rather than at later process steps shortens the cycle time and allows for wafer re-work, reducing overall cost and improving yield. Low contrast DOI and multiple noise sources make litho inspection challenging. Broadband brightfield inspectors provide the highest sensitivity to litho DOI and are traditionally used for ADI and PM. However, a darkfield imaging inspector has shown sufficient sensitivity to litho DOI, providing a high-throughput option for litho defect monitoring. On the darkfield imaging inspector, a very high sensitivity inspection is used in conjunction with advanced defect binning to detect pattern issues and other DOI and minimize nuisance defects. For ADI, this darkfield inspection methodology enables the separation and tracking of 'color variation' defects that correlate directly to CD variations allowing a high-sampling monitor for focus excursions, thereby reducing scanner re-qualification time. For PM, the darkfield imaging inspector provides sensitivity to critical immersion litho defects at a lower cost-of-ownership. This paper describes litho monitoring methodologies developed and implemented for flash devices for 65nm production and 45nm development using the darkfield imaging inspector.

  3. Sit less and move more: perspectives of adults with multiple sclerosis.

    PubMed

    Aminian, Saeideh; Ezeugwu, Victor E; Motl, Robert W; Manns, Patricia J

    2017-12-20

    Multiple sclerosis is a chronic neurological disease with the highest prevalence in Canada. Replacing sedentary behavior with light activities may be a feasible approach to manage multiple sclerosis symptoms. This study explored the perspectives of adults with multiple sclerosis about sedentary behavior, physical activity and ways to change behavior. Fifteen adults with multiple sclerosis (age 43 ± 13 years; mean ± standard deviation), recruited through the multiple sclerosis Clinic at the University of Alberta, Edmonton, Canada, participated in semi-structured interviews. Interview audio recordings were transcribed verbatim and coded. NVivo software was used to facilitate the inductive process of thematic analysis. Balancing competing priorities between sitting and moving was the primary theme. Participants were aware of the benefits of physical activity to their overall health and in the management of fatigue and muscle stiffness. Due to fatigue, they often chose sitting to regain their energy. Further, some barriers included a perceived fear of losing balance or embarrassment while walking. Activity monitoring, accountability, and educational and individualized programs were suggested strategies to motivate more movement. Adults with multiple sclerosis were open to the idea of replacing sitting with light activities. Motivational and educational programs are required to help them change sedentary behavior and move more. IMPLICATIONS FOR REHABILITATION One of the most challenging and common difficulties of multiple sclerosis is walking impairment, which worsens with multiple sclerosis progression and is a common goal in the rehabilitation of people with multiple sclerosis. The deterioration in walking abilities is related to lower levels of physical activity and more sedentary behavior, such that adults with multiple sclerosis spend 8 to 10.5 h per day sitting. Replacing prolonged sedentary behavior with light physical activities, and incorporating education, encouragement, and self-monitoring strategies, are feasible approaches to manage the symptoms of multiple sclerosis.

  4. Multivariate EMD and full spectrum based condition monitoring for rotating machinery

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaomin; Patel, Tejas H.; Zuo, Ming J.

    2012-02-01

    Early assessment of machinery health condition is of paramount importance today. A sensor network with sensors in multiple directions and locations is usually employed for monitoring the condition of rotating machinery. Extracting health condition information from these sensors for effective fault detection and fault tracking is always challenging. Empirical mode decomposition (EMD) is an advanced signal processing technology that has been widely used for this purpose. Standard EMD has the limitation that it works only for a single real-valued signal. When dealing with data from multiple sensors and multiple health conditions, standard EMD faces two problems. First, because of the local and self-adaptive nature of standard EMD, the decomposition of signals from different sources may not match in either number or frequency content. Second, it may not be possible to express the joint information between different sensors. The present study proposes a method of extracting fault information by employing multivariate EMD and the full spectrum. Multivariate EMD can overcome the limitations of standard EMD when dealing with data from multiple sources. It is used to extract the intrinsic mode functions (IMFs) embedded in raw multivariate signals. A criterion based on mutual information is proposed for selecting a fault-sensitive IMF. A full spectral feature is then extracted from the selected fault-sensitive IMF to capture the joint information between signals measured from two orthogonal directions. The proposed method is first explained using simple simulated data and then tested on rotating machinery condition monitoring applications. The effectiveness of the proposed method is demonstrated by monitoring damage on the vane trailing edge of an impeller and rotor-stator rub in an experimental rotor rig.
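
    A rough sketch of the overall flow, decomposition into IMFs, mutual-information-based selection of a sensitive IMF, and a full spectrum of two orthogonal channels, is given below. It substitutes per-channel standard EMD (via the PyEMD package) for the multivariate EMD of the paper, uses scikit-learn's mutual information estimator as a stand-in for the proposed criterion, and runs on synthetic signals.

    ```python
    # Sketch: per-channel EMD, mutual-information-based IMF selection, and a full
    # spectrum of two orthogonal channels. Signals are synthetic; the paper uses
    # multivariate EMD (MEMD), for which this per-channel EMD is only a stand-in.
    import numpy as np
    from PyEMD import EMD                                  # pip install EMD-signal
    from sklearn.feature_selection import mutual_info_regression

    fs = 1000
    t = np.arange(0, 2, 1 / fs)
    x = np.sin(2 * np.pi * 30 * t) + 0.3 * np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)
    y = np.cos(2 * np.pi * 30 * t) + 0.1 * np.random.randn(t.size)   # orthogonal direction

    imfs = EMD()(x)                                        # intrinsic mode functions of x

    # Stand-in sensitivity criterion: mutual information between each IMF and the raw signal
    mi = [mutual_info_regression(imf.reshape(-1, 1), x)[0] for imf in imfs]
    best = int(np.argmax(mi))
    print(f"Selected IMF index: {best}")

    # Full spectrum: FFT of the complex signal x + jy keeps the joint directional information
    full_spec = np.fft.fftshift(np.fft.fft(x + 1j * y))
    freqs = np.fft.fftshift(np.fft.fftfreq(t.size, d=1 / fs))
    peak_freq = freqs[np.argmax(np.abs(full_spec))]
    print(f"Dominant full-spectrum frequency: {peak_freq:.1f} Hz")
    ```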

  5. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  6. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  7. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  8. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., replacement of multiple parts due to wear, and reassembly, and also may include the removal of the engine from..., emissions-related codes or signals from on-board monitoring systems may not be erased or reset without... signals may not be rendered inoperative during the rebuilding process. (d) When conducting a rebuild...

  9. Spatial and temporal ecology of eastern spadefoot toads on a Florida landscape

    Treesearch

    Cathryn H. Greenberg; George W. Tanner

    2005-01-01

    Effective amphibian conservation must consider population and landscape processes, but information at multiple scales is rare. We explore spatial and temporal patterns of breeding and recruitment by eastern spadefoot toads (Scaphiopus holbrookii), using nine years of data from continuous monitoring with drift fences and pitfall traps at eight...

  10. Multiple wavelength light collimator and monitor

    NASA Technical Reports Server (NTRS)

    Gore, Warren J. (Inventor)

    2011-01-01

    An optical system for receiving and collimating light and for transporting and processing light received in each of N wavelength ranges, including near-ultraviolet, visible, near-infrared and mid-infrared wavelengths, to determine a fraction of light received, and associated dark current, in each wavelength range in each of a sequence of time intervals.

  11. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing, ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.
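
    As a concrete taste of the single-neuron models surveyed here, a leaky integrate-and-fire neuron, a much simpler cousin of the Hodgkin-Huxley formulation, can be simulated in a few lines; the parameter values below are generic textbook choices, not tied to any particular study.

    ```python
    # Sketch: leaky integrate-and-fire neuron, a simplified single-neuron model
    # (far simpler than Hodgkin-Huxley). Parameters are generic textbook values.
    import numpy as np

    dt = 0.1e-3           # time step, s
    T = 0.5               # total simulated time, s
    tau_m = 20e-3         # membrane time constant, s
    R_m = 10e6            # membrane resistance, ohm
    V_rest, V_thresh, V_reset = -70e-3, -54e-3, -70e-3
    I_ext = 1.8e-9        # constant injected current, A

    steps = int(T / dt)
    V = np.full(steps, V_rest)
    spike_times = []

    for i in range(1, steps):
        dV = (-(V[i - 1] - V_rest) + R_m * I_ext) * dt / tau_m
        V[i] = V[i - 1] + dV
        if V[i] >= V_thresh:              # threshold crossing: emit spike, reset
            spike_times.append(i * dt)
            V[i] = V_reset

    print(f"Firing rate: {len(spike_times) / T:.1f} Hz")
    ```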

  12. Recent advancement in biosensors technology for animal and livestock health management.

    PubMed

    Neethirajan, Suresh; Tuteja, Satish K; Huang, Sheng-Tung; Kelton, David

    2017-12-15

    The term biosensors encompasses devices that have the potential to quantify physiological, immunological and behavioural responses of livestock and multiple animal species. Novel biosensing methodologies offer highly specialised monitoring devices for the specific measurement of individual and multiple parameters covering an animal's physiology as well as monitoring of an animal's environment. These devices are not only highly specific and sensitive for the parameters being analysed, but they are also reliable and easy to use, and can accelerate the monitoring process. Novel biosensors in livestock management provide significant benefits and applications in disease detection and isolation, health monitoring and detection of reproductive cycles, as well as monitoring physiological wellbeing of the animal via analysis of the animal's environment. With the development of integrated systems and the Internet of Things, continuous monitoring devices are expected to become affordable. The data generated from integrated livestock monitoring is anticipated to assist farmers and the agricultural industry to improve animal productivity in the future. The data are also expected to reduce the impact of the livestock industry on the environment, while at the same time driving a new wave of improvements in viable farming techniques. This review focusses on the emerging technological advancements in monitoring of livestock health for detailed, precise information on productivity, as well as physiology and well-being. Biosensors will contribute to the 4th revolution in agriculture by incorporating innovative technologies into cost-effective diagnostic methods that can mitigate the potentially catastrophic effects of infectious outbreaks in farmed animals. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Optical noninvasive monitoring of skin blood pulsations

    NASA Astrophysics Data System (ADS)

    Spigulis, Janis

    2005-04-01

    Time-resolved detection and analysis of skin backscattered optical signals (remission photoplethysmography or PPG) provide rich information on skin blood volume pulsations and can serve for reliable cardiovascular assessment. Single- and multiple-channel PPG concepts are discussed. Simultaneous data flow from several locations on the human body allows us to study heartbeat pulse-wave propagation in real time and to evaluate vascular resistance. Portable single-, dual-, and four-channel PPG monitoring devices with special software have been designed for real-time data acquisition and processing. The prototype devices have been clinically studied, and their potential for monitoring heart arrhythmias, drug-efficiency tests, steady-state cardiovascular assessment, body fitness control, and express diagnostics of the arterial occlusions has been confirmed.

  14. Based Real Time Remote Health Monitoring Systems: A Review on Patients Prioritization and Related "Big Data" Using Body Sensors information and Communication Technology.

    PubMed

    Kalid, Naser; Zaidan, A A; Zaidan, B B; Salman, Omar H; Hashim, M; Muzammil, H

    2017-12-29

    The growing worldwide population has increased the need for technologies, computerised software algorithms and smart devices that can monitor and assist patients anytime and anywhere and thus enable them to lead independent lives. The real-time remote monitoring of patients is an important issue in telemedicine. In the provision of healthcare services, patient prioritisation poses a significant challenge because of the complex decision-making process it involves when patients are considered 'big data'. To our knowledge, no study has highlighted the link between 'big data' characteristics and real-time remote healthcare monitoring in the patient prioritisation process, as well as the inherent challenges involved. Thus, we present comprehensive insights into the elements of big data characteristics according to the six 'Vs': volume, velocity, variety, veracity, value and variability. Each of these elements is presented and connected to a related part in the study of the connection between patient prioritisation and real-time remote healthcare monitoring systems. Then, we determine the weak points and recommend solutions as potential future work. This study makes the following contributions. (1) The link between big data characteristics and real-time remote healthcare monitoring in the patient prioritisation process is described. (2) The open issues and challenges for big data used in the patient prioritisation process are emphasised. (3) As a recommended solution, decision making using multiple criteria, such as vital signs and chief complaints, is utilised to prioritise the big data of patients with chronic diseases on the basis of the most urgent cases.
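    As a purely hypothetical illustration of the multi-criteria prioritisation recommended above, the sketch below scores patients by combining vital-sign severity with chief-complaint urgency; the criteria, weights and thresholds are invented for the example and are not taken from the review.

        # Hypothetical multi-criteria triage scoring; criteria and weights are assumptions
        PATIENTS = [
            {"id": "p1", "heart_rate": 128, "spo2": 88, "complaint": "chest pain"},
            {"id": "p2", "heart_rate": 84, "spo2": 97, "complaint": "follow-up"},
        ]

        COMPLAINT_URGENCY = {"chest pain": 3, "shortness of breath": 3, "follow-up": 0}

        def priority_score(p):
            score = 0
            score += 2 if p["heart_rate"] > 120 else (1 if p["heart_rate"] > 100 else 0)
            score += 3 if p["spo2"] < 90 else (1 if p["spo2"] < 94 else 0)
            score += COMPLAINT_URGENCY.get(p["complaint"], 1)
            return score

        # Most urgent patients are served first
        for p in sorted(PATIENTS, key=priority_score, reverse=True):
            print(p["id"], priority_score(p))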

  15. Enriching the biological space of natural products and charting drug metabolites, through real time biotransformation monitoring: The NMR tube bioreactor.

    PubMed

    Chatzikonstantinou, Alexandra V; Chatziathanasiadou, Maria V; Ravera, Enrico; Fragai, Marco; Parigi, Giacomo; Gerothanassis, Ioannis P; Luchinat, Claudio; Stamatis, Haralambos; Tzakos, Andreas G

    2018-01-01

    Natural products offer a wide range of biological activities, but they are not easily integrated in the drug discovery pipeline, because of their inherent scaffold intricacy and the associated complexity in their synthetic chemistry. Enzymes may be used to perform regioselective and stereoselective incorporation of functional groups in the natural product core, avoiding harsh reaction conditions, several protection/deprotection and purification steps. Herein, we developed a three-step protocol carried out inside an NMR-tube. 1st-step: STD-NMR was used to predict the: i) capacity of natural products as enzyme substrates and ii) possible regioselectivity of the biotransformations. 2nd-step: The real-time formation of multiple-biotransformation products in the NMR-tube bioreactor was monitored in-situ. 3rd-step: STD-NMR was applied in the mixture of the biotransformed products to screen ligands for protein targets. Herein, we developed a simple and time-effective process, the "NMR-tube bioreactor", that is able to: (i) predict which component of a mixture of natural products can be enzymatically transformed, (ii) monitor in situ the transformation efficacy and regioselectivity in crude extracts and multiple substrate biotransformations without fractionation and (iii) simultaneously screen for interactions of the biotransformation products with pharmaceutical protein targets. We have developed a green, time-, and cost-effective process that provides a simple route from natural products to lead compounds for drug discovery. This process can speed up the most crucial steps in the early drug discovery process, and reduce the chemical manipulations usually involved in the pipeline, improving the environmental compatibility. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. A Field Programmable Gate Array-Based Reconfigurable Smart-Sensor Network for Wireless Monitoring of New Generation Computer Numerically Controlled Machines

    PubMed Central

    Moreno-Tapia, Sandra Veronica; Vera-Salas, Luis Alberto; Osornio-Rios, Roque Alfredo; Dominguez-Gonzalez, Aurelio; Stiharu, Ion; de Jesus Romero-Troncoso, Rene

    2010-01-01

    Computer numerically controlled (CNC) machines have evolved to adapt to increasing technological and industrial requirements. To cover these needs, new generation machines have to perform monitoring strategies by incorporating multiple sensors. Since in most applications the online processing of the variables is essential, the use of smart sensors is necessary. The contribution of this work is the development of a wireless network platform of reconfigurable smart sensors for CNC machine applications complying with the measurement requirements of new generation CNC machines. Four different smart sensors are put under test in the network and their corresponding signal processing techniques are implemented in a Field Programmable Gate Array (FPGA)-based sensor node. PMID:22163602

  17. A field programmable gate array-based reconfigurable smart-sensor network for wireless monitoring of new generation computer numerically controlled machines.

    PubMed

    Moreno-Tapia, Sandra Veronica; Vera-Salas, Luis Alberto; Osornio-Rios, Roque Alfredo; Dominguez-Gonzalez, Aurelio; Stiharu, Ion; Romero-Troncoso, Rene de Jesus

    2010-01-01

    Computer numerically controlled (CNC) machines have evolved to adapt to increasing technological and industrial requirements. To cover these needs, new generation machines have to perform monitoring strategies by incorporating multiple sensors. Since in most applications the online processing of the variables is essential, the use of smart sensors is necessary. The contribution of this work is the development of a wireless network platform of reconfigurable smart sensors for CNC machine applications complying with the measurement requirements of new generation CNC machines. Four different smart sensors are put under test in the network and their corresponding signal processing techniques are implemented in a Field Programmable Gate Array (FPGA)-based sensor node.

  18. Advances in industrial biopharmaceutical batch process monitoring: Machine-learning methods for small data problems.

    PubMed

    Tulsyan, Aditya; Garvin, Christopher; Ündey, Cenk

    2018-04-06

    Biopharmaceutical manufacturing comprises multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real-time. The state-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM, until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses a risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of the authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios. © 2018 Wiley Periodicals, Inc.
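    The article's in silico batch generation combines hardware exploitation with machine-learning methods; as a much simpler stand-in for the general idea, the sketch below fits a multivariate Gaussian to a handful of historical batch summary vectors and samples synthetic batches from it, so that a multivariate monitoring model can be trained despite a short production history. The data, the choice of a Gaussian model and the ridge term are assumptions made for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)

        # Pretend "Low-N" history: only 5 historical batches, 8 summary variables each
        historical = rng.normal(loc=10.0, scale=1.5, size=(5, 8))

        # Fit a simple multivariate Gaussian to the few batches on record
        mu = historical.mean(axis=0)
        cov = np.cov(historical, rowvar=False) + 1e-6 * np.eye(historical.shape[1])

        # Sample an arbitrarily large number of in silico batches from that model
        synthetic = rng.multivariate_normal(mu, cov, size=1000)
        print(synthetic.shape)    # (1000, 8) -> enough to seed a multivariate monitor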

  19. Real-time process monitoring in a semi-continuous fluid-bed dryer - microwave resonance technology versus near-infrared spectroscopy.

    PubMed

    Peters, Johanna; Teske, Andreas; Taute, Wolfgang; Döscher, Claas; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg

    2018-02-15

    The trend towards continuous manufacturing in the pharmaceutical industry is associated with an increasing demand for advanced control strategies. It is a mandatory requirement to obtain reliable real-time information on critical quality attributes (CQA) during every process step as the decision on diversion of material needs to be performed fast and automatically. Where possible, production equipment should provide redundant systems for in-process control (IPC) measurements to ensure continuous process monitoring even if one of the systems is not available. In this paper, two methods for real-time monitoring of granule moisture in a semi-continuous fluid-bed drying unit are compared. While near-infrared (NIR) spectroscopy has already proven to be a suitable process analytical technology (PAT) tool for moisture measurements in fluid-bed applications, microwave resonance technology (MRT) showed difficulties monitoring moistures above 8% until recently. The results indicate that the newly developed MRT sensor operating at four resonances is capable of competing with NIR spectroscopy. While NIR spectra were preprocessed by mean centering and first derivative before application of partial least squares (PLS) regression to build predictive models (RMSEP = 0.20%), microwave moisture values of two resonances sufficed to build a statistically close multiple linear regression (MLR) model (RMSEP = 0.07%) for moisture prediction. Thereby, it could be verified that moisture monitoring by MRT sensor systems could be a valuable alternative to NIR spectroscopy or could be used as a redundant system providing great ease of application. Copyright © 2017 Elsevier B.V. All rights reserved.
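    The model-building comparison described above (PLS on preprocessed NIR spectra versus MLR on two microwave resonance readings, judged by RMSEP) can be sketched as follows. The data are synthetic, the spectral preprocessing is omitted, and the component count is arbitrary, so the numbers are purely illustrative.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)

        # Synthetic calibration data: stand-ins for NIR spectra and MRT resonance readings
        moisture = rng.uniform(2.0, 12.0, size=60)                    # % granule moisture
        nir = rng.normal(size=(60, 200)) + 0.05 * moisture[:, None]  # fake spectra
        mrt = np.column_stack([0.9 * moisture, 1.1 * moisture])
        mrt += rng.normal(scale=0.1, size=mrt.shape)                  # two resonance values

        train, test = slice(0, 40), slice(40, 60)

        def rmsep(y_true, y_pred):
            """Root mean square error of prediction on the validation set."""
            return float(np.sqrt(np.mean((y_true - np.ravel(y_pred)) ** 2)))

        pls = PLSRegression(n_components=5).fit(nir[train], moisture[train])
        mlr = LinearRegression().fit(mrt[train], moisture[train])

        print("NIR/PLS RMSEP:", rmsep(moisture[test], pls.predict(nir[test])))
        print("MRT/MLR RMSEP:", rmsep(moisture[test], mlr.predict(mrt[test])))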

  20. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
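    For the static case, the paper links KPI prediction to least-squares regression. A minimal sketch of that idea is shown below: a least-squares model from process variables to the KPI is identified on fault-free data, and new samples are flagged when the squared prediction error exceeds a data-driven threshold. The threshold rule and the synthetic data are illustrative assumptions, not the paper's exact test statistic.

        import numpy as np

        rng = np.random.default_rng(2)

        # Fault-free training data: process variables X and the KPI y they determine
        X = rng.normal(size=(500, 6))
        coeffs = np.array([0.8, -0.5, 0.3, 0.0, 1.2, -0.7])
        y = X @ coeffs + 0.05 * rng.normal(size=500)

        # Identify a least-squares model from process variables to the KPI
        theta, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Data-driven threshold on the squared prediction error (99th percentile)
        spe_train = (y - X @ theta) ** 2
        threshold = np.percentile(spe_train, 99)

        def kpi_fault(x_new, y_new):
            """Flag a sample whose KPI deviates too much from the least-squares prediction."""
            return float((y_new - x_new @ theta) ** 2) > threshold

        print(kpi_fault(np.ones(6), 5.0))    # a clearly abnormal KPI value is flagged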

  1. Monitoring Space Weather Hazards caused by geomagnetic disturbances with Space Hazard Monitor (SHM) systems

    NASA Astrophysics Data System (ADS)

    Xu, Z.; Gannon, J. L.; Peek, T. A.; Lin, D.

    2017-12-01

    One space weather hazard is Geomagnetically Induced Currents (GICs) in electric power transmission systems, which are driven by the geoelectric field naturally induced during geomagnetic disturbances (GMDs). GICs are a potentially catastrophic threat to bulk power systems; for instance, the blackout in Quebec in March 1989 was caused by GMDs during a major magnetic storm. To monitor GMDs, the autonomous Space Hazard Monitor (SHM) system was recently developed. The system combines magnetic field measurements from magnetometers with geoelectric field measurements from electrodes. In this presentation, we introduce the six SHM sites that have been deployed across the continental United States. The data from the magnetometers are processed with the Multiple Observatory Geomagnetic Data Analysis Software (MOGDAS), and the statistical results are presented here. They reveal not only the impacts of space weather over the continental United States but also the potential for improved instrumentation to provide better space weather monitoring.

  2. Integrating Land Cover Modeling and Adaptive Management to Conserve Endangered Species and Reduce Catastrophic Fire Risk

    NASA Technical Reports Server (NTRS)

    Breininger, David; Duncan, Brean; Eaton, Mitchell; Johnson, Fred; Nichols, James

    2014-01-01

    Land cover modeling is used to inform land management, but most often via a two-step process where science informs how management alternatives can influence resources and then decision makers can use this to make decisions. A more efficient process is to directly integrate science and decision making, where science allows us to learn to better accomplish management objectives and is developed to address specific decisions. Co-development of management and science is especially productive when decisions are complicated by multiple objectives and impeded by uncertainty. Multiple objectives can be met by specification of tradeoffs, and relevant uncertainty can be addressed through targeted science (i.e., models and monitoring). We describe how to integrate habitat and fuels monitoring with decision making focused on dual objectives of managing for endangered species and minimizing catastrophic fire risk. Under certain conditions, both objectives might be achieved by a similar management policy, but habitat trajectories suggest tradeoffs. Knowledge about system responses to actions can be informed by applying competing management actions to different land units in the same system state and by ideas about fire behavior. Monitoring and management integration is important to optimize state-specific management decisions and increase knowledge about system responses. We believe this approach has broad utility for land cover modeling programs intended to inform decision making.

  3. Continuous monitoring of dissolved gases with membrane inlet mass spectrometry to fingerprint river biochemical activity

    NASA Astrophysics Data System (ADS)

    Vautier, Camille; Chatton, Eliot; Abbott, Benjamin; Harjung, Astrid; Labasque, Thierry; Guillou, Aurélie; Pannard, Alexandrine; Piscart, Christophe; Laverman, Anniet; Kolbe, Tamara; Massé, Stéphanie; de Dreuzy, Jean-Raynald; Thomas, Zahra; Aquilina, Luc; Pinay, Gilles

    2017-04-01

    Water quality in rivers results from biogeochemical processes in contributing hydrological compartments (soils, aquifers, hyporheic and riparian zones) and biochemical activity in the river network itself. Consequently, chemical fluxes fluctuate on multiple spatial and temporal scales, leading eventually to complex concentration signals in rivers. We characterized these fluctuations with innovative continuous monitoring of dissolved gases, to quantify transport and reaction processes occurring in different hydrological compartments. We performed stream-scale experiments in two headwater streams in Brittany, France. Factorial injections of inorganic nitrogen (NH4NO3), inorganic phosphate (P2O5) and multiple sources of labile carbon (acetate, tryptophan) were implemented in the two streams. We used a new field application of membrane inlet mass spectrometry to continuously monitor dissolved gases for multiple day-night periods (Chatton et al., 2016). Quantified gases included He, O2, N2, CO2, CH4, N2O, and 15N of dissolved N2 and N2O. We calibrated and assessed the methodology with well-established complementary techniques including gas chromatography and high-frequency water quality sensors. Wet chemistry and radon analysis complemented the study. The analyses provided several methodological and ecological insights and demonstrated that high frequency variations linked to background noise can be efficiently determined and filtered to derive effective fluxes. From a more fundamental point of view, the tested stream segments were fully characterized with extensive sampling of riverbeds and laboratory experiments, allowing scaling of point-level microbial and invertebrate diversity and activity on in-stream processing. This innovative technology allows fully-controlled in-situ experiments providing rich information with a high signal to noise ratio. We present the integrated nutrient demand and uptake and discuss limiting processes and elements at the reach and catchment scales. Eliot Chatton, Thierry Labasque, Jérôme de La Bernardie, Nicolas Guihéneuf, Olivier Bour, Luc Aquilina. 2016. Field Continuous Measurement of Dissolved Gases with a CF-MIMS: Applications to the Physics and Biogeochemistry of Groundwater Flow. Environ. Sci. Technol.

  4. Decentralized diagnostics based on a distributed micro-genetic algorithm for transducer networks monitoring large experimental systems.

    PubMed

    Arpaia, P; Cimmino, P; Girone, M; La Commara, G; Maisto, D; Manna, C; Pezzetti, M

    2014-09-01

    An evolutionary approach to centralized multiple-fault diagnostics is extended to distributed transducer networks monitoring large experimental systems. Given a set of anomalies detected by the transducers, each instance of the multiple-fault problem is formulated as several parallel communicating sub-tasks running on different transducers, and thus solved one-by-one on spatially separated parallel processes. A micro-genetic algorithm merges evaluation time efficiency, arising from a small-size population distributed on parallel-synchronized processors, with the effectiveness of centralized evolutionary techniques due to an optimal mix of exploitation and exploration. In this way, the holistic view and effectiveness advantages of evolutionary global diagnostics are combined with the reliability and efficiency benefits of distributed parallel architectures. The proposed approach was validated both (i) by simulation at CERN, on a case study of a cold box for enhancing the cryogenic diagnostics of the Large Hadron Collider, and (ii) by experiments, under the framework of the industrial research project MONDIEVOB (Building Remote Monitoring and Evolutionary Diagnostics), co-funded by the EU and the company Del Bo srl, Napoli, Italy.
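    A micro-genetic algorithm of the kind described keeps a very small population, relies on elitism and crossover rather than mutation, and restarts around the best individual whenever the population converges. The toy sketch below shows that loop for a generic fault-hypothesis fitness function; it is a schematic single-process illustration with an invented objective, not the authors' distributed implementation.

        import random

        random.seed(0)
        N_COMPONENTS = 12    # candidate fault locations (illustrative)
        POP_SIZE = 5         # "micro" population

        def fitness(candidate, anomalies):
            # Toy objective: how well a fault hypothesis (bit string) explains the anomalies
            return -sum(abs(c - a) for c, a in zip(candidate, anomalies))

        def micro_ga(anomalies, generations=200):
            pop = [[random.randint(0, 1) for _ in range(N_COMPONENTS)] for _ in range(POP_SIZE)]
            best = max(pop, key=lambda c: fitness(c, anomalies))
            for _ in range(generations):
                pop.sort(key=lambda c: fitness(c, anomalies), reverse=True)
                best = max(best, pop[0], key=lambda c: fitness(c, anomalies))
                # Restart around the elite when the micro-population has converged
                if all(p == pop[0] for p in pop):
                    pop = [list(best)] + [[random.randint(0, 1) for _ in range(N_COMPONENTS)]
                                          for _ in range(POP_SIZE - 1)]
                    continue
                # Elitism plus uniform crossover of parents drawn from the top individuals
                children = [list(best)]
                while len(children) < POP_SIZE:
                    a, b = random.sample(pop[:3], 2)
                    children.append([random.choice(bits) for bits in zip(a, b)])
                pop = children
            return best

        observed_anomalies = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0]
        print(micro_ga(observed_anomalies))    # best-matching fault hypothesis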

  5. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As technology nodes shrink from 14 nm to 7 nm, the reliability of tool monitoring techniques in advanced semiconductor fabs becomes more critical for achieving high yield and quality. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to check for particles, defects, and tool stability in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner to ensure proper tool stability on a periodic basis. The focus measurement on YIELDSTAR by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA) has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provides a common reference of scanner setup and user process. In order to further improve the metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of set baseline reference. This baseline technique, with either conventional BaseLiner low numerical aperture (NA=1.20) mode or advanced illumination high NA mode (NA=1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus control and monitoring on multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposure for new product/layer best focus (BF) setup.

  6. Radar image and data fusion for natural hazards characterisation

    USGS Publications Warehouse

    Lu, Zhong; Dzurisin, Daniel; Jung, Hyung-Sup; Zhang, Jixian; Zhang, Yonghong

    2010-01-01

    Fusion of synthetic aperture radar (SAR) images through interferometric, polarimetric and tomographic processing provides an all-weather imaging capability to characterise and monitor various natural hazards. This article outlines interferometric synthetic aperture radar (InSAR) processing and products and their utility for natural hazards characterisation, provides an overview of the techniques and applications related to fusion of SAR/InSAR images with optical and other images and highlights the emerging SAR fusion technologies. In addition to providing precise land-surface digital elevation maps, SAR-derived imaging products can map millimetre-scale elevation changes driven by volcanic, seismic and hydrogeologic processes, by landslides and wildfires and other natural hazards. With products derived from the fusion of SAR and other images, scientists can monitor the progress of flooding, estimate water storage changes in wetlands for improved hydrological modelling predictions and assessments of future flood impacts and map vegetation structure on a global scale and monitor its changes due to such processes as fire, volcanic eruption and deforestation. With the availability of SAR images in near real-time from multiple satellites in the near future, the fusion of SAR images with other images and data is playing an increasingly important role in understanding and forecasting natural hazards.

  7. A framework for developing urban forest ecosystem services and goods indicators

    Treesearch

    Cynnamon Dobbs; Francisco J. Escobedo; Wayne C. Zipperer

    2011-01-01

    The social and ecological processes impacting on urban forests have been studied at multiple temporal and spatial scales in order to help us quantify, monitor, and value the ecosystem services that benefit people. Few studies have comprehensively analyzed the full suite of ecosystem services, goods (ESG), and ecosystem disservices provided by an urban forest....

  8. Developing Critical Understanding by Teaching Action Research to Undergraduate Psychology Students

    ERIC Educational Resources Information Center

    Jacobs, Gaby; Murray, Michael

    2010-01-01

    Action research assumes the active engagement of the stakeholders, such as the community, in the research, and a multiple-level process of reflection in order to evaluate and monitor the actions taken. This makes action research a suitable methodology to increase the critical understanding of the participants. In this paper we describe the…

  9. Control charts for monitoring accumulating adverse event count frequencies from single and multiple blinded trials.

    PubMed

    Gould, A Lawrence

    2016-12-30

    Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
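    A conventional one-sided CUSUM for event counts accumulates the excess of each observed count over a reference value k and signals when the cumulative sum crosses a decision interval h. The short sketch below shows that mechanic for pooled (blinded) adverse event counts per reporting interval; the reference value and decision limit are arbitrary illustrations, not the Bayesian design developed in the article.

        def count_cusum(counts, k, h):
            """One-sided CUSUM on event counts: signal when S_t = max(0, S_{t-1} + x_t - k) > h."""
            s, out = 0.0, []
            for t, x in enumerate(counts):
                s = max(0.0, s + x - k)
                out.append((t, s, s > h))
            return out

        # Pooled (blinded) adverse event counts per interval; k and h are illustrative
        ae_counts = [1, 0, 2, 1, 3, 4, 5, 6]
        for t, s, alarm in count_cusum(ae_counts, k=1.5, h=5.0):
            print(f"interval {t}: CUSUM = {s:.1f}, alarm = {alarm}")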

  10. Effective solutions for monitoring the electrostatic separation of metal and plastic granular waste from electric and electronic equipment.

    PubMed

    Senouci, Khouira; Medles, Karim; Dascalescu, Lucian

    2013-02-01

    The variability of the quantity and purity of the recovered materials is a serious drawback for the application of electrostatic separation technologies to the recycling of granular wastes. In a series of previous articles we have pointed out how capability and classic control chart concepts could be employed for better mastering the outcome of such processes. In the present work, the multivariate exponentially weighted moving average (MEWMA) control chart is introduced and shown to be more effective than Hotelling's T2 chart for monitoring slowly varying changes in the electrostatic separation of granular mixtures originating from electric and electronic equipment waste. The operation of the industrial process was simulated by using a laboratory roll-type electrostatic separator and granular samples resulting from shredded electric cable wastes. The 25 tests carried out during the observation phase enabled the calculation of the upper and lower control limits for the two control charts considered in the present study. The 11 additional tests that simulated the monitoring phase pointed out that the MEWMA chart is more effective than Hotelling's T2 chart in detecting slowly varying changes in the outcome of a process. As the reverse is true in the case of abrupt alterations of monitored process performances, simultaneous usage of the two control charts is strongly recommended. While this study focused on a specific electrostatic separation process, using the MEWMA chart together with the well-known Hotelling's T2 chart should be applicable to the statistical control of other complex processes in the field of waste processing.
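    The MEWMA statistic smooths each new multivariate observation into an exponentially weighted vector and charts the resulting Hotelling-type quadratic form. The sketch below implements that textbook recipe with the asymptotic covariance of the smoothed vector; the two monitored variables, the smoothing constant and the control limit are synthetic stand-ins for the separation-process data used in the study.

        import numpy as np

        def mewma(X, mu, sigma, lam=0.2, limit=10.0):
            """Textbook MEWMA: Z_t = lam*(x_t - mu) + (1 - lam)*Z_{t-1}; chart Z_t' S_z^-1 Z_t."""
            sigma_z = (lam / (2.0 - lam)) * sigma      # asymptotic covariance of Z_t
            sz_inv = np.linalg.inv(sigma_z)
            z = np.zeros(len(mu))
            stats = []
            for x in X:
                z = lam * (x - mu) + (1.0 - lam) * z
                q = float(z @ sz_inv @ z)
                stats.append((q, q > limit))
            return stats

        rng = np.random.default_rng(3)
        mu = np.array([80.0, 95.0])                    # e.g. recovery % and purity % (invented)
        sigma = np.array([[4.0, 1.0], [1.0, 2.0]])

        # In-control phase followed by a slow drift in the first variable
        in_control = rng.multivariate_normal(mu, sigma, size=20)
        drifting = rng.multivariate_normal(mu + np.array([3.0, 0.0]), sigma, size=15)
        for t, (q, alarm) in enumerate(mewma(np.vstack([in_control, drifting]), mu, sigma)):
            print(t, round(q, 2), alarm)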

  11. Bone marrow invasion in multiple myeloma and metastatic disease.

    PubMed

    Vilanova, J C; Luna, A

    2016-04-01

    Magnetic resonance imaging (MRI) of the spine is the imaging study of choice for the management of bone marrow disease. MRI sequences enable us to integrate structural and functional information for detecting, staging, and monitoring the response the treatment of multiple myeloma and bone metastases in the spine. Whole-body MRI has been incorporated into different guidelines as the technique of choice for managing multiple myeloma and metastatic bone disease. Normal physiological changes in the yellow and red bone marrow represent a challenge in analyses to differentiate clinically significant findings from those that are not clinically significant. This article describes the findings for normal bone marrow, variants, and invasive processes in multiple myeloma and bone metastases. Copyright © 2015 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  12. Evaluation of anti-sticking layers performances for 200mm wafer scale Smart NILTM process through surface and defectivity characterizations

    NASA Astrophysics Data System (ADS)

    Delachat, F.; Phillipe, J.-C.; Larrey, V.; Fournel, F.; Bos, S.; Teyssèdre, H.; Chevalier, Xavier; Nicolet, Célia; Navarro, Christophe; Cayrefourcq, Ian

    2018-03-01

    In this work, an evaluation of various ASL processes for 200 mm wafer scale in the HERCULES® NIL equipment platform available at the CEA-Leti through the INSPIRE program is reported. The surface and adherence energies were correlated to the AFM and defectivity results in order to select the most promising ASL process for high resolution etch mask applications. The ASL performances of the selected process were evaluated by multiple working stamp fabrication using unpatterned and patterned masters, through defectivity monitoring on optical-based inspection tools. Optical and SEM defect reviews were systematically performed. Fabrication of multiple working stamps without degradation of the master defectivity was observed. This evaluation enabled benchmarking of several ASL solutions based on the grafting technology developed by ARKEMA, in order to reduce and optimize the soft stamp defectivity prior to its replication and therefore considerably reduce the final imprint defectivity for the Smart NIL process.

  13. HoloMonitor M4: holographic imaging cytometer for real-time kinetic label-free live-cell analysis of adherent cells

    NASA Astrophysics Data System (ADS)

    Sebesta, Mikael; Egelberg, Peter J.; Langberg, Anders; Lindskov, Jens-Henrik; Alm, Kersti; Janicke, Birgit

    2016-03-01

    Live-cell imaging enables studying dynamic cellular processes that cannot be visualized in fixed-cell assays. An increasing number of scientists in academia and the pharmaceutical industry are choosing live-cell analysis over or in addition to traditional fixed-cell assays. We have developed a time-lapse label-free imaging cytometer, HoloMonitor M4. HoloMonitor M4 assists researchers to overcome inherent disadvantages of fluorescent analysis, specifically effects of chemical labels or genetic modifications which can alter cellular behavior. Additionally, label-free analysis is simple and eliminates the costs associated with staining procedures. The underlying technology principle is based on digital off-axis holography. While multiple alternatives exist for this type of analysis, we prioritized our developments to achieve the following: a) All-inclusive system - hardware and sophisticated cytometric analysis software; b) Ease of use enabling utilization of instrumentation by expert- and entry-level researchers alike; c) Validated quantitative assay end-points tracked over time such as optical path length shift, optical volume and multiple derived imaging parameters; d) Reliable digital autofocus; e) Robust long-term operation in the incubator environment; f) High throughput and walk-away capability; and finally g) Data management suitable for single- and multi-user networks. We provide examples of HoloMonitor applications of label-free cell viability measurements and monitoring of cell cycle phase distribution.

  14. Key Design Elements of a Data Utility for National Biosurveillance: Event-driven Architecture, Caching, and Web Service Model

    PubMed Central

    Tsui, Fu-Chiang; Espino, Jeremy U.; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M.

    2005-01-01

    The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance—systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions. PMID:16779138

  15. Time-lapse ERT interpretation methodology for leachate injection monitoring based on multiple inversions and a clustering strategy (MICS)

    NASA Astrophysics Data System (ADS)

    Audebert, M.; Clément, R.; Touze-Foltz, N.; Günther, T.; Moreau, S.; Duquennoi, C.

    2014-12-01

    Leachate recirculation is a key process in municipal waste landfills functioning as bioreactors. To quantify the water content and to assess the leachate injection system, in-situ methods are required to obtain spatially distributed information, usually electrical resistivity tomography (ERT). This geophysical method is based on the inversion process, which presents two major problems in terms of delimiting the infiltration area. First, it is difficult for ERT users to choose an appropriate inversion parameter set. Indeed, it might not be sufficient to interpret only the optimum model (i.e. the model with the chosen regularisation strength) because it is not necessarily the model which best represents the physical process studied. Second, it is difficult to delineate the infiltration front based on resistivity models because of the smoothness of the inversion results. This paper proposes a new methodology called MICS (multiple inversions and clustering strategy), which allows ERT users to improve the delimitation of the infiltration area in leachate injection monitoring. The MICS methodology is based on (i) a multiple inversion step by varying the inversion parameter values to take a wide range of resistivity models into account and (ii) a clustering strategy to improve the delineation of the infiltration front. In this paper, MICS was assessed on two types of data. First, a numerical assessment allows us to optimise and test MICS for different infiltration area sizes, contrasts and shapes. Second, MICS was applied to a field data set gathered during leachate recirculation on a bioreactor.
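    The clustering step of MICS can be pictured as follows: each model cell is described by its resistivity across every inversion in the multiple-inversion ensemble, and a clustering algorithm then separates infiltrated from background cells more sharply than any single smoothed model. The sketch below is a schematic reconstruction on synthetic values using k-means, not the authors' code.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(4)
        n_inversions = 12

        # Synthetic ensemble: resistivity (ohm.m) of each model cell across 12 inversion runs
        background = rng.normal(60.0, 8.0, size=(300, n_inversions))
        infiltrated = rng.normal(15.0, 5.0, size=(100, n_inversions))    # leachate is conductive
        ensemble = np.vstack([background, infiltrated])

        # Cluster cells on their behaviour across the whole ensemble of inversions
        features = np.log10(np.clip(ensemble, 1.0, None))    # guard against non-physical values
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

        # The cluster with the lower mean resistivity is interpreted as the infiltration area
        cluster_means = [ensemble[labels == k].mean() for k in (0, 1)]
        infiltration = int(np.argmin(cluster_means))
        print("cells flagged as infiltrated:", int((labels == infiltration).sum()))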

  16. Interstellar scintillation observations for PSR B0355+54

    NASA Astrophysics Data System (ADS)

    Xu, Y. H.; Lee, K. J.; Hao, L. F.; Wang, H. G.; Liu, Z. Y.; Yue, Y. L.; Yuan, J. P.; Li, Z. X.; Wang, M.; Dong, J.; Tan, J. J.; Chen, W.; Bai, J. M.

    2018-06-01

    In this paper, we report our investigation of pulsar scintillation phenomena by monitoring PSR B0355+54 at 2.25 GHz for three successive months using the Kunming 40-m radio telescope. We measured the dynamic spectrum, the two-dimensional correlation function and the secondary spectrum. These observations have a high signal-to-noise ratio (S/N ≥ 100). We detected scintillation arcs, which are rarely observable using such a small telescope. The sub-microsecond scale width of the scintillation arc indicates that the transverse scale of the structures on the scattering screen is as compact as astronomical unit size. Our monitoring shows that the scintillation bandwidth, the time-scale and the arc curvature of PSR B0355+54 were varying temporally. A plausible explanation would need to invoke a multiple-scattering-screen or multiple-scattering-structure scenario, in which different screens or ray paths dominate the scintillation process at different epochs.

  17. Monitoring damage growth in titanium matrix composites using acoustic emission

    NASA Technical Reports Server (NTRS)

    Bakuckas, J. G., Jr.; Prosser, W. H.; Johnson, W. S.

    1993-01-01

    The application of the acoustic emission (AE) technique to locate and monitor damage growth in titanium matrix composites (TMC) was investigated. Damage growth was studied using several optical techniques including a long focal length, high magnification microscope system with image acquisition capabilities. Fracture surface examinations were conducted using a scanning electron microscope (SEM). The AE technique was used to locate damage based on the arrival times of AE events between two sensors. Using model specimens exhibiting a dominant failure mechanism, correlations were established between the observed damage growth mechanisms and the AE results in terms of the events amplitude. These correlations were used to monitor the damage growth process in laminates exhibiting multiple modes of damage. Results revealed that the AE technique is a viable and effective tool to monitor damage growth in TMC.

  18. A review of microdialysis coupled to microchip electrophoresis for monitoring biological events

    PubMed Central

    Saylor, Rachel A.; Lunte, Susan M.

    2015-01-01

    Microdialysis is a powerful sampling technique that enables monitoring of dynamic processes in vitro and in vivo. The combination of microdialysis with chromatographic or electrophoretic methods, along with selective detection methods, yields a "separation-based sensor" capable of monitoring multiple analytes in near real time. Analysis of microdialysis samples requires techniques that are fast (<1 min), have low volume requirements (nL–pL), and, ideally, can be employed on-line. Microchip electrophoresis fulfills these requirements and also permits the possibility of integrating sample preparation and manipulation with detection strategies directly on-chip. Microdialysis coupled to microchip electrophoresis has been employed for monitoring biological events in vivo and in vitro. This review discusses technical considerations for coupling microdialysis sampling and microchip electrophoresis, including various interface designs, and current applications in the field. PMID:25637011

  19. Shared performance monitor in a multiprocessor system

    DOEpatents

    Chiu, George; Gara, Alan G.; Salapura, Valentina

    2012-07-24

    A performance monitoring unit (PMU) and method for monitoring performance of events occurring in a multiprocessor system. The multiprocessor system comprises a plurality of processor units, each processor unit generating signals representing occurrences of events in that processor unit, and a single shared counter resource for performance monitoring. The performance monitor unit is shared by all processor cores in the multiprocessor system. The PMU comprises: a plurality of performance counters, each for counting signals representing occurrences of events from one or more of the plurality of processor units in the multiprocessor system; and a plurality of input devices for receiving the event signals from one or more of the plurality of processor units, the plurality of input devices programmable to select event signals for receipt by one or more of the plurality of performance counters for counting, wherein the PMU is shared between multiple processing units, or within a group of processors in the multiprocessing system. The PMU is further programmed to monitor event signals issued from non-processor devices.

  20. Analysis of Urinary Metabolites of Nerve and Blister Chemical Warfare Agents

    DTIC Science & Technology

    2014-08-01

    ...of CWAs. The analysis methods use UHPLC-MS/MS in Multiple Reaction Monitoring (MRM) mode to enhance the selectivity and sensitivity of the detection of CWA metabolites in urine [1].

  1. Benchmarking: measuring the outcomes of evidence-based practice.

    PubMed

    DeLise, D C; Leasure, A R

    2001-01-01

    Measurement of the outcomes associated with implementation of evidence-based practice changes is becoming increasingly emphasized by multiple health care disciplines. A final step to the process of implementing and sustaining evidence-supported practice changes is that of outcomes evaluation and monitoring. The comparison of outcomes to internal and external measures is known as benchmarking. This article discusses evidence-based practice, provides an overview of outcomes evaluation, and describes the process of benchmarking to improve practice. A case study is used to illustrate this concept.

  2. Bridging the gap between financial reporting and the revenue cycle.

    PubMed

    Clark, Kari; Bang, Derek A

    2012-09-01

    Implementing a standardized financial reporting and revenue cycle monitoring platform can help healthcare organizations improve their net revenue reporting and budgeting processes. Consistent, standardized data help the finance office estimate accounts receivable reserves more accurately, streamline the month-end closing process, and strengthen internal controls. The benefits of standardizing the finance and revenue cycle functions are particularly significant in large organizations with multiple facilities, but even single-facility providers can benefit from improved communication between the business office and finance.

  3. Operational Procedures for Collecting Water-Quality Samples at Monitoring Sites on Maple Creek Near Nickerson and the Platte River at Louisville, Eastern Nebraska

    USGS Publications Warehouse

    Johnson, Steven M.; Swanson, Robert B.

    1994-01-01

    Prototype stream-monitoring sites were operated during part of 1992 in the Central Nebraska Basins (CNBR) and three other study areas of the National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey. Results from the prototype project provide information needed to operate a network of intensive fixed-station stream-monitoring sites. This report evaluates operating procedures for two NAWQA prototype sites at Maple Creek near Nickerson and the Platte River at Louisville, eastern Nebraska. Each site was sampled intensively in the spring and late summer 1992, with less intensive sampling in midsummer. In addition, multiple samples were collected during two high-flow periods at the Maple Creek site--one early and the other late in the growing season. Water-sample analyses included determination of pesticides, nutrients, major ions, suspended sediment, and measurements of physical properties. Equipment and protocols for the water-quality sampling procedures were evaluated. Operation of the prototype stream-monitoring sites included development and comparison of onsite and laboratory sample-processing procedures. Onsite processing was labor intensive but allowed for immediate preservation of all sampled constituents. Laboratory processing required less field labor and decreased the risk of contamination, but allowed for no immediate preservation of the samples.

  4. High Temporal Resolution Permafrost Monitoring Using a Multiple Stack Insar Technique

    NASA Astrophysics Data System (ADS)

    Eppler, J.; Kubanski, M.; Sharma, J.; Busler, J.

    2015-04-01

    The combined effect of climate change and accelerated economic development in Northern regions increases the threat of permafrost related surface deformation to buildings and transportation infrastructure. Satellite based InSAR provides a means for monitoring infrastructure that may be both remote and spatially extensive. However, permafrost poses challenges for InSAR monitoring due to the complex temporal deformation patterns caused by both seasonal active layer fluctuations and long-term changes in permafrost thickness. These dynamics suggest a need for increasing the temporal resolution of multi-temporal InSAR methods. To address this issue we have developed a method that combines and jointly processes two or more same side geometry InSAR stacks to provide a high-temporal resolution estimate of surface deformation. The method allows for combining stacks from more than a single SAR sensor and for a combination of frequency bands. Data for this work have been collected and analysed for an area near the community of Umiujaq, Quebec in Northern Canada and include scenes from RADARSAT-2, TerraSAR-X and COSMO-SkyMed. Multiple stack based surface deformation estimates are compared for several cases including results from the three sensors individually and for all sensors combined. The test cases show substantially similar surface deformation results which correlate well with surficial geology. The best spatial coverage of coherent targets was achieved when data from all sensors were combined. The proposed multiple stack method is demonstrated to improve the estimation of surface deformation in permafrost affected areas and shows potential for deriving InSAR based permafrost classification maps to aid in the monitoring of Northern infrastructure.

  5. Method and apparatus for monitoring plasma processing operations

    DOEpatents

    Smith, Jr., Michael Lane; Ward, Pamela Denise Peardon; Stevenson, Joel O'Don

    2002-01-01

    The invention generally relates to various aspects of a plasma process, and more specifically the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). Another aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system. A final aspect of the present invention relates to a network of a plurality of plasma monitoring systems, including with remote capabilities (i.e., outside of the clean room).

  6. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified at high sensitivity reliably in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest, proteotypic peptides, are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  7. Monitoring Network and Interfacial Healing Processes by Broadband Dielectric Spectroscopy: A Case Study on Natural Rubber.

    PubMed

    Hernández, M; Grande, A M; van der Zwaag, S; García, S J

    2016-04-27

    Broadband dielectric spectroscopy (BDS) is introduced as a new and powerful technique to monitor network and macroscale damage healing in an elastomer. For the proof of concept, a partially cured sulfur-cured natural rubber (NR) containing reversible disulfides as the healing moiety was employed. The forms of damage healed and monitored were invisible damage in the rubber network due to multiple straining and an imposed macroscopic crack. The relaxation times of pristine, damaged, and healed samples were determined and fitted to the Havriliak-Negami equation to obtain the characteristic polymer parameters. It is shown that seemingly full mechanical healing occurred regardless of the type of damage, while BDS demonstrates that the polymer architecture in the healed material differs from that in the original one. These results represent a step forward in the understanding of damage and healing processes in intrinsic self-healing polymer systems with prospective applications such as coatings, tires, seals, and gaskets.
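    For reference, the Havriliak-Negami function to which the relaxation data were fitted has the standard form shown below (written in LaTeX notation; the symbols follow the usual dielectric-spectroscopy convention rather than anything specific to this paper):

        \varepsilon^{*}(\omega) \;=\; \varepsilon_{\infty}
            \;+\; \frac{\Delta\varepsilon}{\bigl[\,1 + (\mathrm{i}\,\omega\,\tau_{\mathrm{HN}})^{\alpha}\,\bigr]^{\beta}},
        \qquad 0 < \alpha \le 1, \quad 0 < \alpha\beta \le 1,

    where Δε is the dielectric strength (ε_s − ε_∞), τ_HN the characteristic relaxation time, and α and β the shape parameters describing the symmetric and asymmetric broadening of the relaxation peak.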

  8. Apparatus for monitoring crystal growth

    DOEpatents

    Sachs, Emanual M.

    1981-01-01

    A system and method are disclosed for monitoring the growth of a crystalline body from a liquid meniscus in a furnace. The system provides an improved human/machine interface so as to reduce operator stress, strain and fatigue while improving the conditions for observation and control of the growing process. The system comprises suitable optics for forming an image of the meniscus and body wherein the image is anamorphic so that the entire meniscus can be viewed with good resolution in both the width and height dimensions. The system also comprises a video display for displaying the anamorphic image. The video display includes means for enhancing the contrast between any two contrasting points in the image. The video display also comprises a signal averager for averaging the intensity of at least one preselected portion of the image. The value of the average intensity can, in turn, be utilized to control the growth of the body. The system and method are also capable of observing and monitoring multiple processes.

  9. Method of monitoring crystal growth

    DOEpatents

    Sachs, Emanual M.

    1982-01-01

    A system and method are disclosed for monitoring the growth of a crystalline body from a liquid meniscus in a furnace. The system provides an improved human/machine interface so as to reduce operator stress, strain and fatigue while improving the conditions for observation and control of the growing process. The system comprises suitable optics for forming an image of the meniscus and body wherein the image is anamorphic so that the entire meniscus can be viewed with good resolution in both the width and height dimensions. The system also comprises a video display for displaying the anamorphic image. The video display includes means for enhancing the contrast between any two contrasting points in the image. The video display also comprises a signal averager for averaging the intensity of at least one preselected portion of the image. The value of the average intensity can, in turn, be utilized to control the growth of the body. The system and method are also capable of observing and monitoring multiple processes.

  10. Polymer waveguide grating sensor integrated with a thin-film photodetector

    PubMed Central

    Song, Fuchuan; Xiao, Jing; Xie, Antonio Jou; Seo, Sang-Woo

    2014-01-01

    This paper presents a planar waveguide grating sensor integrated with a photodetector (PD) for on-chip optical sensing systems which are suitable for diagnostics in the field and in-situ measurements. A III–V semiconductor-based thin-film PD is integrated with a polymer-based waveguide grating device on a silicon platform. The fabricated optical sensor successfully discriminates optical spectral characteristics of the polymer waveguide grating from the on-chip PD. In addition, its potential use as a refractive index sensor is demonstrated. Based on a planar waveguide structure, the demonstrated sensor chip may incorporate multiple grating waveguide sensing regions with their own optical detection PDs. In addition, the demonstrated processing is based on a post-integration process which is compatible with silicon complementary metal-oxide semiconductor (CMOS) electronics. Potentially, this leads to a compact, chip-scale optical sensing system which can monitor multiple physical parameters simultaneously without the need for external signal processing. PMID:24466407

  11. A novel methodology for in-process monitoring of flow forming

    NASA Astrophysics Data System (ADS)

    Appleby, Andrew; Conway, Alastair; Ion, William

    2017-10-01

    Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
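    As a rough illustration of sound-signature monitoring of this kind, the sketch below flags frames whose RMS energy departs strongly from a robust baseline; the recording, frame length, and threshold are synthetic stand-ins, not the authors' processing chain.

```python
# Minimal sketch of acoustic process monitoring: flag frames whose broadband
# energy deviates strongly from a running baseline, as a crude stand-in for
# the paper's sound-signature analysis. Signal and threshold are illustrative.
import numpy as np

fs = 48_000                                  # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
signal = 0.1 * np.random.randn(t.size)       # stand-in machine noise
signal[int(6.2 * fs):int(6.25 * fs)] += 2.0 * np.random.randn(int(0.05 * fs))  # "crack"

frame = 2048
n_frames = signal.size // frame
rms = np.array([np.sqrt(np.mean(signal[i * frame:(i + 1) * frame] ** 2))
                for i in range(n_frames)])

baseline = np.median(rms)
mad = np.median(np.abs(rms - baseline))
alarms = np.where(rms > baseline + 8 * mad)[0]     # robust threshold
for i in alarms:
    print(f"possible failure event near t = {i * frame / fs:.2f} s")
```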

  12. Pattern centric design based sensitive patterns and process monitor in manufacturing

    NASA Astrophysics Data System (ADS)

    Hsiang, Chingyun; Cheng, Guojie; Wu, Kechih

    2017-03-01

    As design rules migrate to smaller dimensions, process variation requirements are tighter than ever and challenge the limits of device yield. Masks, lithography, etching and other processes have to meet very tight specifications in order to keep defects and CD within the margins of the process window. Conventionally, inspection and metrology equipment is utilized to monitor and control wafer quality in-line. In high-throughput optical inspection, nuisance filtering and review classification become a tedious, labor-intensive job in manufacturing. High-resolution SEM images are taken to validate defects after optical inspection; these images capture not only the point highlighted by optical inspection but also its surrounding patterns. However, this pattern information is not well utilized in conventional quality control methods. A complementary design-based pattern monitor not only monitors and analyzes the variation of pattern sensitivity but also reduces nuisance and highlights defective patterns or killer defects. After grouping in either single or multiple layers, systematic defects can be identified quickly in this flow. In this paper, we applied design-based pattern monitoring in different layers to monitor the impact of process variation on all kinds of patterns. First, the contour of the high-resolution SEM image is extracted and aligned to the design with offset adjustment and fine alignment [1]. Second, specified pattern rules are applied to the design clip area, the same size as the SEM image, to form POI (pattern of interest) areas. Third, the discrepancy between contour and design is measured for different pattern types in measurement blocks. Fourth, defective patterns are reported by discrepancy detection criteria and pattern grouping [4], and reported pattern defects are ranked by count and by severity of discrepancy. In this step, highly repeatable, process-sensitive systematic defects can be identified quickly. Through this design-based process pattern monitoring method, most optical inspection nuisances can be filtered out by the contour-to-design discrepancy measurement. Daily analysis results are stored in a database as a reference for comparison with incoming data. The defective pattern library contains existing and known systematic defect patterns, which helps to catch and identify new pattern defects or process impacts. In addition, this defect pattern library provides valuable information for mask, pattern and defect verification, inspection care-area generation, further OPC fixes, and process enhancement and investigation.
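    A toy version of the contour-to-design discrepancy measurement is sketched below: each extracted contour point is compared with the nearest design edge and the block is flagged when the discrepancy exceeds a criterion. The geometry and threshold are invented for illustration.

```python
# Illustrative sketch of a contour-to-design discrepancy measurement: for each
# extracted SEM contour point, find the distance to the nearest design-polygon
# edge and flag the pattern block if the discrepancy exceeds a criterion.
# Geometry and threshold here are made up for demonstration.
import numpy as np

def point_to_segment(p, a, b):
    """Distance from point p to segment ab (2-D)."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

# Design: a unit square; SEM contour: a rounded shape shrunk by "CD loss"
design = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]], float)
theta = np.linspace(0, 2 * np.pi, 200)
contour = 0.5 + 0.45 * np.column_stack([np.cos(theta), np.sin(theta)])

edges = list(zip(design[:-1], design[1:]))
disc = np.array([min(point_to_segment(p, a, b) for a, b in edges)
                 for p in contour])

threshold = 0.08                             # hypothetical spec in design units
print("max discrepancy:", disc.max())
print("defective pattern" if (disc > threshold).any() else "within spec")
```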

  13. Differential Dishabituation as a Function of Magnitude of Stimulus Discrepancy and Sex of the Newborn Infant.

    ERIC Educational Resources Information Center

    Friedman, Steven; And Others

    This study uses a habituation paradigm to systematically investigate the discrepancy hypothesis with male and female newborns. In addition, multiple visual response measures are used in monitoring the habituation process and the infant's response to various degrees of novelty. Ss were 36 apparently normal newborns (half of each sex) ranging in…

  14. Multiple Sclerosis: Epidemiologic, Clinical, and Therapeutic Aspects.

    PubMed

    Vidal-Jordana, Angela; Montalban, Xavier

    2017-05-01

    Multiple sclerosis (MS) is a chronic autoimmune and degenerative disease of the central nervous system that affects young people. MS develops in genetically susceptible individuals exposed to different unknown triggering factors. Different phenotypes are described. About 15% of patients present with a primary progressive course and 85% with a relapsing-remitting course. An increasing number of disease-modifying treatments has emerged. Although encouraging, the number of drugs challenges the neurologist because each treatment has its own risk-benefit profile. Patients should be involved in the decision-making process to ensure good adherence to treatment and to safety monitoring. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Using a Sequence of Earcons to Monitor Multiple Simulated Patients.

    PubMed

    Hickling, Anna; Brecknell, Birgit; Loeb, Robert G; Sanderson, Penelope

    2017-03-01

    The aim of this study was to determine whether a sequence of earcons can effectively convey the status of multiple processes, such as the status of multiple patients in a clinical setting. Clinicians often monitor multiple patients. An auditory display that intermittently conveys the status of multiple patients may help. Nonclinician participants listened to sequences of 500-ms earcons that each represented the heart rate (HR) and oxygen saturation (SpO2) levels of a different simulated patient. In each sequence, one, two, or three patients had an abnormal level of HR and/or SpO2. In Experiment 1, participants reported which of nine patients in a sequence were abnormal. In Experiment 2, participants identified the vital signs of one, two, or three abnormal patients in sequences of one, five, or nine patients, where the interstimulus interval (ISI) between earcons was 150 ms. Experiment 3 used the five-sequence condition of Experiment 2, but the ISI was either 150 ms or 800 ms. Participants reported which patient(s) were abnormal with a median accuracy of 95%. Identification accuracy for vital signs decreased as the number of abnormal patients increased from one to three, p < .001, but accuracy was unaffected by the number of patients in a sequence. Overall, identification accuracy was significantly higher with an ISI of 800 ms (89%) compared with an ISI of 150 ms (83%), p < .001. A multiple-patient display can be created by cycling through earcons that represent individual patients. The principles underlying the multiple-patient display can be extended to other vital signs, designs, and domains.

  16. Using normalisation process theory to understand barriers and facilitators to implementing mindfulness-based stress reduction for people with multiple sclerosis.

    PubMed

    Simpson, Robert; Simpson, Sharon; Wood, Karen; Mercer, Stewart W; Mair, Frances S

    2018-01-01

    Objectives: To study barriers and facilitators to implementation of mindfulness-based stress reduction for people with multiple sclerosis. Methods: Qualitative interviews with 33 people with multiple sclerosis, 6 multiple sclerosis clinicians and 2 course instructors were used to explore barriers and facilitators to implementation of mindfulness-based stress reduction. Normalisation process theory provided the underpinning conceptual framework. Data were analysed deductively using normalisation process theory constructs (coherence, cognitive participation, collective action and reflexive monitoring). Results: Key barriers included mismatched stakeholder expectations, lack of knowledge about mindfulness-based stress reduction, high levels of comorbidity and disability, and skepticism about embedding mindfulness-based stress reduction in routine multiple sclerosis care. Facilitators to implementation included introducing a pre-course orientation session and adapting mindfulness-based stress reduction to accommodate comorbidity and disability; participants suggested smaller, shorter classes, shortened practices, exclusion of mindful walking, and more time with peers. Post-mindfulness-based stress reduction booster sessions may be required, and objective and subjective reports of benefit would increase clinician confidence in mindfulness-based stress reduction. Discussion: Multiple sclerosis patients and clinicians know little about mindfulness-based stress reduction. Mismatched expectations are a barrier to participation, as is rigid application of mindfulness-based stress reduction in the context of disability. Course adaptations in response to patient needs would facilitate uptake and utilisation. Rendering access to mindfulness-based stress reduction rapid and flexible could facilitate implementation. Embedded outcome assessment is desirable.

  17. The use of process mapping in healthcare quality improvement projects.

    PubMed

    Antonacci, Grazia; Reed, Julie E; Lennox, Laura; Barlow, James

    2018-05-01

    Introduction Process mapping provides insight into systems and processes in which improvement interventions are introduced and is seen as useful in healthcare quality improvement projects. There is little empirical evidence on the use of process mapping in healthcare practice. This study advances understanding of the benefits and success factors of process mapping within quality improvement projects. Methods Eight quality improvement projects were purposively selected from different healthcare settings within the UK's National Health Service. Data were gathered from multiple data-sources, including interviews exploring participants' experience of using process mapping in their projects and perceptions of benefits and challenges related to its use. These were analysed using inductive analysis. Results Eight key benefits related to process mapping use were reported by participants (gathering a shared understanding of the reality; identifying improvement opportunities; engaging stakeholders in the project; defining project's objectives; monitoring project progress; learning; increased empathy; simplicity of the method) and five factors related to successful process mapping exercises (simple and appropriate visual representation, information gathered from multiple stakeholders, facilitator's experience and soft skills, basic training, iterative use of process mapping throughout the project). Conclusions Findings highlight benefits and versatility of process mapping and provide practical suggestions to improve its use in practice.

  18. Wearable sensors for human health monitoring

    NASA Astrophysics Data System (ADS)

    Asada, H. Harry; Reisner, Andrew

    2006-03-01

    Wearable sensors for continuous monitoring of vital signs for extended periods of weeks or months are expected to revolutionize healthcare services in the home and workplace as well as in hospitals and nursing homes. This invited paper describes recent research progress in wearable health monitoring technology and its clinical applications, with emphasis on blood pressure and circulatory monitoring. First, a finger ring-type wearable blood pressure sensor based on a photoplethysmogram is presented. Technical issues, including motion artifact reduction, power saving, and wearability enhancement, will be addressed. Second, sensor fusion and sensor networking for integrating multiple sensors with diverse modalities will be discussed for comprehensive monitoring and diagnosis of health status. Unlike traditional snap-shot measurements, continuous monitoring with wearable sensors opens up the possibility of treating the physiological system as a dynamical process. This allows us to apply powerful system dynamics and control methodologies, such as adaptive filtering, single- and multi-channel system identification, active noise cancellation, and adaptive control, to the monitoring and treatment of highly complex physiological systems. A few clinical trials illustrate the potential of the wearable sensor technology for future health care services.
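    One of the system-dynamics tools mentioned, adaptive noise cancellation, can be illustrated with a least-mean-squares filter that uses an accelerometer-like reference to remove motion artifact from a simulated photoplethysmogram; the signals, filter length, and step size below are synthetic assumptions.

```python
# Hedged sketch of adaptive noise cancellation: an LMS filter uses an
# accelerometer-like reference to remove motion artifact from a simulated PPG
# signal. Signals and step size are illustrative, not from the paper.
import numpy as np

fs = 100.0
t = np.arange(0, 30, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t)                  # ~72 bpm pulse
motion_ref = np.sin(2 * np.pi * 0.4 * t + 0.7)     # accelerometer reference
artifact = 1.5 * motion_ref + 0.3 * np.roll(motion_ref, 5)
measured = ppg + artifact

n_taps, mu = 16, 0.01
w = np.zeros(n_taps)
cleaned = np.zeros_like(measured)
for n in range(n_taps, len(measured)):
    x = motion_ref[n - n_taps:n][::-1]             # reference tap vector
    y = np.dot(w, x)                               # artifact estimate
    e = measured[n] - y                            # cleaned sample
    w += 2 * mu * e * x                            # LMS weight update
    cleaned[n] = e

print("artifact power before:", np.var(measured - ppg))
print("residual power after :", np.var(cleaned[n_taps:] - ppg[n_taps:]))
```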

  19. Method and apparatus for monitoring plasma processing operations

    DOEpatents

    Smith, Jr., Michael Lane; Stevenson, Joel O'Don; Ward, Pamela Peardon Denise

    2001-01-01

    The invention generally relates to various aspects of a plasma process, and more specifically the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). A final aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system.
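    As a hedged illustration of one monitoring task described here (endpoint determination from optical emissions), the sketch below calls an etch endpoint when a smoothed emission-intensity trace drops below a threshold; the trace, threshold, and timing are synthetic, not values from the patent.

```python
# Illustrative sketch of plasma-etch endpoint detection from an optical-emission
# intensity trace: smooth the trace and call the endpoint when it falls below a
# threshold. The trace and threshold are synthetic stand-ins.
import numpy as np

t = np.arange(0, 120, 0.5)                          # seconds
intensity = np.where(t < 80, 1.0, 0.35)             # film clears near t = 80 s
intensity += 0.02 * np.random.randn(t.size)

window = 10                                         # samples in moving average
smoothed = np.convolve(intensity, np.ones(window) / window, mode="same")
endpoint_idx = np.argmax(smoothed < 0.6)            # first smoothed sample below threshold
print(f"endpoint called at t = {t[endpoint_idx]:.1f} s")
```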

  20. Method and apparatus for monitoring plasma processing operations

    DOEpatents

    Smith, Jr., Michael Lane; Stevenson, Joel O'Don; Ward, Pamela Peardon Denise

    2001-01-01

    The invention generally relates to various aspects of a plasma process, and more specifically the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). A final aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system.

  1. Method and apparatus for monitoring plasma processing operations

    DOEpatents

    Smith, Jr., Michael Lane; Stevenson, Joel O'Don; Ward, Pamela Peardon Denise

    2000-01-01

    The invention generally relates to various aspects of a plasma process, and more specifically the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). A final aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system.

  2. Method and apparatus for monitoring plasma processing operations

    DOEpatents

    Smith, Jr., Michael Lane; Stevenson, Joel O'Don; Ward, Pamela Peardon Denise

    2002-07-16

    The invention generally relates to various aspects of a plasma process, and more specifically the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). A final aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system.

  3. Computerized procedures system

    DOEpatents

    Lipner, Melvin H.; Mundy, Roger A.; Franusich, Michael D.

    2010-10-12

    An online data driven computerized procedures system that guides an operator through a complex process facility's operating procedures. The system monitors plant data, processes the data and then, based upon this processing, presents the status of the current procedure step and/or substep to the operator. The system supports multiple users and a single procedure definition supports several interface formats that can be tailored to the individual user. Layered security controls access privileges and revisions are version controlled. The procedures run on a server that is platform independent of the user workstations that the server interfaces with and the user interface supports diverse procedural views.

  4. Understanding current steam sterilization recommendations and guidelines.

    PubMed

    Spry, Cynthia

    2008-10-01

    Processing surgical instruments in preparation for surgery is a complex multistep practice. It is impractical to culture each and every item to determine sterility; therefore, the best assurance of a sterile product is careful execution of every step in the process coupled with an ongoing quality control program. Perioperative staff nurses and managers responsible for instrument processing, whether for a single instrument or multiple sets, must be knowledgeable with regard to cleaning; packaging; cycle selection; and the use of physical, chemical, and biological monitors. Nurses also should be able to resolve issues related to loaner sets, flash sterilization, and extended cycles.

  5. Updating Parameters for Volcanic Hazard Assessment Using Multi-parameter Monitoring Data Streams And Bayesian Belief Networks

    NASA Astrophysics Data System (ADS)

    Odbert, Henry; Aspinall, Willy

    2014-05-01

    Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
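    The core updating step a BBN performs can be illustrated with a single Bayes calculation: revising the probability of an escalation state when one monitoring observation arrives. The probabilities below are invented for illustration and are not values from the Soufriere Hills analysis.

```python
# Minimal sketch of the kind of Bayesian updating a BBN performs: revise the
# probability of an "escalation" state when a monitoring observation (e.g.
# elevated seismicity) arrives. All probabilities here are invented.
p_escalation = 0.10                        # prior belief in escalation
p_obs_given_esc = 0.70                     # P(elevated seismicity | escalation)
p_obs_given_not = 0.15                     # P(elevated seismicity | no escalation)

evidence = p_obs_given_esc * p_escalation + p_obs_given_not * (1 - p_escalation)
posterior = p_obs_given_esc * p_escalation / evidence
print(f"P(escalation | elevated seismicity) = {posterior:.2f}")
```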

  6. Combining Volcano Monitoring Timeseries Analyses with Bayesian Belief Networks to Update Hazard Forecast Estimates

    NASA Astrophysics Data System (ADS)

    Odbert, Henry; Hincks, Thea; Aspinall, Willy

    2015-04-01

    Volcanic hazard assessments must combine information about the physical processes of hazardous phenomena with observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We show how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.

  7. Exploring resting-state EEG brain oscillatory activity in relation to cognitive functioning in multiple sclerosis.

    PubMed

    Keune, Philipp M; Hansen, Sascha; Weber, Emily; Zapf, Franziska; Habich, Juliane; Muenssinger, Jana; Wolf, Sebastian; Schönenberg, Michael; Oschmann, Patrick

    2017-09-01

    Neurophysiologic monitoring parameters related to cognition in Multiple Sclerosis (MS) are sparse. Previous work reported an association between magnetoencephalographic (MEG) alpha-1 activity and information processing speed. While this remains to be replicated with more widely available electroencephalographic (EEG) methods, other established EEG markers, e.g. the slow-wave/fast-wave ratio (theta/beta ratio), also remain to be explored in this context. Performance on standard tests addressing information processing speed and attention (Symbol-Digit Modalities Test, SDMT; Test of Attention Performance, TAP) was examined in relation to resting-state EEG alpha-1 and alpha-2 activity and the theta/beta ratio in 25 MS patients. Increased global alpha-1 and alpha-2 activity and an increased frontal theta/beta ratio (pronounced slow-wave relative to fast-wave activity) were associated with lower SDMT processing speed. In an exploratory analysis, clinically impaired attention was associated with a significantly increased frontal theta/beta ratio, whereas alpha power did not show sensitivity to clinical impairment. EEG global alpha power and the frontal theta/beta ratio were both associated with attention. The theta/beta ratio showed potential clinical sensitivity. Resting-state EEG recordings can be obtained during the routine clinical process. The examined resting-state measures may represent feasible monitoring parameters in MS. This notion should be explored in future intervention studies. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
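    For readers unfamiliar with the marker, a theta/beta ratio can be computed from a resting-state EEG channel with Welch's method as sketched below; the signal, sampling rate, and band edges are illustrative rather than the study's exact pipeline.

```python
# Sketch of computing a theta/beta ratio from one resting-state EEG channel
# with Welch's method. The signal, sampling rate, and band edges are
# illustrative assumptions, not the study's processing pipeline.
import numpy as np
from scipy.signal import welch

fs = 250.0
t = np.arange(0, 60, 1 / fs)
eeg = (20e-6 * np.sin(2 * np.pi * 6 * t)        # theta component
       + 8e-6 * np.sin(2 * np.pi * 20 * t)      # beta component
       + 5e-6 * np.random.randn(t.size))        # noise

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

theta = band_power(freqs, psd, 4, 8)
beta = band_power(freqs, psd, 13, 30)
print("theta/beta ratio:", theta / beta)
```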

  8. Analysing and correcting the differences between multi-source and multi-scale spatial remote sensing observations.

    PubMed

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was employed to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was applied to correct the multiple surface reflectance datasets based on the physical characteristics and mathematical distribution properties obtained above and their spatial variations. This proposed method was verified by two sets of multiple satellite images, which were obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences of surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and for the corresponding consistency analysis and evaluation.
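    The correction idea can be sketched in a few lines: take the fine-scale reflectance as the baseline and rescale a coarser dataset so that its Gaussian statistics (mean and standard deviation) match the baseline's. The arrays below are synthetic.

```python
# Hedged sketch of the correction idea: treat the fine-scale reflectance as the
# baseline and rescale a coarser dataset so its Gaussian statistics (mean and
# standard deviation) match the baseline's. The arrays below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
baseline = rng.normal(0.25, 0.04, size=10_000)     # fine-scale reflectance
coarse = rng.normal(0.31, 0.06, size=10_000)       # biased coarser dataset

corrected = (coarse - coarse.mean()) / coarse.std() * baseline.std() + baseline.mean()

print("baseline  mean/std:", baseline.mean(), baseline.std())
print("corrected mean/std:", corrected.mean(), corrected.std())
```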

  9. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    PubMed Central

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was employed to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was applied to correct the multiple surface reflectance datasets based on the physical characteristics and mathematical distribution properties obtained above and their spatial variations. This proposed method was verified by two sets of multiple satellite images, which were obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences of surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and for the corresponding consistency analysis and evaluation. PMID:25405760

  10. Integrating land cover modeling and adaptive management to conserve endangered species and reduce catastrophic fire risk

    USGS Publications Warehouse

    Breininger, David; Duncan, Brean; Eaton, Mitchell J.; Johnson, Fred; Nichols, James

    2014-01-01

    Land cover modeling is used to inform land management, but most often via a two-step process, where science informs how management alternatives can influence resources, and then, decision makers can use this information to make decisions. A more efficient process is to directly integrate science and decision-making, where science allows us to learn in order to better accomplish management objectives and is developed to address specific decisions. Co-development of management and science is especially productive when decisions are complicated by multiple objectives and impeded by uncertainty. Multiple objectives can be met by the specification of tradeoffs, and relevant uncertainty can be addressed through targeted science (i.e., models and monitoring). We describe how to integrate habitat and fuel monitoring with decision-making focused on the dual objectives of managing for endangered species and minimizing catastrophic fire risk. Under certain conditions, both objectives might be achieved by a similar management policy; other conditions require tradeoffs between objectives. Knowledge about system responses to actions can be informed by developing hypotheses based on ideas about fire behavior and then applying competing management actions to different land units in the same system state. Monitoring and management integration is important to optimize state-specific management decisions and to increase knowledge about system responses. We believe this approach has broad utility and identifies a clear role for land cover modeling programs intended to inform decision-making.

  11. Implant for in-vivo parameter monitoring, processing and transmitting

    DOEpatents

    Ericson, Milton N [Knoxville, TN; McKnight, Timothy E [Greenback, TN; Smith, Stephen F [London, TN; Hylton, James O [Clinton, TN

    2009-11-24

    The present invention relates to a completely implantable intracranial pressure monitor, which can couple to existing fluid shunting systems as well as other internal monitoring probes. The implant sensor produces an analog data signal which is then converted electronically to a digital pulse by generation of a spreading code signal and then transmitted to a location outside the patient by a radio-frequency transmitter to an external receiver. The implanted device can receive power from an internal source as well as an inductive external source. Remote control of the implant is also provided by a control receiver which passes commands from an external source to the implant system logic. Alarm parameters can be programmed into the device which are capable of producing an audible or visual alarm signal. The utility of the monitor can be greatly expanded by using multiple pressure sensors simultaneously or by combining sensors of various physiological types.

  12. Long-term synchronized electrophysiological and behavioral wireless monitoring of freely moving animals

    PubMed Central

    Grand, Laszlo; Ftomov, Sergiu; Timofeev, Igor

    2012-01-01

    Parallel electrophysiological recording and behavioral monitoring of freely moving animals is essential for a better understanding of the neural mechanisms underlying behavior. In this paper we describe a novel wireless recording technique, which is capable of synchronously recording in vivo multichannel electrophysiological (LFP, MUA, EOG, EMG) and activity data (accelerometer, video) from freely moving cats. The method is based on the integration of commercially available components into a simple monitoring system and is complete with accelerometers and the needed signal processing tools. LFP activities of freely moving group-housed cats were recorded from multiple intracortical areas and from the hippocampus. EMG, EOG, accelerometer and video were simultaneously acquired with LFP activities 24 h a day for 3 months. These recordings confirm the possibility of using our wireless method for 24-h long-term monitoring of neurophysiological and behavioral data of freely moving experimental animals such as cats, ferrets, rabbits and other large animals. PMID:23099345

  13. A Four Channel Beam Current Monitor Data Acquisition System Using Embedded Processors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheat, Jr., Robert Mitchell; Dalmas, Dale A.; Dale, Gregory E.

    2015-08-11

    Data acquisition from multiple beam current monitors is required for electron accelerator production of Mo-99. A two channel system capable of recording data from two beam current monitors has been developed, is currently in use, and is discussed below. The development of a cost-effective method of extending this system to more than two channels and the integration of these measurements into an accelerator control system is the main focus of this report. Data from these current monitors is digitized, processed, and stored by a digital data acquisition system. Limitations and drawbacks with the currently deployed digital data acquisition system have been identified, as have potential solutions, or at least improvements, to these problems. This report will discuss and document the efforts we've made in improving the flexibility and lowering the cost of the data acquisition system while maintaining the minimum requirements.

  14. A clinical decision-making mechanism for context-aware and patient-specific remote monitoring systems using the correlations of multiple vital signs.

    PubMed

    Forkan, Abdur Rahim Mohammad; Khalil, Ibrahim

    2017-02-01

    In home-based context-aware monitoring, patients' real-time data for multiple vital signs (e.g. heart rate, blood pressure) are continuously generated by wearable sensors. The changes in such vital parameters are highly correlated. They are also patient-centric and can either recur or fluctuate. The objective of this study is to develop an intelligent method for personalized monitoring and clinical decision support through early estimation of patient-specific vital sign values, and prediction of anomalies using the interrelation among multiple vital signs. In this paper, multi-label classification algorithms are applied in classifier design to forecast these values and related abnormalities. We proposed a completely new approach of patient-specific vital sign prediction using their correlations. The developed technique can guide healthcare professionals to make accurate clinical decisions. Moreover, our model can support many patients with various clinical conditions concurrently by utilizing the power of cloud computing technology. The developed method also reduces the rate of false predictions in remote monitoring centres. In the experimental settings, the statistical features and correlations of six vital signs are formulated as a multi-label classification problem. Eight multi-label classification algorithms along with three fundamental machine learning algorithms are used and tested on a public dataset of 85 patients. Different multi-label classification evaluation measures such as Hamming score, F1-micro average, and accuracy are used for interpreting the prediction performance of patient-specific situation classifications. We achieved 90-95% Hamming score values across 24 classifier combinations for the 85 patients used in our experiment. The results are compared with single-label classifiers and without considering the correlations among the vitals. The comparisons show that the multi-label method is the best technique for this problem domain. The evaluation results reveal that multi-label classification techniques using the correlations among multiple vitals are effective ways for early estimation of future values of those vitals. In context-aware remote monitoring this process can greatly help the doctors in quick diagnostic decision making. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
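    A minimal sketch of framing the problem as multi-label classification (one label per vital sign) is given below, using scikit-learn components as stand-ins for the algorithms compared in the paper; the data are randomly generated.

```python
# Sketch of framing vital-sign abnormality prediction as multi-label
# classification (one label per vital sign), with scikit-learn components as
# stand-ins for the algorithms compared in the paper. Data are random.
import numpy as np
from sklearn.multioutput import MultiOutputClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import hamming_loss

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 12))                  # statistical features of vitals
Y = (X[:, :6] + 0.3 * rng.normal(size=(500, 6)) > 0.5).astype(int)  # 6 abnormality labels

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)
clf = MultiOutputClassifier(RandomForestClassifier(n_estimators=100, random_state=0))
clf.fit(X_tr, Y_tr)

Y_hat = clf.predict(X_te)
print("Hamming score:", 1.0 - hamming_loss(Y_te, Y_hat))
```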

  15. A formalized approach to making effective natural resource management decisions for Alaska National Parks

    USGS Publications Warehouse

    MacCluskie, Margaret C.; Romito, Angela; Peterson, James T.; Lawler, James P.

    2015-01-01

    A fundamental goal of the National Park Service (NPS) is the long-term protection and management of resources in the National Park System. Reaching this goal requires multiple approaches, including the conservation of essential habitats and the identification and elimination of potential threats to biota and habitats. To accomplish these goals, the NPS has implemented the Alaska Region Vital Signs Inventory and Monitoring (I&M) Program to monitor key biological, chemical, and physical components of ecosystems at more than 270 national parks. The Alaska Region has four networks—Arctic, Central, Southeast, and Southwest. By monitoring vital signs over large spatial and temporal scales, park managers are provided with information on the status and trajectory of park resources as well as a greater understanding and insight into the ecosystem dynamics. While detecting and quantifying change is important to conservation efforts, to be useful for formulating remedial actions, monitoring data must explicitly relate to management objectives and be collected in such a manner as to resolve key uncertainties about the dynamics of the system (Nichols and Williams 2006). Formal decision making frameworks (versus more traditional processes described below) allow for the explicit integration of monitoring data into decision making processes to improve the understanding of system dynamics, thereby improving future decisions (Williams 2011).

  16. A method for achieving an order-of-magnitude increase in the temporal resolution of a standard CRT computer monitor.

    PubMed

    Fiesta, Matthew P; Eagleman, David M

    2008-09-15

    As the frequency of a flickering light is increased, the perception of flicker is replaced by the perception of steady light at what is known as the critical flicker fusion threshold (CFFT). This threshold provides a useful measure of the brain's information processing speed, and has been used in medicine for over a century for both diagnostic and drug-efficacy studies. However, the hardware for presenting the stimulus has not advanced to take advantage of computers, largely because the refresh rates of typical monitors are too slow to provide fine-grained changes in the alternation rate of a visual stimulus. For example, a cathode ray tube (CRT) computer monitor running at 100 Hz will render a new frame every 10 ms, thus restricting the period of a flickering stimulus to multiples of 20 ms. These multiples provide a temporal resolution far too low to make precise threshold measurements, since typical CFFT values are in the neighborhood of 35 ms. We describe here a simple and novel technique to enable alternating images at several closely-spaced periods on a standard monitor. The key to our technique is to programmatically control the video card to dynamically reset the refresh rate of the monitor. Different refresh rates allow slightly different frame durations; this can be leveraged to vastly increase the resolution of stimulus presentation times. This simple technique opens new inroads for experiments on computers that require more finely-spaced temporal resolution than a monitor at a single, fixed refresh rate can allow.
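    The timing argument can be made concrete with a small calculation of the alternation periods achievable at several plausible refresh rates, showing how switching rates yields finely spaced periods near a ~35 ms threshold; the rates listed are examples, not the authors' exact set.

```python
# Worked example of the timing argument: alternation periods achievable by
# switching an image every k frames at several example CRT refresh rates.
# Switching the refresh rate yields finely spaced flicker periods near ~35 ms.
refresh_rates_hz = [85, 100, 120, 140, 160]      # example rates a CRT might support

periods_ms = sorted(
    {round(2 * k * 1000.0 / r, 2)                # alternation period = 2 phases of k frames
     for r in refresh_rates_hz
     for k in range(1, 5)}
)
near_threshold = [p for p in periods_ms if 20 <= p <= 50]
print("achievable alternation periods (ms):", near_threshold)
```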

  17. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  18. Guiding Principles for a Pediatric Neurology ICU (neuroPICU) Bedside Multimodal Monitor

    PubMed Central

    Eldar, Yonina C.; Gopher, Daniel; Gottlieb, Amihai; Lammfromm, Rotem; Mangat, Halinder S; Peleg, Nimrod; Pon, Steven; Rozenberg, Igal; Schiff, Nicholas D; Stark, David E; Yan, Peter; Pratt, Hillel; Kosofsky, Barry E

    2016-01-01

    Background: Physicians caring for children with serious acute neurologic disease must process overwhelming amounts of physiological and medical information. Strategies to optimize real time display of this information are understudied. Objectives: Our goal was to engage clinical and engineering experts to develop guiding principles for creating a pediatric neurology intensive care unit (neuroPICU) monitor that integrates and displays data from multiple sources in an intuitive and informative manner. Methods: To accomplish this goal, an international group of physicians and engineers communicated regularly for one year. We integrated findings from clinical observations, interviews, a survey, signal processing, and visualization exercises to develop a concept for a neuroPICU display. Results: Key conclusions from our efforts include: (1) A neuroPICU display should support (a) rapid review of retrospective time series (i.e. cardiac, pulmonary, and neurologic physiology data), (b) rapidly modifiable formats for viewing that data according to the specialty of the reviewer, and (c) communication of the degree of risk of clinical decline. (2) Specialized visualizations of physiologic parameters can highlight abnormalities in multivariable temporal data. Examples include 3-D stacked spider plots and color coded time series plots. (3) Visual summaries of EEG with spectral tools (i.e. hemispheric asymmetry and median power) can highlight seizures via patient-specific “fingerprints.” (4) Intuitive displays should emphasize subsets of physiology and processed EEG data to provide a rapid gestalt of the current status and medical stability of a patient. Conclusions: A well-designed neuroPICU display must present multiple datasets in dynamic, flexible, and informative views to accommodate clinicians from multiple disciplines in a variety of clinical scenarios. PMID:27437048

  19. Guiding Principles for a Pediatric Neurology ICU (neuroPICU) Bedside Multimodal Monitor: Findings from an International Working Group.

    PubMed

    Grinspan, Zachary M; Eldar, Yonina C; Gopher, Daniel; Gottlieb, Amihai; Lammfromm, Rotem; Mangat, Halinder S; Peleg, Nimrod; Pon, Steven; Rozenberg, Igal; Schiff, Nicholas D; Stark, David E; Yan, Peter; Pratt, Hillel; Kosofsky, Barry E

    2016-01-01

    Physicians caring for children with serious acute neurologic disease must process overwhelming amounts of physiological and medical information. Strategies to optimize real time display of this information are understudied. Our goal was to engage clinical and engineering experts to develop guiding principles for creating a pediatric neurology intensive care unit (neuroPICU) monitor that integrates and displays data from multiple sources in an intuitive and informative manner. To accomplish this goal, an international group of physicians and engineers communicated regularly for one year. We integrated findings from clinical observations, interviews, a survey, signal processing, and visualization exercises to develop a concept for a neuroPICU display. Key conclusions from our efforts include: (1) A neuroPICU display should support (a) rapid review of retrospective time series (i.e. cardiac, pulmonary, and neurologic physiology data), (b) rapidly modifiable formats for viewing that data according to the specialty of the reviewer, and (c) communication of the degree of risk of clinical decline. (2) Specialized visualizations of physiologic parameters can highlight abnormalities in multivariable temporal data. Examples include 3-D stacked spider plots and color coded time series plots. (3) Visual summaries of EEG with spectral tools (i.e. hemispheric asymmetry and median power) can highlight seizures via patient-specific "fingerprints." (4) Intuitive displays should emphasize subsets of physiology and processed EEG data to provide a rapid gestalt of the current status and medical stability of a patient. A well-designed neuroPICU display must present multiple datasets in dynamic, flexible, and informative views to accommodate clinicians from multiple disciplines in a variety of clinical scenarios.

  20. Image Corruption Detection in Diffusion Tensor Imaging for Post-Processing and Real-Time Monitoring

    PubMed Central

    Li, Yue; Shea, Steven M.; Lorenz, Christine H.; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer a significant amount of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when there are multiple corrupted data, this method may no longer correctly identify and reject the corrupted data. In this paper, we introduce a new criterion called “corrected Inter-Slice Intensity Discontinuity” (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies. PMID:24204551
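    In the spirit of an inter-slice intensity-discontinuity criterion, the sketch below compares each slice's mean signal with its neighbours and flags outliers; the volume, simulated dropout, and threshold are synthetic, and the published cISID includes corrections not reproduced here.

```python
# Hedged sketch of an inter-slice intensity-discontinuity check in the spirit
# of cISID: compare each slice's mean signal with its neighbours and flag
# slices adjacent to an anomalous jump. Volume, dropout, and threshold are
# synthetic; the real criterion includes corrections not reproduced here.
import numpy as np

rng = np.random.default_rng(2)
volume = rng.normal(100, 5, size=(40, 64, 64))     # slices x rows x cols
volume[23] *= 0.6                                   # simulate signal dropout

slice_means = volume.mean(axis=(1, 2))
discontinuity = np.abs(np.diff(slice_means))        # jump between adjacent slices
z = (discontinuity - discontinuity.mean()) / discontinuity.std()
jumps = np.where(z > 3)[0]
suspect = np.unique(np.concatenate([jumps, jumps + 1]))  # slices bordering a jump
print("suspect slice indices:", suspect)
```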

  1. Multi-terminal remote monitoring and warning system using Micro Air Vehicle for dangerous environment

    NASA Astrophysics Data System (ADS)

    Yu, Yanan; Wang, Xiaoxun; He, Chengcheng; Lai, Chenlong; Liu, Yuanchao

    2015-11-01

    To address the problems of remote operation and dangerous tasks, a multi-terminal remote monitoring and warning system based on an STC89C52 microcontroller and wireless communication was proposed. The system, with the MCU as its core, adopted multiple sets of sensor devices to monitor environmental parameters at different locations, such as temperature, humidity, smoke and other harmful gas concentrations. The collected data were transmitted remotely by a wireless transceiver module, and the multi-channel data were then processed and displayed on a PC through a serial communication protocol between the module and the PC. The results could be checked in the form of web pages within a local network, providing wireless monitoring and warning. For remote operation, a four-rotor micro air vehicle carrying an airborne data acquisition device was used as middleware between the collecting terminals and the PC to increase the monitoring scope. The whole test system is simple to construct, convenient, real-time capable, and highly reliable, and could meet the requirements of practical use.

  2. Brief Communication: A low-cost Arduino®-based wire extensometer for earth flow monitoring

    NASA Astrophysics Data System (ADS)

    Guerriero, Luigi; Guerriero, Giovanni; Grelle, Gerardo; Guadagno, Francesco M.; Revellino, Paola

    2017-06-01

    Continuous monitoring of earth flow displacement is essential for understanding the dynamics of the process, its ongoing evolution, and the design of mitigation measures. Despite its importance, it is not always applied due to its expense and the need for integration with additional sensors to monitor factors controlling movement. To overcome these problems, we developed and tested a low-cost Arduino-based wire-rail extensometer integrating a data logger, a power system and multiple digital and analog inputs. The system is equipped with a high-precision position transducer that in the test configuration offers a measuring range of 1023 mm and an associated accuracy of ±1 mm, and integrates an operating temperature sensor that should allow the potential thermal drift that typically affects this kind of system to be identified and corrected. A field test, conducted at the Pietrafitta earth flow where additional monitoring systems had been installed, indicates a high reliability of the measurement and a high monitoring stability without visible thermal drift.
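    The logging-side arithmetic can be illustrated as below: convert a raw ADC reading from the position transducer to millimetres over the 1023 mm range and apply a simple linear thermal-drift correction. The ADC resolution and drift coefficient are assumptions, not values reported by the authors.

```python
# Illustrative sketch of the logging-side arithmetic for a wire extensometer:
# convert a raw ADC reading from the position transducer to millimetres over a
# 1023 mm range and apply a first-order thermal-drift correction. The ADC
# resolution and drift coefficient are assumptions, not values from the paper.
ADC_MAX = 1023                 # assumed 10-bit ADC full scale
RANGE_MM = 1023.0              # measuring range reported for the test setup
DRIFT_MM_PER_C = 0.02          # hypothetical drift coefficient
T_REF_C = 20.0                 # reference calibration temperature

def displacement_mm(adc_counts: int, temperature_c: float) -> float:
    """Raw counts -> displacement, with a linear temperature correction."""
    raw_mm = adc_counts * RANGE_MM / ADC_MAX
    return raw_mm - DRIFT_MM_PER_C * (temperature_c - T_REF_C)

print(displacement_mm(512, 27.5))      # example reading on a warm afternoon
```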

  3. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is further demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
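    A minimal sketch of the Monte Carlo idea behind an integrated process model is given below: random variation in process parameters is propagated through two stacked unit operations and the probability that the final CQA falls out of specification is estimated. The transfer functions, variances, and limits are invented.

```python
# Minimal sketch of the Monte Carlo idea behind an integrated process model:
# propagate random variation in process parameters through two stacked unit
# operations and estimate the probability that the final CQA falls out of
# specification. Transfer functions, variances, and limits are invented.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Unit operation 1: fermentation titre depends on pH and temperature
ph = rng.normal(7.0, 0.05, n)
temp = rng.normal(37.0, 0.3, n)
titre = 5.0 - 4.0 * (ph - 7.0) ** 2 - 0.1 * (temp - 37.0)

# Unit operation 2: chromatography purity depends on load (titre) and flow rate
flow = rng.normal(1.0, 0.10, n)
purity = 98.0 + 0.4 * (titre - 5.0) - 2.0 * (flow - 1.0)

spec_low = 97.5                                   # hypothetical CQA limit (%)
p_oos = np.mean(purity < spec_low)
print(f"estimated OOS probability: {p_oos:.4f}")
```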

  4. Laser beam welding quality monitoring system based on high-speed (10 kHz) uncooled MWIR imaging sensors

    NASA Astrophysics Data System (ADS)

    Linares, Rodrigo; Vergara, German; Gutiérrez, Raúl; Fernández, Carlos; Villamayor, Víctor; Gómez, Luis; González-Camino, Maria; Baldasano, Arturo; Castro, G.; Arias, R.; Lapido, Y.; Rodríguez, J.; Romero, Pablo

    2015-05-01

    The combination of flexibility, productivity, precision, and zero-defect manufacturing in future laser-based equipment is a major challenge facing this enabling technology. New sensors for online monitoring and real-time control of laser-based processes are necessary for improving product quality and increasing manufacturing yields. New approaches to fully automate processes towards zero-defect manufacturing demand smarter heads in which lasers, optics, actuators, sensors, and electronics are integrated in a single compact and affordable device. Many defects arising in laser-based manufacturing processes come from instabilities in the dynamics of the laser process. Temperature and heat dynamics are key parameters to be monitored. Low-cost infrared imagers with a high speed of response will constitute the next generation of sensors to be implemented in future monitoring and control systems for laser-based processes, capable of providing simultaneous information about heat dynamics and spatial distribution. This work describes the results of using an innovative low-cost, high-speed infrared imager based on the first quantum infrared imager monolithically integrated with a Si-CMOS ROIC on the market. The sensor provides low-resolution images at frame rates of up to 10 kHz in uncooled operation at the same cost as traditional infrared spot detectors. To demonstrate the capabilities of the new sensor technology, a low-cost camera was assembled on a standard production laser welding head, allowing melt-pool images to be recorded at frame rates of 10 kHz. In addition, specific software was developed for defect detection and classification. Multiple laser welding processes were recorded with the aim of studying the performance of the system and its application to the real-time monitoring of laser welding processes. During the experiments, different types of defects were produced and monitored, and the classifier was fed with the experimental images obtained. Self-learning strategies were implemented with very promising results, demonstrating the feasibility of using low-cost high-speed infrared imagers in advancing towards real-time, in-line, zero-defect production systems.
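    The features and training data of the defect classifier are not public. As a stand-in, the sketch below computes simple per-frame statistics (mean intensity and hot-area fraction) on synthetic low-resolution thermal frames and flags outliers; it illustrates the kind of frame-level screening such a system performs, not the paper's classifier.

```python
# Illustrative per-frame screening of melt-pool thermal images: compute simple
# statistics on each frame and flag frames whose statistics deviate strongly
# from the batch mean. Thresholds and data are synthetic.

import numpy as np

def frame_features(frame: np.ndarray, hot_threshold: float) -> np.ndarray:
    """Mean intensity and fraction of 'hot' pixels for one low-res IR frame."""
    return np.array([frame.mean(), (frame > hot_threshold).mean()])

def flag_anomalies(frames, hot_threshold=0.7, n_sigma=3.0):
    """Flag frames whose features lie more than n_sigma from the batch mean."""
    feats = np.array([frame_features(f, hot_threshold) for f in frames])
    z = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)
    return np.where(np.abs(z).max(axis=1) > n_sigma)[0]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frames = rng.random((500, 32, 32)) * 0.5     # nominal frames
    frames[250] += 0.4                           # simulated hot spatter event
    print("flagged frame indices:", flag_anomalies(frames))
```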

  5. A Fully Redundant On-Line Mass Spectrometer System Used to Monitor Cryogenic Fuel Leaks on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Griffin, Timothy P.; Naylor, Guy R.; Haskell, William D.; Breznik, Greg S.; Mizell, Carolyn A.; Helms, William R.; Steinrock, T. (Technical Monitor)

    2001-01-01

    An on-line gas monitoring system was developed to replace the older systems used to monitor for cryogenic leaks on the Space Shuttles before launch. The system uses a mass spectrometer to monitor multiple locations in the process, which allows all gas constituents of interest to be monitored in a nearly simultaneous manner. The system is fully redundant and meets all requirements for ground support equipment (GSE), including the ruggedness to withstand launch on the Mobile Launcher Platform (MLP), ease of operation, and minimal operator intervention. The system can be fully automated so that an operator is notified when an unusual situation or fault is detected. User inputs are made through a personal computer using mouse and keyboard commands, and the graphical user interface is intuitive and easy to operate. The system has successfully supported four launches to date and is currently being permanently installed as the primary system for monitoring the Space Shuttles during ground processing and launch operations. Time and cost savings over the current systems will be substantial when it is fully implemented in the field. Tests were performed to demonstrate the performance of the system: low limits of detection coupled with small drift make it a major enhancement over the current systems. Though the system is currently optimized for detecting cryogenic leaks, many other gas constituents could be monitored using the Hazardous Gas Detection System (HGDS) 2000.

  6. Timing divided attention.

    PubMed

    Hogendoorn, Hinze; Carlson, Thomas A; VanRullen, Rufin; Verstraten, Frans A J

    2010-11-01

    Visual attention can be divided over multiple objects or locations. However, there is no single theoretical framework within which the effects of dividing attention can be interpreted. In order to develop such a model, here we manipulated the stage of visual processing at which attention was divided, while simultaneously probing the costs of dividing attention on two dimensions. We show that dividing attention incurs dissociable time and precision costs, which depend on whether attention is divided during monitoring or during access. Dividing attention during monitoring resulted in progressively delayed access to attended locations as additional locations were monitored, as well as a one-off precision cost. When dividing attention during access, time costs were systematically lower at one of the accessed locations than at the other, indicating that divided attention during access, in fact, involves rapid sequential allocation of undivided attention. We propose a model in which divided attention is understood as the simultaneous parallel preparation and subsequent sequential execution of multiple shifts of undivided attention. This interpretation has the potential to bring together diverse findings from both the divided-attention and saccade preparation literature and provides a framework within which to integrate the broad spectrum of divided-attention methodologies.

  7. Building a Predictive Capability for Decision-Making that Supports MultiPEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua Daniel

    Multi-phenomenological explosion monitoring (multiPEM) is a developing science that uses multiple geophysical signatures of explosions to better identify and characterize their sources. MultiPEM researchers seek to integrate explosion signatures together to provide stronger detection, parameter estimation, or screening capabilities between different sources or processes. This talk will address forming a predictive capability for screening waveform explosion signatures to support multiPEM.

  8. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics

    PubMed Central

    Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582

  9. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics.

    PubMed

    Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.

  10. Saltwater intrusion monitoring in Florida

    USGS Publications Warehouse

    Prinos, Scott T.

    2016-01-01

    Florida's communities are largely dependent on freshwater from groundwater aquifers. Existing saltwater in the aquifers, or seawater that intrudes parts of the aquifers that were fresh, can make the water unusable without additional processing. The quality of Florida's saltwater intrusion monitoring networks varies. In Miami-Dade and Broward Counties, for example, there is a well-designed network with recently constructed short open-interval monitoring wells that bracket the saltwater interface in the Biscayne aquifer. Geochemical analyses of water samples from the network help scientists evaluate pathways of saltwater intrusion and movement of the saltwater interface. Geophysical measurements, collected in these counties, aid the mapping of the saltwater interface and the design of monitoring networks. In comparison, deficiencies in the Collier County monitoring network include the positioning of monitoring wells, reliance on wells with long open intervals that when sampled might provide questionable results, and the inability of existing analyses to differentiate between multiple pathways of saltwater intrusion. A state-wide saltwater intrusion monitoring network is being planned; the planned network could improve saltwater intrusion monitoring by adopting the applicable strategies of the networks of Miami-Dade and Broward Counties, and by addressing deficiencies such as those described for the Collier County network.

  11. A wireless body measurement system to study fatigue in multiple sclerosis.

    PubMed

    Yu, Fei; Bilberg, Arne; Stenager, Egon; Rabotti, Chiara; Zhang, Bin; Mischi, Massimo

    2012-12-01

    Fatigue is reported as the most common symptom by patients with multiple sclerosis (MS). The physiological and functional parameters related to fatigue in MS patients are currently not well established. A new wearable wireless body measurement system, named the Fatigue Monitoring System (FAMOS), was developed to study fatigue in MS. It continuously measures the electrocardiogram, body-skin temperature, electromyogram, and foot motion. The goal of this study is to test the ability of the FAMOS to distinguish fatigued MS patients from healthy subjects. This paper presents the realization of the measurement system, including the design of both the hardware and the dedicated signal processing algorithms. Twenty-six participants, 17 MS patients with fatigue and 9 sex- and age-matched healthy controls, were included in the study for continuous 24 h monitoring. The preliminary results show significant differences between fatigued MS patients and healthy controls. In conclusion, the FAMOS enables continuous data acquisition and estimation of multiple physiological and functional parameters. It provides a new, flexible, and objective approach to studying fatigue in MS that can distinguish between fatigued MS patients and healthy controls. The usability and reliability of the FAMOS should, however, be further improved and validated through larger clinical trials.

  12. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, that creates and displays high signal-to-noise vegetation index imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad-pixel removal, multi-satellite data fusion, and temporal processing techniques create high-quality plots and animated image sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a wide range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
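    The TSPT's specific temporal-processing algorithms are not detailed in this record. The sketch below shows one common technique in the same spirit, maximum-value compositing of a noisy daily vegetation-index series to suppress cloud-induced dropouts; the series and the 16-day window are illustrative assumptions, not the tool's actual algorithm.

```python
# Simple maximum-value compositing of a noisy daily vegetation-index series,
# one common way to suppress cloud-induced dropouts before trend analysis.

import numpy as np

def max_value_composite(ndvi_daily: np.ndarray, window: int = 16) -> np.ndarray:
    """Collapse each `window`-day block to its maximum (clouds bias NDVI low)."""
    n_full = (len(ndvi_daily) // window) * window
    return ndvi_daily[:n_full].reshape(-1, window).max(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    days = np.arange(320)
    ndvi = 0.5 + 0.3 * np.sin(2 * np.pi * days / 320)        # seasonal greenness
    cloudy = rng.random(320) < 0.3                            # 30% cloudy days
    ndvi[cloudy] -= rng.uniform(0.2, 0.5, cloudy.sum())       # clouds depress NDVI
    print("16-day composites:", np.round(max_value_composite(ndvi), 2))
```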

  13. Energy index decomposition methodology at the plant level

    NASA Astrophysics Data System (ADS)

    Kumphai, Wisit

    Scope and method of study. The dissertation explores the use of a high-level energy intensity index as a facility-level energy performance monitoring indicator, with the goal of developing a methodology for an economically based energy performance monitoring system that incorporates production information. The performance measure closely monitors energy usage, production quantity, and product mix and determines production efficiency as part of an ongoing process that would enable facility managers to keep track of, and in the future predict, when to perform a recommissioning process. The study focuses on the index decomposition methodology and explores several high-level (industry, sector, and country level) energy utilization indexes, namely the Additive Log Mean Divisia, Multiplicative Log Mean Divisia, and Additive Refined Laspeyres indexes. One level of index decomposition is performed: the indexes are decomposed into intensity and product-mix effects. These indexes are tested on a flow-shop brick manufacturing plant model in three different climates in the United States. The resulting indexes are analyzed by fitting an ARIMA model and testing for dependency between the two decomposed indexes. Findings and conclusions. The results conclude that the Additive Refined Laspeyres index decomposition methodology is suitable for use in a flow-shop, non-air-conditioned production environment as an energy performance monitoring indicator. This research can likely be expanded further into predicting when to perform a recommissioning process.
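    For reference, the additive LMDI-I decomposition commonly used in this line of work splits a change in total energy use E = Σ_i Q·S_i·I_i into activity, product-mix, and intensity effects using the logarithmic-mean weight L(a, b) = (a − b)/(ln a − ln b). The sketch below implements that textbook form with invented two-product data and includes the activity term for completeness, whereas the dissertation reports only intensity and product-mix effects; it is not the dissertation's plant model.

```python
# Additive LMDI-I decomposition of a change in plant energy use into activity,
# product-mix, and intensity effects. Two-product numbers are invented.

import numpy as np

def logmean(a, b):
    """Logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.where(np.isclose(a, b), a, (a - b) / (np.log(a) - np.log(b)))

def lmdi_additive(q0, q1, s0, s1, i0, i1):
    """Decompose E1 - E0 where E = sum_i Q * S_i * I_i."""
    e0, e1 = q0 * s0 * i0, q1 * s1 * i1
    w = logmean(e1, e0)
    act = np.sum(w * np.log(q1 / q0))        # activity (total output) effect
    mix = np.sum(w * np.log(s1 / s0))        # product-mix effect
    inten = np.sum(w * np.log(i1 / i0))      # intensity effect
    return act, mix, inten, e1.sum() - e0.sum()

if __name__ == "__main__":
    q0, q1 = 1000.0, 1100.0                                  # total units produced
    s0, s1 = np.array([0.6, 0.4]), np.array([0.5, 0.5])      # product mix shares
    i0, i1 = np.array([2.0, 3.0]), np.array([1.9, 3.1])      # GJ per unit
    act, mix, inten, total = lmdi_additive(q0, q1, s0, s1, i0, i1)
    print(f"activity {act:.1f} + mix {mix:.1f} + intensity {inten:.1f} "
          f"= total change {total:.1f} GJ")
```

    A useful property of the LMDI-I form is that the three effects sum exactly to the observed change in energy use, which is why it is often preferred for monitoring applications.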

  14. The selected reaction monitoring/multiple reaction monitoring-based mass spectrometry approach for the accurate quantitation of proteins: clinical applications in the cardiovascular diseases.

    PubMed

    Gianazza, Erica; Tremoli, Elena; Banfi, Cristina

    2014-12-01

    Selected reaction monitoring, also known as multiple reaction monitoring, is a powerful targeted mass spectrometry approach for the confident quantitation of proteins and peptides in complex biological samples. In recent years, its optimization and application have become pivotal and of great interest in clinical research to derive useful outcomes for patient care. Thus, selected reaction monitoring/multiple reaction monitoring is now used as a highly sensitive and selective method for the evaluation of protein abundances and biomarker verification, with potential applications in medical screening. This review describes technical aspects of the development of a robust multiplex assay and discusses its recent applications in cardiovascular proteomics: verification of promising disease candidates to select only the highest-quality peptides/proteins for preclinical validation, as well as quantitation of protein isoforms and post-translational modifications.

  15. Production of erythrocytes from directly isolated or Delta1 Notch ligand expanded CD34+ hematopoietic progenitor cells: process characterization, monitoring and implications for manufacture.

    PubMed

    Glen, Katie E; Workman, Victoria L; Ahmed, Forhad; Ratcliffe, Elizabeth; Stacey, Adrian J; Thomas, Robert J

    2013-09-01

    Economic ex vivo manufacture of erythrocytes at 10^12 cell doses requires an efficiently controlled bioprocess capable of extensive proliferation and high terminal density. High-resolution characterization of the process would identify production strategies for increased efficiency, monitoring, and control. CD34+ cord blood cells, or equivalent cells that had been pre-expanded for 7 days with Delta1 Notch ligand, were placed in erythroid expansion and differentiation conditions in a micro-scale ambr suspension bioreactor. Multiple culture parameters were varied, and phenotype markers and metabolites were measured to identify conserved trends and robust monitoring markers. The cells exhibited a bi-modal erythroid differentiation pattern, with erythroid marker peaks after 2 weeks and 3 weeks of culture; differentiation was comparatively weighted toward the second peak in Delta1 pre-expanded cells. Both differentiation events were strengthened by omission of stem cell factor and dexamethasone. The cumulative cell proliferation and death, or directly measured CD45 expression, enabled monitoring of the proliferative rate of the cells. The metabolic activities of the cultures (glucose, glutamine, and ammonia consumption or production) were highly variable but exhibited systematic change synchronized with the change in differentiation state. Erythroid differentiation chronology is partly determined by the heterogeneous CD34+ progenitor compartment, with implications for input control; Delta1 ligand-mediated progenitor culture can alter the differentiation profile, with control benefits for engineering the production strategy. Differentiation-correlated changes in cytokine response, markers, and metabolic state will enable scientifically designed monitoring and timing of manufacturing process steps.

  16. Nanohole Arrays of Mixed Designs and Microwriting for Simultaneous and Multiple Protein Binding Studies

    PubMed Central

    Ji, Jin; Yang, Jiun-Chan; Larson, Dale N.

    2009-01-01

    We demonstrate using nanohole arrays of mixed designs and a microwriting process based on dip-pen nanolithography to monitor multiple, different protein binding events simultaneously in real time based on the intensity of Extraordinary Optical Transmission of nanohole arrays. The microwriting process and small footprint of the individual nanohole arrays enabled us to observe different binding events located only 16μm apart, achieving high spatial resolution. We also present a novel concept that incorporates nanohole arrays of different designs to improve confidence and accuracy of binding studies. For proof of concept, two types of nanohole arrays, designed to exhibit opposite responses to protein bindings, were fabricated on one transducer. Initial studies indicate that the mixed designs could help to screen out artifacts such as protein intrinsic signals, providing improved accuracy of binding interpretation. PMID:19297143

  17. A Self-Referenced Optical Intensity Sensor Network Using POFBGs for Biomedical Applications

    PubMed Central

    Moraleda, Alberto Tapetado; Montero, David Sánchez; Webb, David J.; García, Carmen Vázquez

    2014-01-01

    This work bridges the gap between the remote interrogation of multiple optical sensors and the advantages of using inherently biocompatible low-cost polymer optical fiber (POF)-based photonic sensing. A novel hybrid sensor network combining both silica fiber Bragg gratings (FBG) and polymer FBGs (POFBG) is analyzed. The topology is compatible with WDM networks so multiple remote sensors can be addressed providing high scalability. A central monitoring unit with virtual data processing is implemented, which could be remotely located up to units of km away. The feasibility of the proposed solution for potential medical environments and biomedical applications is shown. PMID:25615736

  18. A self-referenced optical intensity sensor network using POFBGs for biomedical applications.

    PubMed

    Tapetado Moraleda, Alberto; Sánchez Montero, David; Webb, David J; Vázquez García, Carmen

    2014-12-12

    This work bridges the gap between the remote interrogation of multiple optical sensors and the advantages of using inherently biocompatible low-cost polymer optical fiber (POF)-based photonic sensing. A novel hybrid sensor network combining both silica fiber Bragg gratings (FBG) and polymer FBGs (POFBG) is analyzed. The topology is compatible with WDM networks so multiple remote sensors can be addressed providing high scalability. A central monitoring unit with virtual data processing is implemented, which could be remotely located up to units of km away. The feasibility of the proposed solution for potential medical environments and biomedical applications is shown.

  19. Porous polycarbene-bearing membrane actuator for ultrasensitive weak-acid detection and real-time chemical reaction monitoring.

    PubMed

    Sun, Jian-Ke; Zhang, Weiyi; Guterman, Ryan; Lin, Hui-Juan; Yuan, Jiayin

    2018-04-30

    Soft actuators that combine ultrasensitivity with the capability to interact simultaneously with multiple stimuli throughout an entire event demand a high level of structural complexity, adaptability, and/or multi-responsiveness, which is a great challenge. Here, we develop a porous polycarbene-bearing membrane actuator built up from ionic complexation between a poly(ionic liquid) and trimesic acid (TA). The actuator features two concurrent structure gradients, i.e., an electrostatic complexation (EC) degree and a density distribution of a carbene-NH3 adduct (CNA) along the membrane cross-section. The membrane actuator exhibits the highest sensitivity among state-of-the-art soft proton actuators toward acetic acid at the 10^-6 mol L^-1 (M) level in aqueous media. Through competing actuation of the two gradients, it is capable of monitoring an entire proton-involved chemical reaction process comprising multiple stimuli and operational steps. The present achievement constitutes a significant step toward real-life application of soft actuators in chemical sensing and reaction technology.

  20. Developing Inventory and Monitoring Programs Based on Multiple Objectives

    Treesearch

    Daniel L. Schmoldt; David L. Peterson; David G. Silsbee

    1995-01-01

    Resource inventory and monitoring (I&M) programs in national parks combine multiple objectives in order to create a plan of action over a finite time horizon. Because all program activities are constrained by time and money, it is critical to plan I&M activities that make the best use of available agency resources. However, multiple objectives complicate a...

  1. Online Meta-data Collection and Monitoring Framework for the STAR Experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Betts, W.; Van Buren, G.

    2012-12-01

    The STAR Experiment further exploits scalable message-oriented model principles to achieve a high level of control over online data streams. In this paper we present an AMQP-powered Message Interface and Reliable Architecture framework (MIRA), which allows STAR to orchestrate the activities of Meta-data Collection, Monitoring, Online QA and several Run-Time and Data Acquisition system components in a very efficient manner. The very nature of the reliable message bus suggests parallel usage of multiple independent storage mechanisms for our meta-data. We describe our experience with a robust data-taking setup employing MySQL- and HyperTable-based archivers for meta-data processing. In addition, MIRA has an AJAX-enabled web GUI, which allows real-time visualisation of online process flow and detector subsystem states, and doubles as a sophisticated alarm system when combined with complex event processing engines like Esper, Borealis or Cayuga. The performance data and our planned path forward are based on our experience during the 2011-2012 running of STAR.
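    As a flavor of what a message-oriented meta-data collector looks like in code, here is a minimal AMQP consumer sketch using the pika Python client: it subscribes to a queue, decodes JSON messages, and hands each record to an archiver callback. The queue name, message schema, and broker location (a RabbitMQ broker on localhost) are assumptions for illustration; this is not the STAR MIRA implementation.

```python
# Minimal AMQP consumer sketch in the spirit of a message-oriented meta-data
# collector. Requires a running AMQP broker (e.g. RabbitMQ) on localhost.

import json
import pika

def archive(record: dict) -> None:
    """Stand-in archiver; a real deployment might write to MySQL or HyperTable."""
    print("archiving:", record.get("detector"), record.get("timestamp"))

def on_message(channel, method, properties, body):
    # Each message is assumed to be a JSON-encoded meta-data record.
    archive(json.loads(body))

def main():
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="run_metadata", durable=True)
    channel.basic_consume(queue="run_metadata",
                          on_message_callback=on_message,
                          auto_ack=True)
    channel.start_consuming()   # blocks, dispatching messages to on_message

if __name__ == "__main__":
    main()
```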

  2. Emerging methods for the study of coastal ecosystem landscape structure and change

    USGS Publications Warehouse

    Brock, John C.; Danielson, Jeffrey J.; Purkis, Sam

    2013-01-01

    Coastal landscapes are heterogeneous, dynamic, and evolve over a range of time scales due to intertwined climatic, geologic, hydrologic, biologic, and meteorological processes, and are also heavily impacted by human development, commercial activities, and resource extraction. A diversity of complex coastal systems around the globe, spanning glaciated shorelines to tropical atolls, wetlands, and barrier islands are responding to multiple human and natural drivers. Interdisciplinary research based on remote-sensing observations linked to process studies and models is required to understand coastal ecosystem landscape structure and change. Moreover, new techniques for coastal mapping and monitoring are increasingly serving the needs of policy-makers and resource managers across local, regional, and national scales. Emerging remote-sensing methods associated with a diversity of instruments and platforms are a key enabling element of integrated coastal ecosystem studies. These investigations require both targeted and synoptic mapping, and involve the monitoring of formative processes such as hydrodynamics, sediment transport, erosion, accretion, flooding, habitat modification, land-cover change, and biogeochemical fluxes.

  3. Fining of Red Wine Monitored by Multiple Light Scattering.

    PubMed

    Ferrentino, Giovanna; Ramezani, Mohsen; Morozova, Ksenia; Hafner, Daniela; Pedri, Ulrich; Pixner, Konrad; Scampicchio, Matteo

    2017-07-12

    This work describes a new approach based on multiple light scattering to study red wine clarification processes. The whole spectral signal (1933 backscattering points along the length of each sample vial) was fitted by a multivariate kinetic model built on a three-step mechanism comprising (1) adsorption of wine colloids to fining agents, (2) aggregation into larger particles, and (3) sedimentation. Each step is characterized by a reaction rate constant. According to the first reaction, the results showed that gelatin was the most efficient fining agent with respect to the main objective, the clarification of the wine and the consequent increase in its limpidity. This trend was also discussed in relation to the results obtained by nephelometry, total phenols, ζ-potential, color, sensory, and electronic-nose analyses. Higher concentrations of the fining agent (from 5 to 30 g/100 L) or higher temperatures (from 10 to 20 °C) also sped up the process. Finally, the advantage of using the whole spectral signal over classical univariate approaches was demonstrated by comparing the uncertainty associated with the rate constants of the proposed kinetic model. Overall, the multiple light scattering technique showed great potential for studying fining processes compared with classical univariate approaches.
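    The fitted multivariate kinetic model itself is not reproduced in this record. As a schematic stand-in, the sketch below integrates a first-order three-step chain (adsorption, aggregation, sedimentation) and reports the suspended fraction as a proxy for the backscattering signal; the first-order form and the rate constants are illustrative assumptions.

```python
# Schematic three-step fining kinetics integrated with scipy: free colloids are
# adsorbed to the fining agent (k1), adsorbed complexes aggregate (k2), and
# aggregates sediment out of suspension (k3). Rate constants are invented.

import numpy as np
from scipy.integrate import solve_ivp

def fining_ode(t, y, k1, k2, k3):
    colloid, adsorbed, aggregate, sediment = y
    return [-k1 * colloid,                       # (1) adsorption to fining agent
            k1 * colloid - k2 * adsorbed,        # (2) aggregation into particles
            k2 * adsorbed - k3 * aggregate,      # (3) sedimentation
            k3 * aggregate]

if __name__ == "__main__":
    k1, k2, k3 = 0.8, 0.3, 0.1                   # 1/h, invented rate constants
    sol = solve_ivp(fining_ode, (0, 48), [1.0, 0.0, 0.0, 0.0],
                    args=(k1, k2, k3), t_eval=np.linspace(0, 48, 7))
    suspended = sol.y[:3].sum(axis=0)            # proxy for backscattering signal
    for t, s in zip(sol.t, suspended):
        print(f"t = {t:4.0f} h   suspended fraction = {s:.3f}")
```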

  4. The Effects of Acute Dopamine Precursor Depletion on the Cognitive Control Functions of Performance Monitoring and Conflict Processing: An Event-Related Potential (ERP) Study.

    PubMed

    Larson, Michael J; Clayson, Peter E; Primosch, Mark; Leyton, Marco; Steffensen, Scott C

    2015-01-01

    Studies using medications and psychiatric populations implicate dopamine in cognitive control and performance monitoring processes. However, side effects associated with medication or studying psychiatric groups may confound the relationship between dopamine and cognitive control. To circumvent such possibilities, we utilized a randomized, double-blind, placebo-controlled, within-subjects design wherein participants were administered a nutritionally-balanced amino acid mixture (BAL) and an amino acid mixture deficient in the dopamine precursors tyrosine (TYR) and phenylalanine (PHE) on two separate occasions. Order of sessions was randomly assigned. Cognitive control and performance monitoring were assessed using response times (RT), error rates, the N450, an event-related potential (ERP) index of conflict monitoring, the conflict slow potential (conflict SP), an ERP index of conflict resolution, and the error-related negativity (ERN) and error positivity (Pe), ERPs associated with performance monitoring. Participants were twelve males who completed a Stroop color-word task while ERPs were collected four hours following acute PHE and TYR depletion (APTD) or balanced (BAL) mixture ingestion in two separate sessions. N450 and conflict SP ERP amplitudes significantly differentiated congruent from incongruent trials, but did not differ as a function of APTD or BAL mixture ingestion. Similarly, ERN and Pe amplitudes showed significant differences between error and correct trials that were not different between APTD and BAL conditions. Findings indicate that acute dopamine precursor depletion does not significantly alter cognitive control and performance monitoring ERPs. Current results do not preclude the role of dopamine in these processes, but suggest that multiple methods for dopamine-related hypothesis testing are needed.

  5. Use of MIDI-fatty acid methyl ester analysis to monitor the transmission of Campylobacter during commercial poultry processing.

    PubMed

    Hinton, Arthur; Cason, J A; Hume, Michael E; Ingram, Kimberly D

    2004-08-01

    The presence of Campylobacter spp. on broiler carcasses and in scald water taken from a commercial poultry processing facility was monitored on a monthly basis from January through June. Campylobacter agar (Blaser) was used to enumerate Campylobacter in water samples from a multiple-tank scalder; on prescalded, picked, eviscerated, and chilled carcasses; and on processed carcasses stored at 4 °C for 7 or 14 days. The MIDI Sherlock microbial identification system was used to identify Campylobacter-like isolates based on the fatty acid methyl ester profile of the bacteria. The dendrogram program of the Sherlock microbial identification system was used to compare the fatty acid methyl ester profiles of the bacteria and determine the degree of relatedness between the isolates. Findings indicated that no Campylobacter were recovered from carcasses or scald tank water samples collected in January or February, but the pathogen was recovered from samples collected in March, April, May, and June. Processing generally produced a significant (P < 0.05) decrease in the number of Campylobacter recovered from broiler carcasses, and the number of Campylobacter recovered from refrigerated carcasses generally decreased during storage. Significantly (P < 0.05) fewer Campylobacter were recovered from the final tank of the multiple-tank scald system than from the first tank. MIDI similarity index values ranged from 0.104 to 0.928 based on MIDI-fatty acid methyl ester analysis of Campylobacter jejuni and Campylobacter coli isolates. Dendrograms of the fatty acid methyl ester profiles of the isolates indicated that poultry flocks may introduce several strains of C. jejuni and C. coli into processing plants. Different populations of the pathogen may be carried into the processing plant by successive broiler flocks, and the same Campylobacter strain may be recovered from different poultry processing operations. However, Campylobacter apparently is unable to colonize equipment in the processing facility and contaminate broilers from flocks processed at later dates in the facility.

  6. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple-frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver of the 2D volume integral equation for the forward computation. The inversion technique combines the efficient FFT algorithm, which speeds up the matrix-vector multiplication, with the stable convergence of the simultaneous multiple-frequency CSI in the iteration process. As a result, the method is capable of effective quantitative conductivity image reconstruction for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples demonstrate the effectiveness and capacity of the simultaneous multiple-frequency CSI method for a limited array view in VEP.

  7. Age-Related Differences in Multiple Task Monitoring

    PubMed Central

    Todorov, Ivo; Del Missier, Fabio; Mäntylä, Timo

    2014-01-01

    Coordinating multiple tasks with narrow deadlines is particularly challenging for older adults because of age related decline in cognitive control functions. We tested the hypothesis that multiple task performance reflects age- and gender-related differences in executive functioning and spatial ability. Young and older adults completed a multitasking session with four monitoring tasks as well as separate tasks measuring executive functioning and spatial ability. For both age groups, men exceeded women in multitasking, measured as monitoring accuracy. Individual differences in executive functioning and spatial ability were independent predictors of young adults' monitoring accuracy, but only spatial ability was related to sex differences. For older adults, age and executive functioning, but not spatial ability, predicted multitasking performance. These results suggest that executive functions contribute to multiple task performance across the adult life span and that reliance on spatial skills for coordinating deadlines is modulated by age. PMID:25215609

  8. Age-related differences in multiple task monitoring.

    PubMed

    Todorov, Ivo; Del Missier, Fabio; Mäntylä, Timo

    2014-01-01

    Coordinating multiple tasks with narrow deadlines is particularly challenging for older adults because of age related decline in cognitive control functions. We tested the hypothesis that multiple task performance reflects age- and gender-related differences in executive functioning and spatial ability. Young and older adults completed a multitasking session with four monitoring tasks as well as separate tasks measuring executive functioning and spatial ability. For both age groups, men exceeded women in multitasking, measured as monitoring accuracy. Individual differences in executive functioning and spatial ability were independent predictors of young adults' monitoring accuracy, but only spatial ability was related to sex differences. For older adults, age and executive functioning, but not spatial ability, predicted multitasking performance. These results suggest that executive functions contribute to multiple task performance across the adult life span and that reliance on spatial skills for coordinating deadlines is modulated by age.

  9. Multicriteria relocation analysis of an off-site radioactive monitoring network for a nuclear power plant.

    PubMed

    Chang, Ni-Bin; Ning, Shu-Kuang; Chen, Jen-Chang

    2006-08-01

    Due to increasing environmental consciousness in most countries, every utility that owns a commercial nuclear power plant has been required since the 1980s to have both an on-site and an off-site emergency response plan. A radiation monitoring network, viewed as part of the emergency response plan, can provide information on the radiation dose emitted from a nuclear power plant during regular operation and/or abnormal measurements in an emergency event. Such monitoring information helps field operators and decision-makers respond accurately or make decisions that protect public health and safety. This study conducts an integrated simulation and optimization analysis to find a relocation strategy for the long-term, routine off-site monitoring network of a nuclear power plant. The planning goal is to downsize the current monitoring network while maintaining as much of its monitoring capacity as possible. The monitoring sensors considered include thermoluminescence dosimeters (TLD) and air sampling systems (AP) simultaneously. The network is designed to detect the accumulated radionuclide concentration, the frequency of violations, and the population possibly affected by long-term impact in the surrounding area on a regular basis, while it can also be used in an accidental release event. With the aid of the calibrated Industrial Source Complex-Plume Rise Model Enhancements (ISC-PRIME) simulation model, used to track possible radionuclide diffusion, dispersion, transport, and transformation processes in the atmospheric environment, a multiobjective evaluation process can be applied to screen monitoring stations for the nuclear power plant located on the Hengchun Peninsula, South Taiwan. To account for multiple objectives, this study calculated preference weights to linearly combine the objective functions, leading to decision-making with exposure assessment in an optimization context. The final suggestions should be useful for narrowing the set of scenarios that decision-makers need to consider in this relocation process.
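    The study's objective functions and preference weights are not given in this record. The following sketch only illustrates the weighted-sum screening idea: each candidate station gets normalized scores on several objectives, preference weights combine them linearly, and the top-ranked stations are retained. Station names, scores, weights, and the retained network size are all invented.

```python
# Illustrative weighted-sum screening of candidate monitoring stations.

import numpy as np

stations = ["A", "B", "C", "D", "E"]
# rows: stations; columns: dose coverage, violation detection, population served
scores = np.array([[0.9, 0.4, 0.7],
                   [0.6, 0.8, 0.5],
                   [0.3, 0.9, 0.9],
                   [0.8, 0.6, 0.2],
                   [0.5, 0.5, 0.6]])
weights = np.array([0.5, 0.3, 0.2])       # preference weights, sum to 1

composite = scores @ weights
keep = 3                                   # assumed downsized network size
for idx in np.argsort(composite)[::-1][:keep]:
    print(f"station {stations[idx]}: composite score {composite[idx]:.2f}")
```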

  10. Microprocessor Based Real-Time Monitoring of Multiple ECG Signals

    PubMed Central

    Nasipuri, M.; Basu, D.K.; Dattagupta, R.; Kundu, M.; Banerjee, S.

    1987-01-01

    A microprocessor based system capable of realtime monitoring of multiple ECG signals has been described. The system consists of a number of microprocessors connected in a hierarchical fashion and capable of working concurrently on ECG data collected from different channels. The system can monitor different arrhythmic abnormalities for at least 36 patients even for a heart rate of 500 beats/min.

  11. A data fusion approach for track monitoring from multiple in-service trains

    NASA Astrophysics Data System (ADS)

    Lederman, George; Chen, Siheng; Garrett, James H.; Kovačević, Jelena; Noh, Hae Young; Bielak, Jacobo

    2017-10-01

    We present a data fusion approach for enabling data-driven rail-infrastructure monitoring from multiple in-service trains. A number of researchers have proposed using vibration data collected from in-service trains as a low-cost method to monitor track geometry. The majority of this work has focused on developing novel features to extract information about the tracks from data produced by individual sensors on individual trains. We extend this work by presenting a technique to combine extracted features from multiple passes over the tracks from multiple sensors aboard multiple vehicles. There are a number of challenges in combining multiple data sources, like different relative position coordinates depending on the location of the sensor within the train. Furthermore, as the number of sensors increases, the likelihood that some will malfunction also increases. We use a two-step approach that first minimizes position offset errors through data alignment, then fuses the data with a novel adaptive Kalman filter that weights data according to its estimated reliability. We show the efficacy of this approach both through simulations and on a data-set collected from two instrumented trains operating over a one-year period. Combining data from numerous in-service trains allows for more continuous and more reliable data-driven monitoring than analyzing data from any one train alone; as the number of instrumented trains increases, the proposed fusion approach could facilitate track monitoring of entire rail-networks.
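    The paper's alignment procedure and adaptive weighting rule are not reproduced here. The sketch below shows the general idea on synthetic data: aligned roughness estimates from two passes are fused with a scalar Kalman update whose per-pass measurement variance is adapted to the recent innovation history, so a degraded sensor is automatically down-weighted.

```python
# Minimal sketch of fusing track-geometry estimates from several train passes
# with a scalar Kalman update whose measurement variance adapts to residuals.
# All numbers are synthetic; this is not the paper's adaptive filter.

import numpy as np

def fuse_passes(passes, q=1e-4, r0=0.05):
    """passes: array (n_passes, n_positions) of aligned roughness estimates."""
    n_passes, n = passes.shape
    x, p = passes[0, 0], 1.0                 # initial state and variance
    r = np.full(n_passes, r0)                # per-pass measurement variance
    fused = np.empty(n)
    for k in range(n):                       # iterate along track position
        p += q                               # process noise: geometry varies slowly
        for j in range(n_passes):            # sequential scalar updates
            innov = passes[j, k] - x
            r[j] = 0.95 * r[j] + 0.05 * innov ** 2   # adapt variance to residuals
            gain = p / (p + r[j])
            x += gain * innov
            p *= (1.0 - gain)
        fused[k] = x
    return fused

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    truth = np.cumsum(rng.normal(0, 0.01, 200))          # slowly varying geometry
    good = truth + rng.normal(0, 0.05, 200)              # reliable pass
    bad = truth + rng.normal(0, 0.50, 200)               # degraded sensor
    fused = fuse_passes(np.vstack([good, bad]))
    print("RMSE good pass:", np.sqrt(np.mean((good - truth) ** 2)).round(3))
    print("RMSE fused    :", np.sqrt(np.mean((fused - truth) ** 2)).round(3))
```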

  12. Hemodynamic changes in a rat parietal cortex after endothelin-1-induced middle cerebral artery occlusion monitored by optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, Jian; Ma, Yushu; Dou, Shidan; Wang, Yi; La, Dongsheng; Liu, Jianghong; Ma, Zhenhe

    2016-07-01

    A blockage of a cortical branch of the middle cerebral artery (MCA) seriously affects the blood supply of the cerebral cortex. Real-time monitoring of MCA hemodynamic parameters is critical for therapy and rehabilitation. Optical coherence tomography (OCT) is a powerful imaging modality that can produce not only structural images but also functional information on the tissue. We use OCT to detect hemodynamic changes after MCA branch occlusion. We injected a selected dose of endothelin-1 (ET-1) at a depth of 1 mm near the MCA and let the blood vessels undergo occlusion followed by slow reperfusion, simulating local cerebral ischemia as realistically as possible. During this period, we used optical microangiography and Doppler OCT to obtain multiple hemodynamic MCA parameters. The trends in these parameters before and after ET-1 injection clearly reflect the dynamic behavior of the MCA. These results reveal the mechanism of the cerebral ischemia-reperfusion process after a transient middle cerebral artery occlusion and confirm that OCT can be used to monitor hemodynamic parameters.

  13. [Input on monitoring and evaluation practices of government management of Brazilian Municipal Health Departments].

    PubMed

    Miranda, Alcides Silva de; Carvalho, André Luis Bonifácio de; Cavalcante, Caio Garcia Correia Sá

    2012-04-01

    What do the leaders of the Municipal Health Service (SMS) report and say about the systematic monitoring and evaluation of their own government management? The purpose of this paper is to provide input for the formulation of plausible hypotheses about such institutional processes and practices based on information produced in an exploratory study. This is a multiple case study with quantitative and qualitative analysis of answers to a semi-structured questionnaire given to government officials of a systematic sample of 577 Municipal Health Services (10.4% of the total in Brazil). They were selected and stratified by proportional distribution among states and by the population size of municipalities. In general, it shows that approximately half of the respondents use information from Health Monitoring Evaluations to orient decision-making, planning and other management approaches. This proportion tends to decrease in cities with smaller populations. There are specific and significant gaps in financial, personnel and crisis management. The evidence from the hypotheses highlights the fact that these processes are still at an early stage.

  14. Roadmap on semiconductor-cell biointerfaces

    NASA Astrophysics Data System (ADS)

    Tian, Bozhi; Xu, Shuai; Rogers, John A.; Cestellos-Blanco, Stefano; Yang, Peidong; Carvalho-de-Souza, João L.; Bezanilla, Francisco; Liu, Jia; Bao, Zhenan; Hjort, Martin; Cao, Yuhong; Melosh, Nicholas; Lanzani, Guglielmo; Benfenati, Fabio; Galli, Giulia; Gygi, Francois; Kautz, Rylan; Gorodetsky, Alon A.; Kim, Samuel S.; Lu, Timothy K.; Anikeeva, Polina; Cifra, Michal; Krivosudský, Ondrej; Havelka, Daniel; Jiang, Yuanwen

    2018-05-01

    This roadmap outlines the role semiconductor-based materials play in understanding the complex biophysical dynamics at multiple length scales, as well as the design and implementation of next-generation electronic, optoelectronic, and mechanical devices for biointerfaces. The roadmap emphasizes the advantages of semiconductor building blocks in interfacing, monitoring, and manipulating the activity of biological components, and discusses the possibility of using active semiconductor-cell interfaces for discovering new signaling processes in the biological world.

  15. Measurement accuracy of a stressed contact lens during its relaxation period

    NASA Astrophysics Data System (ADS)

    Compertore, David C.; Ignatovich, Filipp V.

    2018-02-01

    We examine the dioptric power and transmitted wavefront of a contact lens as it releases its handling stresses. Handling stresses are introduced as part of the contact lens loading process and are common across all contact lens measurement procedures and systems. The latest advances in vision correction require tighter quality control during the manufacturing of contact lenses. The optical power of a contact lens is one of its critical characteristics for users. Power measurements are conducted in the hydrated state, with the lens resting inside a solution-filled glass cuvette. In a typical approach, the contact lens must be subject to long settling times prior to any measurement, or, alternatively, multiple measurements must be averaged. Apart from the potential operator dependency of such an approach, it is extremely time-consuming and therefore precludes higher rates of testing. Comprehensive knowledge of the settling process can be obtained by monitoring multiple parameters of the lens simultaneously. We have developed a system that combines a co-aligned Shack-Hartmann transmitted wavefront sensor and a time-domain low-coherence interferometer to measure several optical and physical parameters (power, cylinder power, aberrations, center thickness, sagittal depth, and diameter) simultaneously. We monitor these parameters during the stress relaxation period and show correlations that manufacturers can use to devise improved quality control procedures.

  16. First results of the multi-purpose real-time processing video camera system on the Wendelstein 7-X stellarator and implications for future devices

    NASA Astrophysics Data System (ADS)

    Zoletnik, S.; Biedermann, C.; Cseh, G.; Kocsis, G.; König, R.; Szabolics, T.; Szepesi, T.; Wendelstein 7-X Team

    2018-01-01

    A special video camera has been developed for the 10-camera overview video system of the Wendelstein 7-X (W7-X) stellarator, considering multiple application needs and the limitations resulting from this complex long-pulse superconducting stellarator experiment. The event detection intelligent camera (EDICAM) uses a special 1.3 Mpixel CMOS sensor with non-destructive read capability, which enables fast monitoring of smaller regions of interest (ROIs) even during long exposures. The camera can perform simple data-evaluation algorithms (minimum/maximum, comparison of the mean to preset levels) on the ROI data, which can dynamically change the readout process and generate output signals. Multiple EDICAM cameras were operated in the first campaign of W7-X, and their capabilities were explored in the real environment. The data prove that the camera can take long-exposure (10-100 ms) overview images of the plasma while sub-ms monitoring, and even multi-camera correlated edge plasma turbulence measurements of smaller areas, are performed in parallel. The latter revealed that filamentary turbulence structures extend between neighboring modules of the stellarator. Considerations emerging for future upgrades of this system and for similar setups on future long-pulse fusion experiments such as ITER are discussed.
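    The camera's firmware is, of course, not public. The sketch below mimics the described ROI logic in Python: a mean-versus-level test on a small region of each frame raises an event flag and switches that ROI to a faster readout; the frame data, ROI geometry, threshold, and exposure values are invented.

```python
# Sketch of ROI-level event detection of the kind described for EDICAM:
# evaluate a simple statistic on a region of interest and, when it crosses a
# preset level, switch to a faster readout and raise an output flag.

import numpy as np

class RoiMonitor:
    def __init__(self, y0, y1, x0, x1, level, slow_ms=10.0, fast_ms=0.5):
        self.slice_ = (slice(y0, y1), slice(x0, x1))
        self.level = level
        self.slow_ms, self.fast_ms = slow_ms, fast_ms
        self.exposure_ms = slow_ms

    def process(self, frame: np.ndarray) -> bool:
        """Return True (event) if the ROI mean exceeds the level; adapt readout."""
        event = frame[self.slice_].mean() > self.level
        self.exposure_ms = self.fast_ms if event else self.slow_ms
        return event

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    monitor = RoiMonitor(100, 132, 200, 232, level=0.6)
    for i in range(5):
        frame = rng.random((1280, 1024)) * 0.5
        if i == 3:
            frame[100:132, 200:232] += 0.4          # simulated bright event
        event = monitor.process(frame)
        print(f"frame {i}: event={event}, exposure={monitor.exposure_ms} ms")
```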

  17. Auditory decision aiding in supervisory control of multiple unmanned aerial vehicles.

    PubMed

    Donmez, Birsen; Cummings, M L; Graham, Hudson D

    2009-10-01

    This article is an investigation of the effectiveness of sonifications, which are continuous auditory alerts mapped to the state of a monitored task, in supporting unmanned aerial vehicle (UAV) supervisory control. UAV supervisory control requires monitoring a UAV across multiple tasks (e.g., course maintenance) via a predominantly visual display, which currently is supported with discrete auditory alerts. Sonification has been shown to enhance monitoring performance in domains such as anesthesiology by allowing an operator to immediately determine an entity's (e.g., patient) current and projected states, and is a promising alternative to discrete alerts in UAV control. However, minimal research compares sonification to discrete alerts, and no research assesses the effectiveness of sonification for monitoring multiple entities (e.g., multiple UAVs). The authors conducted an experiment with 39 military personnel, using a simulated setup. Participants controlled single and multiple UAVs and received sonifications or discrete alerts based on UAV course deviations and late target arrivals. Regardless of the number of UAVs supervised, the course deviation sonification resulted in reactions to course deviations that were 1.9 s faster, a 19% enhancement, compared with discrete alerts. However, course deviation sonifications interfered with the effectiveness of discrete late arrival alerts in general and with operator responses to late arrivals when supervising multiple vehicles. Sonifications can outperform discrete alerts when designed to aid operators to predict future states of monitored tasks. However, sonifications may mask other auditory alerts and interfere with other monitoring tasks that require divided attention. This research has implications for supervisory control display design.

  18. Pilot educational program to enhance empowering patient education of school-age children with diabetes

    PubMed Central

    2013-01-01

    Background: Nurses have a crucial role in the patient education of children with type 1 diabetes, but they often exhibit a lack of knowledge of the patient education process. This study aimed to describe an educational program to enhance the empowering patient education process for the blood glucose monitoring education of school-age children, and nurses' perceptions of using empowering techniques. Methods: An empowering patient education process for the diabetes education of school-age children was developed. The researcher collected nurses' perceptions of managing the educational program through semi-structured interviews. Ten nurses carried out the diabetes education, and 8 of them participated in the interview. Three nurses implemented the diabetes education twice and were interviewed twice. The data consisted of 11 descriptions of the blood glucose monitoring education. The interviewer analyzed the data deductively and inductively by content analysis. Results: Nurses described successful management of the empowering patient education process. The needs assessment consisted of using multiple methods and clarifying the capabilities and challenges of the children and their parents. Planning manifested itself in adequate preparation and multiple objectives stated together with the family. Implementation comprised relevant content and the use of suitable teaching materials and methods. Evaluation was performed with various methods and documented accurately. Nurses also faced some challenges related to management and leadership, ambivalence between traditional and empowering patient education, and families' overall situations. Conclusion: An example of developing an evidence-based patient education program is presented, but besides education, other factors supporting changes in work practices should be considered in further development. PMID:23641969

  19. 40 CFR 75.82 - Monitoring of Hg mass emissions and heat input at common and multiple stacks.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... heat input at common and multiple stacks. 75.82 Section 75.82 Protection of Environment ENVIRONMENTAL... Provisions § 75.82 Monitoring of Hg mass emissions and heat input at common and multiple stacks. (a) Unit... systems and perform the Hg emission testing described under § 75.81(b). If reporting of the unit heat...

  20. Consolidation of cloud computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  1. Millimeter wave sensor for monitoring effluents

    DOEpatents

    Gopalsami, Nachappa; Bakhtiari, Sasan; Raptis, Apostolos C.; Dieckman, Stephen L.

    1995-01-01

    A millimeter-wave sensor for detecting and measuring effluents from processing plants either remotely or on-site includes a high frequency signal source for transmitting frequency-modulated continuous waves in the millimeter or submillimeter range with a wide sweep capability and a computer-controlled detector for detecting a plurality of species of effluents on a real time basis. A high resolution spectrum of an effluent, or effluents, is generated by a deconvolution of the measured spectra resulting in a narrowing of the line widths by 2 or 3 orders of magnitude as compared with the pressure broadened spectra detected at atmospheric pressure for improved spectral specificity and measurement sensitivity. The sensor is particularly adapted for remote monitoring such as where access is limited or sensor cost restricts multiple sensors as well as for large area monitoring under nearly all weather conditions.
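
    For illustration only, the following Python sketch shows generic frequency-domain deconvolution with Tikhonov-style regularization, the kind of line-narrowing step described above; the kernel shape, regularization constant, and synthetic spectrum are assumptions, not details from the patent.

```python
# Generic frequency-domain deconvolution with Tikhonov-style regularization,
# illustrating the line-narrowing idea; the kernel, regularization constant
# and the synthetic spectrum are assumptions, not details from the patent.
import numpy as np

def deconvolve_spectrum(measured, kernel, reg=1e-3):
    """Divide out a known broadening kernel in the Fourier domain,
    with regularization so detector noise is not amplified without bound."""
    n = len(measured)
    M = np.fft.rfft(measured, n)
    K = np.fft.rfft(kernel, n)
    D = M * np.conj(K) / (np.abs(K) ** 2 + reg)   # regularized inverse filter
    return np.fft.irfft(D, n)

# Synthetic two-line spectrum broadened by a pressure-type Lorentzian profile
x = np.linspace(0.0, 1.0, 2048)
truth = np.zeros_like(x)
truth[600], truth[660] = 1.0, 0.7
gamma = 0.01
lorentz = gamma / (np.pi * ((x - 0.5) ** 2 + gamma ** 2))
lorentz /= lorentz.sum()
measured = np.convolve(truth, lorentz, mode="same") + 1e-4 * np.random.randn(len(x))
recovered = deconvolve_spectrum(measured, np.fft.ifftshift(lorentz))
# `recovered` now shows two much narrower peaks near indices 600 and 660
```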

  2. Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera

    NASA Astrophysics Data System (ADS)

    Dziri, Aziz; Duranton, Marc; Chapuis, Roland

    2016-07-01

    Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.
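
    As a rough illustration of such a fixed-camera pipeline (not the authors' implementation), the following Python/OpenCV sketch detects moving objects by background subtraction and associates them to tracks by greedy nearest-neighbour matching; the camera index, thresholds, and association gate are illustrative, and occlusion handling is omitted.

```python
# Minimal fixed-camera tracking sketch: background subtraction for detection,
# greedy nearest-neighbour centroid matching for data association.
# Parameters are illustrative; occlusion handling and track deletion omitted.
import cv2
import numpy as np

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
tracks = {}          # track_id -> last known centroid (x, y)
next_id = 0
MAX_DIST = 50.0      # association gate in pixels (illustrative)

cap = cv2.VideoCapture(0)   # e.g. a RaspiCam exposed as a V4L2 device
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    detections = []
    for c in contours:
        if cv2.contourArea(c) < 300:          # drop small blobs
            continue
        x, y, w, h = cv2.boundingRect(c)
        detections.append((x + w / 2.0, y + h / 2.0))

    # Greedy nearest-neighbour association of detections to existing tracks
    unmatched = list(detections)
    for tid, (px, py) in list(tracks.items()):
        if not unmatched:
            break
        dists = [np.hypot(cx - px, cy - py) for cx, cy in unmatched]
        j = int(np.argmin(dists))
        if dists[j] < MAX_DIST:
            tracks[tid] = unmatched.pop(j)
    for cx, cy in unmatched:                  # start new tracks
        tracks[next_id] = (cx, cy)
        next_id += 1
```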

  3. Monitoring and predicting cognitive state and performance via physiological correlates of neuronal signals.

    PubMed

    Russo, Michael B; Stetz, Melba C; Thomas, Maria L

    2005-07-01

    Judgment, decision making, and situational awareness are higher-order mental abilities critically important to operational cognitive performance. Higher-order mental abilities rely on intact functioning of multiple brain regions, including the prefrontal, thalamus, and parietal areas. Real-time monitoring of individuals for cognitive performance capacity via an approach based on sampling multiple neurophysiologic signals and integrating those signals with performance prediction models potentially provides a method of supporting warfighters' and commanders' decision making and other operationally relevant mental processes and is consistent with the goals of augmented cognition. Cognitive neurophysiological assessments that directly measure brain function and subsequent cognition include positron emission tomography, functional magnetic resonance imaging, mass spectroscopy, near-infrared spectroscopy, magnetoencephalography, and electroencephalography (EEG); however, most direct measures are not practical to use in operational environments. More practical, albeit indirect measures that are generated by, but removed from the actual neural sources, are movement activity, oculometrics, heart rate, and voice stress signals. The goal of the papers in this section is to describe advances in selected direct and indirect cognitive neurophysiologic monitoring techniques as applied for the ultimate purpose of preventing operational performance failures. These papers present data acquired in a wide variety of environments, including laboratory, simulator, and clinical arenas. The papers discuss cognitive neurophysiologic measures such as digital signal processing wrist-mounted actigraphy; oculometrics including blinks, saccadic eye movements, pupillary movements, the pupil light reflex; and high-frequency EEG. These neurophysiological indices are related to cognitive performance as measured through standard test batteries and simulators with conditions including sleep loss, time on task, and aviation flight-induced fatigue.

  4. Data-based mechanistic modeling of dissolved organic carbon load through storms using continuous 15-minute resolution observations within UK upland watersheds

    NASA Astrophysics Data System (ADS)

    Jones, T.; Chappell, N. A.

    2013-12-01

    Few watershed modeling studies have addressed DOC dynamics through storm hydrographs (notable exceptions include Boyer et al., 1997 Hydrol Process; Jutras et al., 2011 Ecol Model; Xu et al., 2012 Water Resour Res). In part this has been a consequence of an incomplete understanding of the biogeochemical processes leading to DOC export to streams (Neff & Asner, 2001, Ecosystems) & an insufficient frequency of DOC monitoring to capture sometimes complex time-varying relationships between DOC & storm hydrographs (Kirchner et al., 2004, Hydrol Process). We present the results of a new & ongoing UK study that integrates two components - 1/ New observations of DOC concentrations (& derived load) continuously monitored at 15 minute intervals through multiple seasons for replicated watersheds; & 2/ A dynamic modeling technique that is able to quantify storage-decay effects, plus hysteretic, nonlinear, lagged & non-stationary relationships between DOC & controlling variables (including rainfall, streamflow, temperature & specific biogeochemical variables e.g., pH, nitrate). DOC concentration is being monitored continuously using the latest generation of UV spectrophotometers (i.e. S::CAN spectro::lysers) with in situ calibrations to laboratory analyzed DOC. The controlling variables are recorded simultaneously at the same stream stations. The watersheds selected for study are among the most intensively studied basins in the UK uplands, namely the Plynlimon & Llyn Brianne experimental basins. All contain areas of organic soils, with three having improved grasslands & three conifer afforested. The dynamic response characteristics (DRCs) that describe detailed DOC behaviour through sequences of storms are simulated using the latest identification routines for continuous time transfer function (CT-TF) models within the Matlab-based CAPTAIN toolbox (some incorporating nonlinear components). To our knowledge this is the first application of CT-TFs to modelling DOC processes. Furthermore this allows a data-based mechanistic (DBM) modelling philosophy to be followed where no assumptions about processes are defined a priori (given that dominant processes are often not known before analysis) & where the information contained in the time-series is used to identify multiple structures of models that are statistically robust. Within the final stage of DBM, biogeochemical & hydrological processes are interpreted from those models that are observable from the available stream time-series. We show that this approach can simulate the key features of DOC dynamics within & between storms & that some of the resultant response characteristics change with varying DOC processes in different seasons. Through the use of MISO (multiple-input single-output) models we demonstrate the relative importance of different variables (e.g., rainfall, temperature) in controlling DOC responses. The contrasting behaviour of the six experimental catchments is also reflected in differing response characteristics. These characteristics are shown to contribute to understanding of basin-integrated DOC export processes & to the ecosystem service impacts of DOC & color on commercial water treatment within the surrounding water supply basins.
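
    The study identifies continuous-time transfer functions with the CAPTAIN toolbox in Matlab; as a simplified stand-in, the Python sketch below fits a first-order discrete-time (ARX) transfer function between an input series and DOC by least squares. The 15-minute series and coefficients here are synthetic assumptions, not study data.

```python
# Simplified discrete-time stand-in for the transfer-function identification
# step: a first-order ARX model y[t] = a*y[t-1] + b*u[t-1] fitted by least
# squares. The input/output series below are synthetic assumptions.
import numpy as np

def fit_arx1(u, y):
    """Return (a, b, steady-state gain) for y[t] = a*y[t-1] + b*u[t-1]."""
    Y = y[1:]
    X = np.column_stack([y[:-1], u[:-1]])
    (a, b), *_ = np.linalg.lstsq(X, Y, rcond=None)
    return a, b, b / (1.0 - a)

# Synthetic rainfall-like input and a DOC-like response with known dynamics
rng = np.random.default_rng(0)
rain = rng.gamma(0.3, 2.0, size=5000)
doc = np.zeros_like(rain)
for k in range(1, len(rain)):
    doc[k] = 0.95 * doc[k - 1] + 0.4 * rain[k - 1] + 0.05 * rng.standard_normal()

a, b, gain = fit_arx1(rain, doc)
print(f"a = {a:.3f}, b = {b:.3f}, steady-state gain = {gain:.2f}")
```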

  5. Integrating SAR with Optical and Thermal Remote Sensing for Operational Near Real-Time Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Webley, P.; Dehn, J.; Arko, S. A.; McAlpin, D. B.

    2013-12-01

    Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing techniques have become established in operational forecasting, monitoring, and managing of volcanic hazards. Monitoring organizations, such as the Alaska Volcano Observatory (AVO), now rely heavily on remote sensing data from a variety of optical and thermal sensors to provide time-critical hazard information. Despite the high utilization of these remote sensing data to detect and monitor volcanic eruptions, the presence of clouds and a dependence on solar illumination often limit their impact on decision-making processes. Synthetic Aperture Radar (SAR) systems are widely believed to be superior to optical sensors in operational monitoring situations, due to the weather and illumination independence of their observations and the sensitivity of SAR to surface changes and deformation. Despite these benefits, the contributions of SAR to operational volcano monitoring have been limited in the past due to (1) high SAR data costs, (2) traditionally long data processing times, and (3) the low temporal sampling frequencies inherent to most SAR systems. In this study, we present improved data access, data processing, and data integration techniques that mitigate some of the above-mentioned limitations and allow, for the first time, a meaningful integration of SAR into operational volcano monitoring systems. We will introduce a new database interface that was developed in cooperation with the Alaska Satellite Facility (ASF) and allows for rapid and seamless data access to all of ASF's SAR data holdings. We will also present processing techniques that improve the temporal frequency with which hazard-related products can be produced. These techniques take advantage of modern signal processing technology as well as new radiometric normalization schemes, both enabling the combination of multiple observation geometries in change detection procedures. Additionally, it will be shown how SAR-based hazard information can be integrated with data from optical satellites, thermal sensors, webcams and models to create near-real-time volcano hazard information. We will introduce a prototype monitoring system that integrates SAR-based hazard information into the near real-time volcano hazard monitoring system of the Alaska Volcano Observatory. This prototype system was applied to historic eruptions of the volcanoes Okmok and Augustine, both located in the North Pacific. We will show that for these historic eruptions, the addition of SAR data led to a significant improvement in activity detection and eruption monitoring, and improved the accuracy and timeliness of eruption alerts.
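
    For illustration, a minimal log-ratio change detector between two co-registered, radiometrically normalized SAR amplitude images is sketched below in Python; it is a generic textbook approach, not the operational AVO/ASF processing chain, and the threshold and synthetic scene are assumptions.

```python
# Generic log-ratio change detection between two co-registered, radiometrically
# normalized SAR amplitude images (NumPy arrays `pre`, `post`).
import numpy as np

def log_ratio_change(pre, post, k=3.0, eps=1e-6):
    """Return a boolean change mask: pixels whose |log amplitude ratio|
    exceeds k robust standard deviations (MAD-based) of the scene."""
    lr = np.log((post + eps) / (pre + eps))
    med = np.median(lr)
    mad = np.median(np.abs(lr - med)) * 1.4826   # robust sigma estimate
    return np.abs(lr - med) > k * mad

# Example with synthetic amplitudes and a simulated localized surface change
rng = np.random.default_rng(1)
pre = rng.rayleigh(1.0, (512, 512))
post = pre * rng.lognormal(0.0, 0.05, pre.shape)
post[200:260, 200:260] *= 4.0
mask = log_ratio_change(pre, post)
print("changed pixels:", int(mask.sum()))
```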

  6. Fast algorithm for spectral processing with application to on-line welding quality assurance

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.

    2006-10-01

    A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate in real-time the plasma electronic temperature. The estimation of the electronic temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms will be analysed and compared, and it will be shown that the LPO (linear phase operator) sub-pixel algorithm is a better solution within the proposed system. Experimental tests during TIG-welding using a fibre optic to capture the arc light, together with a low cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
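
    As background to the temperature-estimation step, the following Python sketch shows a standard Boltzmann-plot calculation of electronic temperature from relative line intensities, assuming an optically thin plasma in local thermodynamic equilibrium; the line parameters are placeholders, not a line set recommended by the paper.

```python
# Standard Boltzmann-plot estimate of electronic temperature from relative
# emission-line intensities of one species, assuming an optically thin plasma
# in local thermodynamic equilibrium. Line data below are placeholders.
import numpy as np

k_B = 8.617333e-5  # Boltzmann constant, eV/K

def electron_temperature(intensity, wavelength_nm, g_upper, A_ul, E_upper_eV):
    """Fit ln(I*lambda/(g*A)) against E_upper; the slope is -1/(k_B*T_e)."""
    y = np.log(intensity * wavelength_nm / (g_upper * A_ul))
    slope, _ = np.polyfit(E_upper_eV, y, 1)
    return -1.0 / (k_B * slope)

# Placeholder line set (wavelength nm, degeneracy, Einstein A 1/s, upper level eV)
lam = np.array([696.5, 706.7, 714.7, 750.4])
g   = np.array([3.0, 5.0, 3.0, 1.0])
A   = np.array([6.4e6, 3.8e6, 6.3e5, 4.5e7])
E   = np.array([13.33, 13.30, 13.28, 13.48])

# Synthetic check: generate intensities for an assumed temperature and recover it
T_true = 11600.0  # K, roughly 1 eV
I_syn = (g * A / lam) * np.exp(-E / (k_B * T_true))
print(f"recovered T_e = {electron_temperature(I_syn, lam, g, A, E):.0f} K")
```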

  7. Automated detection of inaccurate and imprecise transitions in peptide quantification by multiple reaction monitoring mass spectrometry.

    PubMed

    Abbatiello, Susan E; Mani, D R; Keshishian, Hasmik; Carr, Steven A

    2010-02-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) of peptides with stable isotope-labeled internal standards (SISs) is increasingly being used to develop quantitative assays for proteins in complex biological matrices. These assays can be highly precise and quantitative, but the frequent occurrence of interferences requires that MRM-MS data be manually reviewed, a time-intensive process subject to human error. We developed an algorithm that identifies inaccurate transition data based on the presence of interfering signal or inconsistent recovery among replicate samples. The algorithm objectively evaluates MRM-MS data with 2 orthogonal approaches. First, it compares the relative product ion intensities of the analyte peptide to those of the SIS peptide and uses a t-test to determine if they are significantly different. A CV is then calculated from the ratio of the analyte peak area to the SIS peak area from the sample replicates. The algorithm identified problematic transitions and achieved accuracies of 94%-100%, with a sensitivity and specificity of 83%-100% for correct identification of errant transitions. The algorithm was robust when challenged with multiple types of interferences and problematic transitions. This algorithm for automated detection of inaccurate and imprecise transitions (AuDIT) in MRM-MS data reduces the time required for manual and subjective inspection of data, improves the overall accuracy of data analysis, and is easily implemented into the standard data-analysis work flow. AuDIT currently works with results exported from MRM-MS data-processing software packages and may be implemented as an analysis tool within such software.
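
    A minimal sketch of the two checks described above is given below in Python (it is not the published AuDIT code): a t-test comparing relative product-ion intensities of analyte versus SIS transitions, and the CV of the analyte/SIS peak-area ratio across replicates. The peak areas and thresholds are hypothetical.

```python
# Illustrative re-implementation of the two checks (not the published AuDIT
# code): a t-test on relative product-ion intensities (analyte vs. SIS) and
# the CV of the analyte/SIS peak-area ratio across replicates.
import numpy as np
from scipy import stats

def flag_transitions(analyte_areas, sis_areas, p_thresh=1e-5, cv_thresh=0.20):
    rel_analyte = analyte_areas / analyte_areas.sum(axis=1, keepdims=True)
    rel_sis = sis_areas / sis_areas.sum(axis=1, keepdims=True)
    results = {}
    for j in range(analyte_areas.shape[1]):
        _, p = stats.ttest_ind(rel_analyte[:, j], rel_sis[:, j])
        ratio = analyte_areas[:, j] / sis_areas[:, j]
        cv = ratio.std(ddof=1) / ratio.mean()
        results[j] = {"p": p, "cv": cv, "suspect": (p < p_thresh) or (cv > cv_thresh)}
    return results

# Rows = replicate injections, columns = transitions (hypothetical peak areas)
analyte = np.array([[1.0e5, 5.0e4, 2.0e4],
                    [1.1e5, 5.2e4, 6.0e4],    # third transition is inconsistent
                    [0.95e5, 4.8e4, 2.1e4]])
sis = np.array([[2.0e5, 1.0e5, 4.0e4],
                [2.1e5, 1.05e5, 4.2e4],
                [1.9e5, 0.95e5, 3.9e4]])
for j, r in flag_transitions(analyte, sis).items():
    print(f"transition {j}: p = {r['p']:.2g}, CV = {r['cv']:.2f}, suspect = {r['suspect']}")
```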

  8. Automated Detection of Inaccurate and Imprecise Transitions in Peptide Quantification by Multiple Reaction Monitoring Mass Spectrometry

    PubMed Central

    Abbatiello, Susan E.; Mani, D. R.; Keshishian, Hasmik; Carr, Steven A.

    2010-01-01

    BACKGROUND Multiple reaction monitoring mass spectrometry (MRM-MS) of peptides with stable isotope–labeled internal standards (SISs) is increasingly being used to develop quantitative assays for proteins in complex biological matrices. These assays can be highly precise and quantitative, but the frequent occurrence of interferences requires that MRM-MS data be manually reviewed, a time-intensive process subject to human error. We developed an algorithm that identifies inaccurate transition data based on the presence of interfering signal or inconsistent recovery among replicate samples. METHODS The algorithm objectively evaluates MRM-MS data with 2 orthogonal approaches. First, it compares the relative product ion intensities of the analyte peptide to those of the SIS peptide and uses a t-test to determine if they are significantly different. A CV is then calculated from the ratio of the analyte peak area to the SIS peak area from the sample replicates. RESULTS The algorithm identified problematic transitions and achieved accuracies of 94%–100%, with a sensitivity and specificity of 83%–100% for correct identification of errant transitions. The algorithm was robust when challenged with multiple types of interferences and problematic transitions. CONCLUSIONS This algorithm for automated detection of inaccurate and imprecise transitions (AuDIT) in MRM-MS data reduces the time required for manual and subjective inspection of data, improves the overall accuracy of data analysis, and is easily implemented into the standard data-analysis work flow. AuDIT currently works with results exported from MRM-MS data-processing software packages and may be implemented as an analysis tool within such software. PMID:20022980

  9. Embedded sensor systems for health - providing the tools in future healthcare.

    PubMed

    Lindén, Maria; Björkman, Mats

    2014-01-01

    Wearable, embedded sensor systems for health applications are foreseen as enablers of future healthcare. They will provide ubiquitous monitoring of multiple parameters without requiring the person to stay at home or in the hospital. By following trend changes in health status, early deterioration can be detected and treatment can start earlier. Preventive health care will also be supported. Such future healthcare requires technology development, including miniaturized sensors, smart textiles and wireless communication. The tremendous amount of data generated by these systems calls for both signal processing and decision support to guarantee the quality of data and avoid information overflow. Safe and secure communications have to protect the integrity of the persons monitored.

  10. Implementation of Multiple Host Nodes in Wireless Sensing Node Network System for Landslide Monitoring

    NASA Astrophysics Data System (ADS)

    Abas, Faizulsalihin bin; Takayama, Shigeru

    2015-02-01

    This paper proposes multiple host nodes in a Wireless Sensing Node Network System (WSNNS) for landslide monitoring. Because landslide disasters can easily damage the monitoring system itself, a major requirement in landslide monitoring is the flexibility and robustness of the system in evaluating the current situation in the monitored area. WSNNS can make an important contribution toward that aim. In this system, acceleration sensors and GPS are deployed in the sensing nodes. GPS location information enables the system to estimate the network topology and to determine node locations in an emergency by monitoring the node mode. The acceleration sensors enable the system to detect the slow mass movement that can lead to landslide occurrence. Once deployed, sensing nodes self-organize into an autonomous wireless ad hoc network. The measurement data from the sensing nodes are transmitted to the Host System via a host node and a "Cloud" System. Implementing multiple host nodes in the Local Sensing Node Network System (LSNNS) improves the risk management of the WSNNS for real-time monitoring of landslide disasters.

  11. Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Daniel G.

    In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design, the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different/finer granularity, facilitates monitoring of processes at the modular and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of SMR units and manages plant processes. The information processed at the supervisory level will provide operators with the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault-tolerance of the supervisory control architecture, the network that supports it, and the impact of fault-tolerance on multi-unit SMR plant control has been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, and data collection and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors. To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: I) development of a supervisory digital I&C system, II) fault-tolerance of the supervisory control architecture, and III) automated decision making and online monitoring.

  12. Imaging multiple sclerosis and other neurodegenerative diseases

    PubMed Central

    Inglese, Matilde; Petracca, Maria

    2013-01-01

    Although the prevalence of neurodegenerative diseases is increasing as a consequence of the growing aging population, the exact pathophysiological mechanisms leading to these diseases remain obscure. Multiple sclerosis (MS), an autoimmune disease of the central nervous system and the most frequent cause of disability among young people after traumatic brain injury, is characterized by inflammatory/demyelinating and neurodegenerative processes that occur earlier in life. The ability to make an early diagnosis of MS with the support of conventional MRI techniques provides the opportunity to study neurodegeneration and the underlying pathophysiological processes at earlier stages than in classical neurodegenerative diseases. This review summarizes mechanisms of neurodegeneration common to MS and to Alzheimer disease, Parkinson disease, and amyotrophic lateral sclerosis, and provides a brief overview of the neuroimaging studies employing MRI and PET techniques to investigate and monitor neurodegeneration in both MS and classical neurodegenerative diseases. PMID:23117868

  13. A Hybrid Generalized Hidden Markov Model-Based Condition Monitoring Approach for Rolling Bearings

    PubMed Central

    Liu, Jie; Hu, Youmin; Wu, Bo; Wang, Yan; Xie, Fengyun

    2017-01-01

    The operating condition of rolling bearings affects productivity and quality in rotating machinery. Developing an effective rolling bearing condition monitoring approach is critical to accurately identifying the operating condition. In this paper, a hybrid generalized hidden Markov model-based condition monitoring approach for rolling bearings is proposed, where interval-valued features are used to efficiently recognize and classify machine states during operation. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition (VMD). Parameters of the VMD, in the form of generalized intervals, provide a concise representation of aleatory and epistemic uncertainty and improve the robustness of identification. The multi-scale permutation entropy method is applied to extract state features from the decomposed signals in different operating conditions. Traditional principal component analysis is adopted to reduce feature size and computational cost. With the extracted features' information, the generalized hidden Markov model, based on generalized interval probability, is used to recognize and classify the fault types and fault severity levels. Finally, the experimental results show that the proposed method is effective at recognizing and classifying the fault types and fault severity levels of rolling bearings. This monitoring method is also efficient enough to quantify the two uncertainty components. PMID:28524088
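
    For illustration, the following Python sketch computes multi-scale permutation entropy, the feature extractor named above, for a one-dimensional signal; the parameters are illustrative, and the surrounding VMD and generalized-HMM stages are not implemented here.

```python
# Multi-scale permutation entropy of a 1-D signal (e.g. one decomposed
# vibration mode). Order, delay and scales are illustrative choices.
import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, m=4, tau=1):
    """Normalized permutation entropy of order m and delay tau (range 0..1)."""
    n = len(x) - (m - 1) * tau
    patterns = {p: 0 for p in permutations(range(m))}
    for i in range(n):
        window = x[i:i + m * tau:tau]
        patterns[tuple(np.argsort(window))] += 1      # ordinal pattern count
    p = np.array([c for c in patterns.values() if c > 0], dtype=float) / n
    return float(-(p * np.log(p)).sum() / np.log(factorial(m)))

def multiscale_pe(x, m=4, scales=(1, 2, 3, 4, 5)):
    """Coarse-grain the signal at each scale, then compute PE per scale."""
    feats = []
    for s in scales:
        cg = x[: len(x) // s * s].reshape(-1, s).mean(axis=1)
        feats.append(permutation_entropy(cg, m=m))
    return np.array(feats)

sig = np.sin(np.linspace(0, 200, 4000)) + 0.3 * np.random.randn(4000)
print(multiscale_pe(sig))
```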

  14. Multiple sensor multifrequency eddy current monitor for solidification and growth

    NASA Technical Reports Server (NTRS)

    Wallace, John

    1990-01-01

    A compact cylindrical multisensor eddy current measuring system with an integral furnace was developed to monitor II-VI crystal growth and provide interfacial information, solutal segregation, and conductivities of the growth materials. The use of an array of sensors surrounding the furnace element allows one to monitor the volume of interest. Coupling these data with inverse multifrequency analysis allows radial conductivity profiles to be generated at each sensor position. These outputs were incorporated to control the processes within the melt volume. The standard eddy current system functions with materials whose electrical conductivities are as low as 2E2 mhos/m. To extend the measurement range to poorly conducting media, the unit was modified to allow measurement of material conductivities four orders of magnitude lower, as well as bulk dielectric properties; typical examples include submicron-thick films and semi-insulating GaAs. This system was used to monitor complex heat transfer in grey bodies as well as semiconductor and metallic solidification.

  15. Real-time data processing for in-line monitoring of a pharmaceutical coating process by optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Markl, Daniel; Ziegler, Jakob; Hannesschläger, Günther; Sacher, Stephan; Buchsbaum, Andreas; Leitner, Michael; Khinast, Johannes G.

    2014-05-01

    Coating of tablets is a widely applied unit operation in the pharmaceutical industry. Thickness and uniformity of the coating layer are crucial for efficacy as well as for compliance. It is thus essential, and not only because of regulatory initiatives, to monitor and control the coating process in-line. Optical coherence tomography (OCT) has already been shown in previous work to be a suitable candidate for in-line monitoring of coating processes. However, to utilize the full potential of OCT technology, an automatic evaluation of the OCT measurements is essential. The automatic evaluation is currently implemented in MATLAB and includes several steps: (1) extraction of features of each A-scan, (2) classification of A-scan measurements based on their features, (3) detection of interfaces (air/coating and coating/tablet core), (4) correction of distortions due to the curvature of the bi-convex tablets and the oblique orientation of the tablets, and (5) determination of the coating thickness. The algorithm is tested on OCT data acquired by moving the sensor head of the OCT system across a static tablet bed. The coating thickness variations of single tablets (i.e., intra-tablet coating variability) can additionally be analyzed, as OCT allows the measurement of the coating thickness at multiple displaced positions on a single tablet. This information emphasizes the capability of OCT technology to improve process understanding and to assure high product quality.
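
    A minimal Python sketch of steps (3) and (5) for a single A-scan is shown below: the two strongest reflectivity peaks are taken as the air/coating and coating/tablet-core interfaces and their optical separation is converted to a physical thickness. The synthetic A-scan, pixel size, and refractive index are assumptions, not values from the study.

```python
# Interface detection and thickness estimation for one A-scan (sketch).
# Synthetic data; refractive index and axial pixel size are assumed values.
import numpy as np
from scipy.signal import find_peaks

def coating_thickness(a_scan, dz_um, n_coating=1.5):
    """a_scan: 1-D intensity profile; dz_um: axial pixel size in micrometres
    (optical path). Returns the coating thickness in micrometres."""
    peaks, props = find_peaks(a_scan, prominence=0.2 * a_scan.max())
    if len(peaks) < 2:
        return None
    top2 = peaks[np.argsort(props["prominences"])[-2:]]   # two strongest peaks
    z_air, z_core = np.sort(top2)                          # in depth order
    return (z_core - z_air) * dz_um / n_coating            # optical -> physical

# Synthetic A-scan with two interfaces about 30 optical pixels apart
z = np.arange(512)
a_scan = (np.exp(-0.5 * ((z - 100) / 2.0) ** 2)
          + 0.6 * np.exp(-0.5 * ((z - 130) / 2.0) ** 2)
          + 0.02 * np.random.rand(512))
print("coating thickness ~", coating_thickness(a_scan, dz_um=3.0), "um")
```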

  16. SHARD - a SeisComP3 module for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Weber, B.; Becker, J.; Ellguth, E.; Henneberger, R.; Herrnkind, S.; Roessler, D.

    2016-12-01

    Monitoring building and structure response to strong earthquake ground shaking or human-induced vibrations in real-time forms the backbone of modern structural health monitoring (SHM). Continuous data transmission, processing and analysis drastically reduces the time decision makers need to plan an appropriate response to possible damage to high-priority buildings and structures. SHARD is a web-browser-based module using the SeisComp3 framework to monitor the structural health of buildings and other structures by calculating standard engineering seismology parameters and checking their exceedance in real-time. Thresholds can be defined, e.g. compliant with national building codes (IBC2000, DIN4149 or EC8), for PGA/PGV/PGD, response spectra and drift ratios. If thresholds are exceeded, automatic or operator-driven reports are generated and sent to the decision makers. SHARD also determines waveform quality in terms of data delay and variance to report sensor status. SHARD is well suited for civil protection agencies to simultaneously monitor multiple city-wide critical infrastructures such as hospitals, schools and governmental buildings, and structures such as bridges, dams and power substations.
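
    As an illustration of the kind of parameters checked, the Python sketch below computes PGA, PGV and PGD from an acceleration record by simple trapezoidal integration and compares them against a threshold set; the thresholds are placeholders, not values taken from any building code or from SHARD itself.

```python
# Peak ground acceleration/velocity/displacement from an acceleration record,
# with a simple threshold check. Threshold values are placeholders.
import numpy as np

def peak_ground_motion(acc, dt):
    """acc: ground acceleration in m/s^2 sampled at dt seconds.
    Returns (PGA, PGV, PGD) using trapezoidal integration with a crude
    detrend/baseline correction to limit drift."""
    acc = acc - acc.mean()
    vel = np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) / 2.0) * dt))
    vel -= np.linspace(vel[0], vel[-1], len(vel))   # crude baseline correction
    disp = np.concatenate(([0.0], np.cumsum((vel[1:] + vel[:-1]) / 2.0) * dt))
    return np.abs(acc).max(), np.abs(vel).max(), np.abs(disp).max()

thresholds = {"PGA": 1.0, "PGV": 0.05, "PGD": 0.02}   # illustrative only

dt = 0.01
t = np.arange(0, 30, dt)
acc = 0.8 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)  # synthetic record
pga, pgv, pgd = peak_ground_motion(acc, dt)
for name, value in zip(("PGA", "PGV", "PGD"), (pga, pgv, pgd)):
    if value > thresholds[name]:
        print(f"ALERT: {name}={value:.3f} exceeds threshold {thresholds[name]}")
```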

  17. Understanding and monitoring the consequences of human impacts on intraspecific variation.

    PubMed

    Mimura, Makiko; Yahara, Tetsukazu; Faith, Daniel P; Vázquez-Domínguez, Ella; Colautti, Robert I; Araki, Hitoshi; Javadi, Firouzeh; Núñez-Farfán, Juan; Mori, Akira S; Zhou, Shiliang; Hollingsworth, Peter M; Neaves, Linda E; Fukano, Yuya; Smith, Gideon F; Sato, Yo-Ichiro; Tachida, Hidenori; Hendry, Andrew P

    2017-02-01

    Intraspecific variation is a major component of biodiversity, yet it has received relatively little attention from governmental and nongovernmental organizations, especially with regard to conservation plans and the management of wild species. This omission is ill-advised because phenotypic and genetic variations within and among populations can have dramatic effects on ecological and evolutionary processes, including responses to environmental change, the maintenance of species diversity, and ecological stability and resilience. At the same time, environmental changes associated with many human activities, such as land use and climate change, have dramatic and often negative impacts on intraspecific variation. We argue for the need for local, regional, and global programs to monitor intraspecific genetic variation. We suggest that such monitoring should include two main strategies: (i) intensive monitoring of multiple types of genetic variation in selected species and (ii) broad-brush modeling for representative species for predicting changes in variation as a function of changes in population size and range extent. Overall, we call for collaborative efforts to initiate the urgently needed monitoring of intraspecific variation.

  18. A Smartphone-Based Driver Safety Monitoring System Using Data Fusion

    PubMed Central

    Lee, Boon-Giin; Chung, Wan-Young

    2012-01-01

    This paper proposes a method for monitoring driver safety levels using a data fusion approach based on several discrete data types: eye features, bio-signal variation, in-vehicle temperature, and vehicle speed. The driver safety monitoring system was implemented as an application for an Android-based smartphone device, so that measuring safety-related data requires no extra monetary expenditure or equipment. Moreover, the system provides high resolution and flexibility. The safety monitoring process involves the fusion of attributes gathered from different sensors, including video, electrocardiography, photoplethysmography, temperature, and a three-axis accelerometer, that are assigned as input variables to an inference analysis framework. A Fuzzy Bayesian framework is designed to indicate the driver's capability level and is updated continuously in real-time. The sensory data are transmitted via Bluetooth communication to the smartphone device. A fake incoming call warning service alerts the driver if his or her safety level is suspiciously compromised. Realistic testing of the system demonstrates the practical benefits of multiple features and their fusion in providing more authentic and effective driver safety monitoring. PMID:23247416
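
    As a highly simplified stand-in for the fusion step (the paper uses a Fuzzy Bayesian framework), the Python sketch below combines independent per-sensor likelihoods of a "drowsy" versus "alert" state with a naive Bayesian update; the sensor labels and likelihood values are purely illustrative.

```python
# Naive Bayesian fusion of per-sensor likelihoods as a simplified stand-in
# for the Fuzzy Bayesian framework described above. All values illustrative.
def fuse_safety_level(prior_drowsy, likelihoods):
    """likelihoods: dict sensor -> (P(obs|drowsy), P(obs|alert))."""
    p_d, p_a = prior_drowsy, 1.0 - prior_drowsy
    for p_obs_d, p_obs_a in likelihoods.values():
        p_d *= p_obs_d
        p_a *= p_obs_a
    return p_d / (p_d + p_a)

obs = {
    "eye_closure":   (0.7, 0.2),    # long blink durations
    "heart_rate":    (0.6, 0.4),    # lowered HR variability
    "cabin_temp":    (0.55, 0.45),  # warm cabin
    "vehicle_speed": (0.5, 0.5),    # uninformative here
}
p = fuse_safety_level(prior_drowsy=0.1, likelihoods=obs)
print(f"P(drowsy | observations) = {p:.2f}")
if p > 0.5:
    print("trigger fake incoming-call warning")
```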

  19. High-speed measurements of steel-plate deformations during laser surface processing.

    PubMed

    Jezersek, Matija; Gruden, Valter; Mozina, Janez

    2004-10-04

    In this paper we present a novel approach to monitoring the deformations of a steel plate's surface during various types of laser processing, e.g., engraving, marking, cutting, bending, and welding. The measuring system is based on a laser triangulation principle, where the laser projector generates multiple lines simultaneously. This enables us to measure the shape of the surface with a high sampling rate (80 Hz with our camera) and high accuracy (±7 µm). The measurements of steel-plate deformations for plates of different thicknesses and with different illumination patterns are presented graphically and in an animation.

  20. Bacterial Serine/Threonine Protein Kinases in Host-Pathogen Interactions*

    PubMed Central

    Canova, Marc J.; Molle, Virginie

    2014-01-01

    In bacterial pathogenesis, monitoring and adapting to the dynamically changing environment in the host and an ability to disrupt host immune responses are critical. The virulence determinants of pathogenic bacteria include the sensor/signaling proteins of the serine/threonine protein kinase (STPK) family that have a dual role of sensing the environment and subverting specific host defense processes. STPKs can sense a wide range of signals and coordinate multiple cellular processes to mount an appropriate response. Here, we review some of the well studied bacterial STPKs that are essential virulence factors and that modify global host responses during infection. PMID:24554701

  1. Bacterial serine/threonine protein kinases in host-pathogen interactions.

    PubMed

    Canova, Marc J; Molle, Virginie

    2014-04-04

    In bacterial pathogenesis, monitoring and adapting to the dynamically changing environment in the host and an ability to disrupt host immune responses are critical. The virulence determinants of pathogenic bacteria include the sensor/signaling proteins of the serine/threonine protein kinase (STPK) family that have a dual role of sensing the environment and subverting specific host defense processes. STPKs can sense a wide range of signals and coordinate multiple cellular processes to mount an appropriate response. Here, we review some of the well studied bacterial STPKs that are essential virulence factors and that modify global host responses during infection.

  2. Evaluation of a newly developed mid-infrared sensor for real-time monitoring of yeast fermentations.

    PubMed

    Schalk, Robert; Geoerg, Daniel; Staubach, Jens; Raedle, Matthias; Methner, Frank-Juergen; Beuermann, Thomas

    2017-05-01

    A mid-infrared (MIR) sensor using the attenuated total reflection (ATR) technique has been developed for real-time monitoring in biotechnology. The MIR-ATR sensor consists of an IR emitter as light source, a zinc selenide ATR prism as boundary to the process, and four thermopile detectors, each equipped with an optical bandpass filter. The suitability of the sensor for practical application was tested during aerobic batch fermentations of Saccharomyces cerevisiae by simultaneous monitoring of glucose and ethanol. The performance of the sensor was compared to a commercial Fourier transform mid-infrared (FT-MIR) spectrometer by on-line measurements in a bypass loop. Sensor and spectrometer were calibrated by multiple linear regression (MLR) in order to link the measured absorbance in the transmission ranges of the four optical sensor channels to the analyte concentrations. For reference analysis, high-performance liquid chromatography (HPLC) was applied. Process monitoring using the sensor yielded standard errors of prediction (SEP) of 6.15 g/L for glucose and 1.36 g/L for ethanol; with the FT-MIR spectrometer, the corresponding SEP values were 4.34 g/L and 0.61 g/L, respectively. The advantages of optical multi-channel mid-infrared sensors in comparison to FT-MIR spectrometer setups are compactness, easy process implementation, and lower price. Copyright © 2016 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
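
    For illustration, the Python sketch below reproduces the calibration idea with synthetic data: a multiple linear regression maps the four channel absorbances to glucose and ethanol concentrations, and the standard error of prediction (SEP) is evaluated on held-out samples. All numbers are placeholders, not the sensor's calibration data.

```python
# MLR calibration of a four-channel absorbance sensor against two analyte
# concentrations, with SEP on held-out samples. Synthetic placeholder data.
import numpy as np

rng = np.random.default_rng(42)
A_train = rng.random((30, 4))                     # absorbance, 4 channels
true_B = np.array([[50.0, 5.0], [10.0, 30.0],     # hypothetical sensitivities
                   [5.0, 20.0], [2.0, 1.0]])
C_train = A_train @ true_B + rng.normal(0, 0.5, (30, 2))   # [glucose, ethanol] g/L

# MLR with intercept: C ~ [1, A] @ B
X = np.column_stack([np.ones(len(A_train)), A_train])
B, *_ = np.linalg.lstsq(X, C_train, rcond=None)

A_test = rng.random((10, 4))
C_test = A_test @ true_B + rng.normal(0, 0.5, (10, 2))
C_pred = np.column_stack([np.ones(len(A_test)), A_test]) @ B
sep = np.sqrt(((C_pred - C_test) ** 2).mean(axis=0))
print(f"SEP glucose = {sep[0]:.2f} g/L, SEP ethanol = {sep[1]:.2f} g/L")
```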

  3. Judgments of Learning are Influenced by Multiple Cues In Addition to Memory for Past Test Accuracy.

    PubMed

    Hertzog, Christopher; Hines, Jarrod C; Touron, Dayna R

    When people try to learn new information (e.g., in a school setting), they often have multiple opportunities to study the material. One of the most important things to know is whether people adjust their study behavior on the basis of past success so as to increase their overall level of learning (for example, by emphasizing information they have not yet learned). Monitoring their learning is a key part of being able to make those kinds of adjustments. We used a recognition memory task to replicate prior research showing that memory for past test outcomes influences later monitoring, as measured by judgments of learning (JOLs; confidence that the material has been learned), but also to show that subjective confidence in the test answer and the amount of time taken to restudy the items have independent effects on JOLs. We also show that there are individual differences in the effects of test accuracy and test confidence on JOLs, indicating that some but not all people use past test experiences to guide monitoring of their new learning. Monitoring learning is therefore a complex process of considering multiple cues, and some people attend to those cues more effectively than others. Improving the quality of monitoring performance and learning could lead to better study behaviors and better learning. An individual's memory of past test performance (MPT) is often cited as the primary cue for judgments of learning (JOLs) following test experience during multi-trial learning tasks (Finn & Metcalfe, 2007; 2008). We used an associative recognition task to evaluate MPT-related phenomena, because performance monitoring, as measured by recognition test confidence judgments (CJs), is fallible and varies in accuracy across persons. The current study used multilevel regression models to show the simultaneous and independent influences of multiple cues on Trial 2 JOLs, in addition to performance accuracy (the typical measure of MPT in cued-recall experiments). These cues include recognition CJs, perceived recognition fluency, and Trial 2 study time allocation (an index of reprocessing fluency). Our results expand the scope of MPT-related phenomena in recognition memory testing to show independent effects of recognition test accuracy and CJs on second-trial JOLs, while also demonstrating individual differences in the effects of these cues on JOLs (as manifested in significant random effects for those regression effects in the model). The effect of study time on second-trial JOLs, controlling for other variables including Trial 1 recognition memory accuracy, also demonstrates that second-trial encoding behavior influences JOLs in addition to MPT.

  4. MO-D-213-01: Workflow Monitoring for a High Volume Radiation Oncology Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laub, S; Dunn, M; Galbreath, G

    2015-06-15

    Purpose: Implement a center-wide communication system that increases interdepartmental transparency and accountability while decreasing redundant work and treatment delays by actively monitoring treatment planning workflow. Methods: Intake Management System (IMS), a program developed by ProCure Treatment Centers Inc., is a multi-function database that stores treatment planning process information. It was devised to work with the oncology information system (Mosaiq) to streamline interdepartmental workflow. Each step in the treatment planning process is visually represented and timelines for completion of individual tasks are established within the software. The currently active step of each patient's planning process is highlighted either red or green according to whether the initially allocated amount of time has passed for the given process. This information is displayed as a Treatment Planning Process Monitor (TPPM), which is shown on screens in the relevant departments throughout the center. This display also includes the individuals who are responsible for each task. IMS is driven by Mosaiq's quality checklist (QCL) functionality. Each step in the workflow is initiated by a Mosaiq user sending the responsible party a QCL assignment. IMS is connected to Mosaiq, and the sending or completing of a QCL updates the associated field in the TPPM to the appropriate status. Results: Approximately one patient a week is identified during the workflow process as needing to have his/her treatment start date modified or resources re-allocated to address the most urgent cases. Being able to identify a realistic timeline for planning each patient and having multiple departments communicate their limitations and time constraints allows quality plans to be developed and implemented without overburdening any one department. Conclusion: Monitoring the progression of the treatment planning process has increased transparency between departments, which enables efficient communication. Having built-in timelines allows easy prioritization of tasks and resources and facilitates effective time management.

  5. Understanding of safety monitoring in clinical trials by individuals with CF or their parents: A qualitative analysis.

    PubMed

    Kern-Goldberger, Andrew S; Hessels, Amanda J; Saiman, Lisa; Quittell, Lynne M

    2018-03-14

    Recruiting both pediatric and adult participants for clinical trials in CF is currently of paramount importance as numerous new therapies are being developed. However, recruitment is challenging as parents of children with CF and adults with CF cite safety concerns as a principal barrier to enrollment. In conjunction with the CF Foundation (CFF) Data Safety Monitoring Board (DSMB), a pilot brochure was developed to inform patients and parents of the multiple levels of safety monitoring; the CFF simultaneously created an infographic representing the safety monitoring process. This study explores the attitudes and beliefs of CF patients and families regarding safety monitoring and clinical trial participation, and elicits feedback regarding the educational materials. Semi-structured interviews were conducted using a pre-tested interview guide and audio-recorded during routine CF clinic visits. Participants included 5 parents of children with CF <16 years old; 5 adolescents and young adults with CF 16-21 years old; and 5 adults with CF ≥22 years old from pediatric and adult CF centers. The study team performed systematic text condensation analysis of the recorded interviews using an iterative process. Four major thematic categories with subthemes emerged as supported by exemplar quotations: attitudes toward clinical trials, safety values, conceptualizing the safety monitoring process, and priorities for delivery of patient education. Participant feedback was used to revise the pilot brochure; text was shortened, unfamiliar words clarified (e.g., "pipeline"), abbreviations eliminated, and redundancy avoided. Qualitative analysis of CF patient and family interviews provided insights into barriers to participation in clinical trials, safety concerns, perspectives on safety monitoring and educational priorities. We plan a multicenter study to determine if the revised brochure reduces knowledge, attitude and practice barriers regarding participation in CF clinical trials. Copyright © 2018 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  6. Study protocol for evaluating the implementation and effectiveness of an emergency department longitudinal patient monitoring system using a mixed-methods approach.

    PubMed

    Ward, Marie; McAuliffe, Eilish; Wakai, Abel; Geary, Una; Browne, John; Deasy, Conor; Schull, Michael; Boland, Fiona; McDaid, Fiona; Coughlan, Eoin; O'Sullivan, Ronan

    2017-01-23

    Early detection of patient deterioration is a key element of patient safety as it allows timely clinical intervention and potential rescue, thus reducing the risks of serious patient safety incidents. Longitudinal patient monitoring systems have been widely recommended for use to detect clinical deterioration. However, there is conflicting evidence on whether they improve patient outcomes. This may in part be related to variation in the rigour with which they are implemented and evaluated. This study aims to evaluate the implementation and effectiveness of a longitudinal patient monitoring system designed for adult patients in the unique environment of the Emergency Department (ED). A novel participatory action research (PAR) approach is taken where socio-technical systems (STS) theory and analysis informs the implementation through the improvement methodology of 'Plan Do Study Act' (PDSA) cycles. We hypothesise that conducting an STS analysis of the ED before beginning the PDSA cycles will provide for a much richer understanding of the current situation and possible challenges to implementing the ED-specific longitudinal patient monitoring system. This methodology will enable both a process and an outcome evaluation of implementing the ED-specific longitudinal patient monitoring system. Process evaluations can help distinguish between interventions that have inherent faults and those that are badly executed. Over 1.2 million patients attend EDs annually in Ireland; the successful implementation of an ED-specific longitudinal patient monitoring system has the potential to affect the care of a significant number of such patients. To the best of our knowledge, this is the first study combining PAR, STS and multiple PDSA cycles to evaluate the implementation of an ED-specific longitudinal patient monitoring system and to determine (through process and outcome evaluation) whether this system can significantly improve patient outcomes by early detection and appropriate intervention for patients at risk of clinical deterioration.

  7. Parallel excitation-emission multiplexed fluorescence lifetime confocal microscopy for live cell imaging.

    PubMed

    Zhao, Ming; Li, Yu; Peng, Leilei

    2014-05-05

    We present a novel excitation-emission multiplexed fluorescence lifetime microscopy (FLIM) method that surpasses current FLIM techniques in multiplexing capability. The method employs Fourier multiplexing to simultaneously acquire confocal fluorescence lifetime images of multiple excitation wavelength and emission color combinations at 44,000 pixels/sec. The system is built with low-cost CW laser sources and standard PMTs with versatile spectral configuration, which can be implemented as an add-on to commercial confocal microscopes. The Fourier lifetime confocal method allows fast multiplexed FLIM imaging, which makes it possible to monitor multiple biological processes in live cells. The low cost and compatibility with commercial systems could also make multiplexed FLIM more accessible to biological research community.
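
    The sketch below illustrates the general Fourier-demultiplexing idea with synthetic data: each excitation is modulated at its own frequency, and the phase lag recovered at that frequency yields a single-exponential phase lifetime (tau = tan(phi)/omega). The sampling rate, modulation frequencies and lifetimes are assumptions, not the parameters of the published system.

```python
# Generic frequency-domain demultiplexing of two modulated channels and
# phase-lifetime readout (tau = tan(phi) / omega). All parameters illustrative.
import numpy as np

fs = 200e6                              # detector sampling rate, Hz
t = np.arange(0, 2e-4, 1.0 / fs)        # 0.2 ms integration window
channels = {"ch_A": (40e6, 2.5e-9), "ch_B": (48e6, 4.0e-9)}  # (f_mod, lifetime)

# Simulate the summed detector signal: each channel's emission is delayed
# (phase-shifted) by its fluorescence lifetime.
signal = np.zeros_like(t)
for f_mod, tau in channels.values():
    phi = np.arctan(2 * np.pi * f_mod * tau)
    signal += 1.0 + np.cos(2 * np.pi * f_mod * t - phi)
signal += 0.05 * np.random.randn(len(t))

# Demultiplex: lock in on each modulation frequency and recover the phase lag
for name, (f_mod, tau_true) in channels.items():
    i_comp = 2 * np.mean(signal * np.cos(2 * np.pi * f_mod * t))
    q_comp = 2 * np.mean(signal * np.sin(2 * np.pi * f_mod * t))
    phi_est = np.arctan2(q_comp, i_comp)
    tau_est = np.tan(phi_est) / (2 * np.pi * f_mod)
    print(f"{name}: true tau = {tau_true*1e9:.1f} ns, estimated = {tau_est*1e9:.2f} ns")
```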

  8. Multiple reaction monitoring targeted LC-MS analysis of potential cell death marker proteins for increased bioprocess control.

    PubMed

    Albrecht, Simone; Kaisermayer, Christian; Reinhart, David; Ambrose, Monica; Kunert, Renate; Lindeberg, Anna; Bones, Jonathan

    2018-05-01

    The monitoring of protein biomarkers for the early prediction of cell stress and death is a valuable tool for process characterization and efficient biomanufacturing control. A representative set of six proteins, namely GPDH, PRDX1, LGALS1, CFL1, TAGLN2 and MDH, which were identified in a previous CHO-K1 cell death model using discovery LC-MSE, was translated into a targeted liquid chromatography multiple reaction monitoring mass spectrometry (LC-MRM-MS) platform and verified. The universality of the markers was confirmed in a cell growth model for which three Chinese hamster ovary host cell lines (CHO-K1, CHO-S, CHO-DG44) were grown in batch culture in two different types of basal media. LC-MRM-MS was also applied to spent media (n = 39) from four perfusion biomanufacturing series. Stable isotope-labelled peptide analogues and a stable isotope-labelled monoclonal antibody were used for improved protein quantitation and simultaneous monitoring of the workflow reproducibility. Significant increases in protein concentrations were observed for all viability marker proteins upon increased dead cell numbers and allowed for discrimination of spent media with dead cell densities below and above 1 × 10⁶ dead cells/mL, which highlights the potential of the selected viability marker proteins in bioprocess control. Graphical abstract Overview of the LC-MRM-MS workflow for the determination of proteomic markers in conditioned media from the bioreactor that correlate with CHO cell death.

  9. DEMONSTRATION OF LOW COST, LOW BURDEN EXPOSURE MONITORING STRATEGIES FOR USE IN LONGITUDINAL COHORT STUDIES

    EPA Science Inventory

    A large longitudinal cohort study designed to evaluate the association between children's exposures to environmental agents and health outcomes presents many challenges for exposure monitoring. Exposure of the child must be measured for multiple chemicals through multiple path...

  10. Active probing of cloud multiple scattering, optical depth, vertical thickness, and liquid water content using wide-angle imaging lidar

    NASA Astrophysics Data System (ADS)

    Love, Steven P.; Davis, Anthony B.; Rohde, Charles A.; Tellier, Larry; Ho, Cheng

    2002-09-01

    At most optical wavelengths, laser light in a cloud lidar experiment is not absorbed but merely scattered out of the beam, eventually escaping the cloud via multiple scattering. There is much information available in this light scattered far from the input beam, information ignored by traditional 'on-beam' lidar. Monitoring these off-beam returns in a fully space- and time-resolved manner is the essence of our unique instrument, Wide Angle Imaging Lidar (WAIL). In effect, WAIL produces wide-field (60-degree full-angle) 'movies' of the scattering process and records the cloud's radiative Green functions. A direct data product of WAIL is the distribution of photon path lengths resulting from multiple scattering in the cloud. Following insights from diffusion theory, we can use the measured Green functions to infer the physical thickness and optical depth of the cloud layer, and, from there, estimate the volume-averaged liquid water content. WAIL is notable in that it is applicable to optically thick clouds, a regime in which traditional lidar is reduced to ceilometry. Here we present recent WAIL data on various clouds and discuss the extension of WAIL to full diurnal monitoring by means of an ultra-narrow magneto-optic atomic line filter for daytime measurements.

  11. Strategic retrieval, confabulations, and delusions: theory and data.

    PubMed

    Gilboa, Asaf

    2010-01-01

    Based on Moscovitch and Winocur's "working with memory" framework, confabulation is described as a deficit in strategic retrieval processes. The present paper suggests that only a confluence of deficits in multiple memory-related processes leads to confabulation. These are divided into three categories. Core processes that are unique to confabulation and required for its evolution include: (1) an intuitive, rapid, preconscious "feeling of rightness" monitoring, (2) an elaborate conscious "editor" monitoring, and (3) control processes that mediate the decision whether to act upon a retrieved memory. The second category comprises deficits in constitutional processes, which are required for confabulation to occur but are not unique to it. These include the formation of erroneous memory representations, (temporal) context confusion, and deficits in retrieval cue generation. Finally, associated features of confabulations determine the content "flavour" and frequency of confabulation but are not required for their evolution. Some associated features are magnification of normal reconstructive memory processes, such as reliance on generic/schematic representations and positivity biases in memory, whereas others are abnormal, such as perseveration or source memory deficits. Data on deficits in core processes in confabulation are presented. Next, the apparent correspondences between confabulation and delusion are discussed. Considering confabulation within a strategic memory framework may help elucidate both the commonalities and differences between the two symptoms. Delusions are affected by a convergence of abnormal perception and encoding of information, associated with aberrant cognitive schema structure and disordered belief monitoring. Whereas confabulation is primarily a disorder of retrieval, mnemonic aspects of delusions can be described as primarily a disorder of input and integration of information. It is suggested that delusions might share some of the associated features of confabulation but not its core and constitutional processes. Preliminary data in support of this view are presented.

  12. Seeing beyond monitors-Critical care nurses' multiple skills in patient observation: Descriptive qualitative study.

    PubMed

    Alastalo, Mika; Salminen, Leena; Lakanmaa, Riitta-Liisa; Leino-Kilpi, Helena

    2017-10-01

    The aim of this study was to provide a comprehensive description of multiple skills in patient observation in critical care nursing. Data from semi-structured interviews were analysed using thematic analysis. Experienced critical care nurses (n=20) from three intensive care units in two university hospitals in Finland. Patient observation skills consist of: information gaining skills, information processing skills, decision-making skills and co-operation skills. The first three skills are integrated in the patient observation process, in which gaining information is a prerequisite for processing information that precedes making decisions. Co-operation has a special role as it occurs throughout the process. This study provided a comprehensive description of patient observation skills related to the three-phased patient observation process. The findings contribute to clarifying this part of the competence. The description of patient observation skills may be applied in both clinical practice and education as it may serve as a framework for orientation, ensuring clinical skills and designing learning environments. Based on this study, patient observation skills can be recommended to be included in critical care nursing education, orientation and as a part of critical care nurses' competence evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. The measurement of Protein Synthesis for Assessing Proteostasis in Studies of Slowed Aging

    PubMed Central

    Miller, Benjamin F.; Drake, Joshua C.; Naylor, Bradley; Price, John C.; Hamilton, Karyn L.

    2014-01-01

    Slowing the aging process can reduce the risk for multiple chronic diseases simultaneously. It is increasingly recognized that maintaining protein homeostasis (or proteostasis) is important for slowing the aging process. Since proteostasis is a dynamic process, monitoring it is not a simple task and requires use of appropriate methods. This review will introduce methods to assess protein and DNA synthesis using deuterium oxide (D2O), and how protein and DNA synthesis outcomes provide insight into proteostatic mechanisms. Finally, we provide a discussion on how these assessments of protein and DNA synthesis are “mechanistic” investigations and provide an appropriate framework for the further development of slowed aging treatments. PMID:25283966

  14. A boost of confidence: The role of the ventromedial prefrontal cortex in memory, decision-making, and schemas.

    PubMed

    Hebscher, Melissa; Gilboa, Asaf

    2016-09-01

    The ventromedial prefrontal cortex (vmPFC) has been implicated in a wide array of functions across multiple domains. In this review, we focus on the vmPFC's involvement in mediating strategic aspects of memory retrieval, memory-related schema functions, and decision-making. We suggest that vmPFC generates a confidence signal that informs decisions and memory-guided behaviour. Confidence is central to these seemingly diverse functions: (1) Strategic retrieval: lesions to the vmPFC impair an early, automatic, and intuitive monitoring process ("feeling of rightness"; FOR) often associated with confabulation (spontaneous reporting of erroneous memories). Critically, confabulators typically demonstrate high levels of confidence in their false memories, suggesting that faulty monitoring following vmPFC damage may lead to indiscriminate confidence signals. (2) Memory schemas: the vmPFC is critically involved in instantiating and maintaining contextually relevant schemas, broadly defined as higher level knowledge structures that encapsulate lower level representational elements. The correspondence between memory retrieval cues and these activated schemas leads to FOR monitoring. Stronger, more elaborate schemas produce stronger FOR and influence confidence in the veracity of memory candidates. (3) Finally, we review evidence on the vmPFC's role in decision-making, extending this role to decision-making during memory retrieval. During non-mnemonic and mnemonic decision-making the vmPFC automatically encodes confidence. Confidence signal in the vmPFC is revealed as a non-linear relationship between a first-order monitoring assessment and second-order action or choice. Attempting to integrate the multiple functions of the vmPFC, we propose a posterior-anterior organizational principle for this region. More posterior vmPFC regions are involved in earlier, automatic, subjective, and contextually sensitive functions, while more anterior regions are involved in controlled actions based on these earlier functions. Confidence signals reflect the non-linear relationship between first-order, posterior-mediated and second-order, anterior-mediated processes and are represented along the entire axis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Monitoring impacts of Tamarix leaf beetles (Diorhabda elongata) on the leaf phenology and water use of Tamarix spp. using ground and remote sensing methods

    NASA Astrophysics Data System (ADS)

    Nagler, P. L.; Brown, T.; Hultine, K. R.; van Riper, C.; Bean, D. A.; Murray, R.; Pearlstein, S.; Glenn, E. P.

    2010-12-01

    Tamarix leaf beetles (Diorhabda elongata) have been released in several locations on western U.S. rivers to control the introduced shrub, Tamarix ramosissima and related species. As they are expanding widely throughout the region, information is needed on their impact on Tamarix leaf phenology and water use over multiple cycles of annual defoliation. We used networked digital cameras (phenocams) and ground surveys to monitor the defoliation process from 2008-2010 at multiple sites on the Dolores River, and MODIS satellite imagery from 2000 to 2009 to monitor leaf phenology and evapotranspiration (ET) at beetle release sites on the Dolores, Lower Colorado, Carson, Walker and Bighorn Rivers. Enhanced Vegetation Index (EVI) values for selected MODIS pixels were used to estimate green foliage density before and after beetle releases at each site. EVI values were transformed into estimates of ET using an empirical algorithm relating ET to EVI and potential ET (ETo) at each site. Phenocam and ground observations show that beetle damage is temporary, and plants regenerate new leaves following an eight week defoliation period in summer. The original biocontrol model predicted that Tamarix mortality would reach 75-85% over several years of defoliation due to progressive weakening of the shrubs each year, but over the early stages of leaf beetle-Tamarix interactions studied here (3-8 years), our preliminary findings show actual reductions in EVI and ET of only 13-15% across sites due to the relatively brief period of defoliation and because not all plants at a site were defoliated. Also, baseline ET rates varied across sites but averaged only 329 mm yr-1 (23% of ETo), constraining the possibilities for water salvage through biocontrol of Tamarix. The spatial and temporal resolution of MODIS imagery was too coarse to capture the details of the defoliation process, and high-resolution imagery or expanded phenocam networks are needed for future monitoring programs.
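    The abstract names an empirical algorithm relating ET to EVI and ETo but does not give its functional form; the sketch below assumes a saturating-exponential scaling, and the coefficients a, b and c are placeholders rather than the published values.

      import numpy as np

      def et_from_evi(evi, eto, a=1.0, b=2.0, c=0.1):
          """Estimate ET (same units as eto) from MODIS EVI and potential ET.

          Assumed saturating-exponential form; a, b, c are illustrative and
          would normally be calibrated against ground ET measurements.
          """
          evi = np.asarray(evi, dtype=float)
          return eto * (a * (1.0 - np.exp(-b * evi)) - c)

      # Relative change in modelled ET for a modest post-defoliation EVI drop.
      et_pre = et_from_evi(0.30, eto=8.0)
      et_post = et_from_evi(0.26, eto=8.0)
      print(round(100 * (et_pre - et_post) / et_pre, 1), "% reduction")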

  16. Chuckwalla Valley multiple-well monitoring site, Chuckwalla Valley, Riverside County

    USGS Publications Warehouse

    Everett, Rhett

    2013-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Bureau of Land Management, is evaluating the geohydrology and water availability of the Chuckwalla Valley, California. As part of this evaluation, the USGS installed the Chuckwalla Valley multiple-well monitoring site (CWV1) in the southeastern portion of the Chuckwalla Basin. Data collected at this site provide information about the geology, hydrology, geophysics, and geochemistry of the local aquifer system, thus enhancing the understanding of the geohydrologic framework of the Chuckwalla Valley. This report presents construction information for the CWV1 multiple-well monitoring site and initial geohydrologic data collected from the site.

  17. Monitoring single protease activities on triple-helical collagen molecules

    NASA Astrophysics Data System (ADS)

    Harzar, Raj; Froberg, James; Srivastava, D. K.; Choi, Yongki

    Matrix metalloproteinases (MMPs), a particular family of proteases, play a pivotal role in degrading the extracellular matrix (ECM). It has been known for more than 40 years that MMPs are closely involved in multiple human cancers during cell growth, invasion, and metastasis. However, the mechanisms of MMP activity are far from being understood. Here, we monitored enzymatic processing by MMPs with two complementary approaches, atomic force microscopy (AFM) and nanocircuit measurements. AFM measurements demonstrated that incubation of collagen monomers with MMPs resulted in cleavage at a single position, producing 3/4 and 1/4 collagen fragments. Electronic monitoring with single-MMP nanocircuits captured single cleavage events at a rate of 0.012 Hz, which was in good agreement with fluorescence assay measurements. This work was supported financially by the NIGMS/NIH (P30GM103332-02) and ND NASA EPSCoR RID Grant.

  18. Design, development, and field demonstration of a remotely deployable water quality monitoring system

    NASA Technical Reports Server (NTRS)

    Wallace, J. W.; Lovelady, R. W.; Ferguson, R. L.

    1981-01-01

    A prototype water quality monitoring system is described which offers almost continuous in situ monitoring. The two-man portable system features: (1) a microprocessor controlled central processing unit which allows preprogrammed sampling schedules and reprogramming in situ; (2) a subsurface unit for multiple depth capability and security from vandalism; (3) an acoustic data link for communications between the subsurface unit and the surface control unit; (4) eight water quality parameter sensors; (5) a nonvolatile magnetic bubble memory which prevents data loss in the event of power interruption; (6) a rechargeable power supply sufficient for 2 weeks of unattended operation; (7) a water sampler which can collect samples for laboratory analysis; (8) data output in direct engineering units on printed tape or through a computer compatible link; (9) internal electronic calibration eliminating external sensor adjustment; and (10) acoustic location and recovery systems. Data obtained in Saginaw Bay, Lake Huron are tabulated.

  19. Remote Diagnosis of the International Space Station Utilizing Telemetry Data

    NASA Technical Reports Server (NTRS)

    Deb, Somnath; Ghoshal, Sudipto; Malepati, Venkat; Domagala, Chuck; Patterson-Hine, Ann; Alena, Richard; Norvig, Peter (Technical Monitor)

    2000-01-01

    Modern systems such as fly-by-wire aircraft, nuclear power plants, manufacturing facilities, and battlefields are all examples of highly connected, network-enabled systems. Many of these systems are also mission critical and need to be monitored round the clock. Such systems typically consist of embedded sensors in networked subsystems that can transmit data to central (or remote) monitoring stations. Moreover, many legacy and safety systems were not originally designed for real-time onboard diagnosis, but are critical and would benefit from such a solution. Embedding additional software or hardware in such systems is often considered too intrusive and introduces flight safety and validation concerns. Such systems can instead be equipped to transmit sensor data to a remote processing center for continuous health monitoring. At Qualtech Systems, we are developing a Remote Diagnosis Server (RDS) that can support multiple simultaneous diagnostic sessions from a variety of remote subsystems.

  20. A sensor data format incorporating battery charge information for smartphone-based mHealth applications

    NASA Astrophysics Data System (ADS)

    Escobar, Rodrigo; Akopian, David; Boppana, Rajendra

    2015-03-01

    Remote health monitoring systems involve energy-constrained devices, such as sensors and mobile gateways. Current data formats for communication of health data, such as DICOM and HL7, were not designed for multi-sensor applications or to enable the management of power-constrained devices in health monitoring processes. In this paper, a data format suitable for collection of multiple sensor data, including readings and other operational parameters is presented. By using the data format, the system management can assess energy consumptions and plan realistic monitoring scenarios. The proposed data format not only outperforms other known data formats in terms of readability, flexibility, interoperability and validation of compliant documents, but also enables energy assessment capability for realistic data collection scenarios and maintains or even reduces the overhead introduced due to formatting. Additionally, we provide analytical methods to estimate incremental energy consumption by various sensors and experiments to measure the actual battery drain on smartphones.

  1. Neuroprotection in a Novel Mouse Model of Multiple Sclerosis

    PubMed Central

    Lidster, Katie; Jackson, Samuel J.; Ahmed, Zubair; Munro, Peter; Coffey, Pete; Giovannoni, Gavin; Baker, Mark D.; Baker, David

    2013-01-01

    Multiple sclerosis is an immune-mediated, demyelinating and neurodegenerative disease that currently lacks any neuroprotective treatments. Innovative neuroprotective trial designs are required to hasten the translational process of drug development. An ideal target to monitor the efficacy of strategies aimed at treating multiple sclerosis is the visual system, which is the most accessible part of the human central nervous system. A novel C57BL/6 mouse line was generated that expressed transgenes for a myelin oligodendrocyte glycoprotein-specific T cell receptor and a retinal ganglion cell restricted-Thy1 promoter-controlled cyan fluorescent protein. This model develops spontaneous or induced optic neuritis, in the absence of paralytic disease normally associated with most rodent autoimmune models of multiple sclerosis. Demyelination and neurodegeneration could be monitored longitudinally in the living animal using electrophysiology, visual sensitivity, confocal scanning laser ophthalmoscopy and optical coherence tomography all of which are relevant to human trials. This model offers many advantages, from a 3Rs, economic and scientific perspective, over classical experimental autoimmune encephalomyelitis models that are associated with substantial suffering of animals. Optic neuritis in this model led to inflammatory damage of axons in the optic nerve and subsequent loss of retinal ganglion cells in the retina. This was inhibited by the systemic administration of a sodium channel blocker (oxcarbazepine) or intraocular treatment with siRNA targeting caspase-2. These novel approaches have relevance to the future treatment of neurodegeneration of MS, which has so far evaded treatment. PMID:24223903

  2. Regression-based pediatric norms for the brief visuospatial memory test: revised and the symbol digit modalities test.

    PubMed

    Smerbeck, A M; Parrish, J; Yeh, E A; Hoogs, M; Krupp, Lauren B; Weinstock-Guttman, B; Benedict, R H B

    2011-04-01

    The Brief Visuospatial Memory Test - Revised (BVMTR) and the Symbol Digit Modalities Test (SDMT) oral-only administration are known to be sensitive to cerebral disease in adult samples, but pediatric norms are not available. A demographically balanced sample of healthy control children (N = 92) ages 6-17 was tested with the BVMTR and SDMT. Multiple regression analysis (MRA) was used to develop demographically controlled normative equations. This analysis provided equations that were then used to construct demographically adjusted z-scores for the BVMTR Trial 1, Trial 2, Trial 3, Total Learning, and Delayed Recall indices, as well as the SDMT total correct score. To demonstrate the utility of this approach, a comparison group of children with acute disseminated encephalomyelitis (ADEM) or multiple sclerosis (MS) were also assessed. We find that these visual processing tests discriminate neurological patients from controls. As the tests are validated in adult multiple sclerosis, they are likely to be useful in monitoring pediatric onset multiple sclerosis patients as they transition into adulthood.
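    A minimal sketch of the regression-based norming described above, assuming ordinary least squares on healthy-control data and a z-score formed from the residual standard deviation; the predictors, scores and sample values are illustrative only.

      import numpy as np

      # Healthy-control predictors (age in years, sex coded 0/1) and raw scores
      # (e.g. SDMT total correct); values are invented for illustration.
      X_controls = np.array([[8, 0], [10, 1], [12, 0], [14, 1], [16, 0], [17, 1]], dtype=float)
      y_controls = np.array([32.0, 40.0, 45.0, 52.0, 58.0, 61.0])

      # Fit the normative regression on controls.
      A = np.column_stack([np.ones(len(X_controls)), X_controls])
      coef, *_ = np.linalg.lstsq(A, y_controls, rcond=None)
      residual_sd = np.std(y_controls - A @ coef, ddof=A.shape[1])

      def adjusted_z(raw_score, age, sex):
          """Demographically adjusted z-score: (observed - predicted) / residual SD."""
          predicted = float(coef @ np.array([1.0, age, sex]))
          return (raw_score - predicted) / residual_sd

      # A patient's score is then interpreted relative to demographically similar controls.
      print(round(adjusted_z(38.0, age=12, sex=0), 2))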

  3. Autonomous Performance Monitoring System: Monitoring and Self-Tuning (MAST)

    NASA Technical Reports Server (NTRS)

    Peterson, Chariya; Ziyad, Nigel A.

    2000-01-01

    Maintaining the long-term performance of software onboard a spacecraft can be a major factor in the cost of operations. In particular, the task of controlling and maintaining a future mission of distributed spacecraft will undoubtedly pose a great challenge, since the complexity of multiple spacecraft flying in formation grows rapidly as the number of spacecraft in the formation increases. Eventually, new approaches will be required to develop viable control systems that can handle the complexity of the data and that are flexible, reliable and efficient. In this paper we propose a methodology that aims to maintain the accuracy of flight software while reducing the computational complexity of software tuning tasks. The proposed Monitoring and Self-Tuning (MAST) method consists of two parts: a flight software monitoring algorithm and a tuning algorithm. The dependency on the software being monitored is mostly contained in the monitoring process, while the tuning process is a generic algorithm independent of detailed knowledge of the software. This architecture will enable MAST to be applicable to different onboard software controlling various dynamics of the spacecraft, such as attitude self-calibration and formation control. An advantage of MAST over conventional techniques, such as filtering or batch least squares, is that the tuning algorithm uses a machine learning approach to handle uncertainty in the problem domain, reducing the overall computational complexity. The underlying concept of this technique is a reinforcement learning scheme based on cumulative probability generated by the historical performance of the system. The success of MAST will depend heavily on the reinforcement scheme used in the tuning algorithm, which guarantees that tuning solutions exist.

  4. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors that consequently leads to a "Big Data" problem. The main objective of this study is to explore efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery with potential to apply the platform for a larger scale (e.g. country level) and multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (~28,100 km2 and ~1.0 Mha of cropland). Though this study does not involve large volumes of data, it does address efficiency of the GEE platform to effectively execute complex workflows of satellite data processing required with large scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to the benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that GEE provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed support vector machine (SVM), decision tree and random forest classifiers available in GEE.
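    For readers unfamiliar with the platform, a minimal Google Earth Engine Python API sketch of the workflow described (multi-temporal composite, sampling at labelled points, training one of the built-in classifiers) is given below; the asset IDs, band names and the 'class' property are examples, not the assets used in the study.

      import ee

      ee.Initialize()

      # Illustrative region and training data; the study used the JECAM Kyiv
      # test site and field-survey polygons instead.
      region = ee.Geometry.Rectangle([30.0, 50.0, 31.5, 51.0])
      training_points = ee.FeatureCollection("users/example/crop_training_2013")  # hypothetical asset

      bands = ["SR_B2", "SR_B3", "SR_B4", "SR_B5", "SR_B6", "SR_B7"]
      composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")   # example collection ID
                   .filterBounds(region)
                   .filterDate("2013-04-01", "2013-10-31")
                   .median()
                   .select(bands))

      # Sample the composite at labelled points and train a random forest,
      # one of several classifiers available in GEE.
      samples = composite.sampleRegions(collection=training_points,
                                        properties=["class"], scale=30)
      classifier = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
          features=samples, classProperty="class", inputProperties=bands)

      crop_map = composite.classify(classifier)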

  5. The topography of demyelination and neurodegeneration in the multiple sclerosis brain

    PubMed Central

    Haider, Lukas; Hametner, Simon; Höftberger, Romana; Bagnato, Francesca; Grabner, Günther; Trattnig, Siegfried; Pfeifenbring, Sabine; Brück, Wolfgang

    2016-01-01

    Multiple sclerosis is a chronic inflammatory disease with primary demyelination and neurodegeneration in the central nervous system. In our study we analysed demyelination and neurodegeneration in a large series of multiple sclerosis brains and provide a map displaying how frequently different brain areas are affected by these processes. Demyelination in the cerebral cortex was related to inflammatory infiltrates in the meninges, which was pronounced in invaginations of the brain surface (sulci) and possibly promoted by low flow of the cerebrospinal fluid in these areas. Focal demyelinated lesions in the white matter occurred at sites with high venous density and additionally accumulated in watershed areas of low arterial blood supply. Two different patterns of neurodegeneration in the cortex were identified: oxidative injury of cortical neurons and retrograde neurodegeneration due to axonal injury in the white matter. While oxidative injury was related to the inflammatory process in the meninges and pronounced in actively demyelinating cortical lesions, retrograde degeneration was mainly related to demyelinated lesions and axonal loss in the white matter. Our data show that accumulation of lesions and neurodegeneration in the multiple sclerosis brain does not affect all brain regions equally; this provides the pathological basis for selecting brain areas to monitor regional injury and atrophy development in future magnetic resonance imaging studies. PMID:26912645

  6. A Study of the Nature of Information Needed by Women with Multiple Sclerosis.

    ERIC Educational Resources Information Center

    Baker, Lynda M.

    1996-01-01

    Women with MS (Multiple Sclerosis), classified by the Miller Behavioral Style Scale as actively seeking information (monitors) or rejecting information (blunters), assessed pamphlets pertaining to the disease. More monitors than blunters rated the pamphlet relevant, regardless of the nature of the information. (PEN)

  7. SOUND SURVEY DESIGNS CAN FACILITATE INTEGRATING STREAM MONITORING DATA ACROSS MULTIPLE PROGRAMS

    EPA Science Inventory

    Multiple agencies in the Pacific Northwest monitor the condition of stream networks or their watersheds. Some agencies use a stream "network" perspective to report on the fraction or length of the network that either meets or violates particular criteria. Other agencies use a "wa...

  8. A Systematic Analysis of 2 Monoisocentric Techniques for the Treatment of Multiple Brain Metastases.

    PubMed

    Narayanasamy, Ganesh; Stathakis, Sotirios; Gutierrez, Alonso N; Pappas, Evangelos; Crownover, Richard; Floyd, John R; Papanikolaou, Niko

    2017-10-01

    In this treatment planning study, we compare the plan quality and delivery parameters for the treatment of multiple brain metastases using 2 monoisocentric techniques: the Multiple Metastases Element from Brainlab and the RapidArc volumetric-modulated arc therapy from Varian Medical Systems. Eight patients who were treated in our institution for multiple metastases (3-7 lesions) were replanned with Multiple Metastases Element using noncoplanar dynamic conformal arcs. The same patients were replanned with the RapidArc technique in Eclipse using 4 noncoplanar arcs. Both techniques were designed using a single isocenter. Plan quality metrics (conformity index, homogeneity index, gradient index, and R50%), monitor unit, and the planning time were recorded. Comparison of the Multiple Metastases Element and RapidArc plans was performed using Shapiro-Wilk test, paired Student t test, and Wilcoxon signed rank test. A paired Wilcoxon signed rank test between Multiple Metastases Element and RapidArc showed comparable plan quality metrics and dose to brain. Mean ± standard deviation values of conformity index were 1.8 ± 0.7 and 1.7 ± 0.6, homogeneity index were 1.3 ± 0.1 and 1.3 ± 0.1, gradient index were 5.0 ± 1.8 and 5.1 ± 1.9, and R50% were 4.9 ± 1.8 and 5.0 ± 1.9 for Multiple Metastases Element and RapidArc plans, respectively. Mean brain dose was 2.3 and 2.7 Gy for Multiple Metastases Element and RapidArc plans, respectively. The mean value of monitor units in Multiple Metastases Element plan was 7286 ± 1065, which is significantly lower than the RapidArc monitor units of 9966 ± 1533 (P < .05). For the planning of multiple brain lesions to be treated with stereotactic radiosurgery, Multiple Metastases Element planning software produced equivalent conformity, homogeneity, dose falloff, and brain V12 Gy but required significantly lower monitor units, when compared to RapidArc plans.
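    A minimal sketch of the statistical comparison named in the abstract (Shapiro-Wilk normality check on paired differences, then a paired t test or Wilcoxon signed-rank test); the monitor unit values below are placeholders, not the study data.

      from scipy import stats

      # Per-patient monitor units for the two techniques (illustrative values only).
      mu_mme = [7286, 6500, 8100, 7900, 6800, 7200, 7500, 8000]
      mu_rapidarc = [9966, 9100, 11000, 10400, 9300, 9800, 10100, 10900]

      differences = [a - b for a, b in zip(mu_mme, mu_rapidarc)]

      # Shapiro-Wilk on the paired differences decides which paired test to report.
      _, p_normal = stats.shapiro(differences)
      if p_normal > 0.05:
          stat, p_value = stats.ttest_rel(mu_mme, mu_rapidarc)   # paired Student t test
      else:
          stat, p_value = stats.wilcoxon(mu_mme, mu_rapidarc)    # Wilcoxon signed-rank test

      print(round(p_normal, 3), round(p_value, 4))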

  9. A Systematic Analysis of 2 Monoisocentric Techniques for the Treatment of Multiple Brain Metastases

    PubMed Central

    Stathakis, Sotirios; Gutierrez, Alonso N.; Pappas, Evangelos; Crownover, Richard; Floyd, John R.; Papanikolaou, Niko

    2016-01-01

    Background: In this treatment planning study, we compare the plan quality and delivery parameters for the treatment of multiple brain metastases using 2 monoisocentric techniques: the Multiple Metastases Element from Brainlab and the RapidArc volumetric-modulated arc therapy from Varian Medical Systems. Methods: Eight patients who were treated in our institution for multiple metastases (3-7 lesions) were replanned with Multiple Metastases Element using noncoplanar dynamic conformal arcs. The same patients were replanned with the RapidArc technique in Eclipse using 4 noncoplanar arcs. Both techniques were designed using a single isocenter. Plan quality metrics (conformity index, homogeneity index, gradient index, and R50%), monitor unit, and the planning time were recorded. Comparison of the Multiple Metastases Element and RapidArc plans was performed using Shapiro-Wilk test, paired Student t test, and Wilcoxon signed rank test. Results: A paired Wilcoxon signed rank test between Multiple Metastases Element and RapidArc showed comparable plan quality metrics and dose to brain. Mean ± standard deviation values of conformity index were 1.8 ± 0.7 and 1.7 ± 0.6, homogeneity index were 1.3 ± 0.1 and 1.3 ± 0.1, gradient index were 5.0 ± 1.8 and 5.1 ± 1.9, and R50% were 4.9 ± 1.8 and 5.0 ± 1.9 for Multiple Metastases Element and RapidArc plans, respectively. Mean brain dose was 2.3 and 2.7 Gy for Multiple Metastases Element and RapidArc plans, respectively. The mean value of monitor units in Multiple Metastases Element plan was 7286 ± 1065, which is significantly lower than the RapidArc monitor units of 9966 ± 1533 (P < .05). Conclusion: For the planning of multiple brain lesions to be treated with stereotactic radiosurgery, Multiple Metastases Element planning software produced equivalent conformity, homogeneity, dose falloff, and brain V12 Gy but required significantly lower monitor units, when compared to RapidArc plans. PMID:27612917

  10. Continuous gravimetric monitoring as an integrative tool for exploring hydrological processes in the Lomme Karst System (Belgium)

    NASA Astrophysics Data System (ADS)

    Watlet, A.; Van Camp, M. J.; Poulain, A.; Hallet, V.; Rochez, G.; Quinif, Y.; Meus, P.; Kaufmann, O.; Francis, O.

    2016-12-01

    Karst systems are highly heterogeneous which makes their hydrology difficult to understand. Geophysical techniques offer non-invasive and integrative methods that help interpreting such systems as a whole. Among these techniques, gravimetry has been increasingly used in the last decade to characterize the hydrological behavior of complex systems, e.g. karst environments or volcanoes. We present a continuous microgravimetric monitoring of 3 years in the karstic area of Rochefort (Belgium), which shows multiple occurrences of caves and karstic features. The gravity record includes measurements of a GWR superconducting gravimeter, a Micro-g LaCoste gPhone and an absolute FG5 gravimeter. Together with meteorological measurements and a surface/in-cave hydrogeological monitoring, we were able to improve the knowledge of hydrological processes. On the one hand, the data allowed identifying seasonal groundwater content changes in the unsaturated zone of the karst area, most likely linked to temporary groundwater storage occurring in the most karstified layers close to the surface. Combined with additional geological information, modelling of the gravity signal based on the vertical potential of the gravitational attraction was then particularly useful to estimate the seasonal recharge leading to the temporary subsurface groundwater storage. On the other hand, the gravity monitoring of flash floods occurring in deeper layers after intense rainfall events provided information on the effective porosity gradient of the limestones. Modelling was then helpful to identify the hydrogeological role played by the cave galleries with respect to the hosting limestones during flash floods. These results are also compared with measurements of an in-cave gravimetric monitoring performed with a gPhone spring gravimeter. An Electrical Resistivity Tomography monitoring is also conducted at the site and brings additional information useful to verify the interpretation made with the gravimetric monitoring.

  11. Building capacity in biodiversity monitoring at the global scale

    USGS Publications Warehouse

    Schmeller, Dirk S.; Bohm, Monika; Arvanitidis, Christos; Barber-Meyer, Shannon; Brummitt, Neil; Chandler, Mark; Chatzinikolaou, Eva; Costello, Mark J.; Ding, Hui; García-Moreno, Jaime; Gill, Michael J.; Haase, Peter; Jones, Miranda; Juillard, Romain; Magnusson, William E.; Martin, Corinne S.; McGeoch, Melodie A.; Mihoub, Jean-Baptiste; Pettorelli, Nathalie; Proença, Vânia; Peng, Cui; Regan, Eugenie; Schmiedel, Ute; Simsika, John P.; Weatherdon, Lauren; Waterman, Carly; Xu, Haigen; Belnap, Jayne

    2017-01-01

    Human-driven global change is causing ongoing declines in biodiversity worldwide. In order to address these declines, decision-makers need accurate assessments of the status of and pressures on biodiversity. However, these are heavily constrained by incomplete and uneven spatial, temporal and taxonomic coverage. For instance, data from regions such as Europe and North America are currently used overwhelmingly for large-scale biodiversity assessments due to lesser availability of suitable data from other, more biodiversity-rich, regions. These data-poor regions are often those experiencing the strongest threats to biodiversity, however. There is therefore an urgent need to fill the existing gaps in global biodiversity monitoring. Here, we review current knowledge on best practice in capacity building for biodiversity monitoring and provide an overview of existing means to improve biodiversity data collection considering the different types of biodiversity monitoring data. Our review comprises insights from work in Africa, South America, Polar Regions and Europe; in government-funded, volunteer and citizen-based monitoring in terrestrial, freshwater and marine ecosystems. The key steps to effectively building capacity in biodiversity monitoring are: identifying monitoring questions and aims; identifying the key components, functions, and processes to monitor; identifying the most suitable monitoring methods for these elements, carrying out monitoring activities; managing the resultant data; and interpreting monitoring data. Additionally, biodiversity monitoring should use multiple approaches including extensive and intensive monitoring through volunteers and professional scientists but also harnessing new technologies. Finally, we call on the scientific community to share biodiversity monitoring data, knowledge and tools to ensure the accessibility, interoperability, and reporting of biodiversity data at a global scale.

  12. Adhesive Defect Monitoring of Glass Fiber Epoxy Plate Using an Impedance-Based Non-Destructive Testing Method for Multiple Structures

    PubMed Central

    Na, Wongi S.; Baek, Jongdae

    2017-01-01

    The emergence of composite materials has revolutionized the approach to building engineering structures. With the number of applications for composites increasing every day, maintaining structural integrity is of utmost importance. For composites, adhesive bonding is usually the preferred choice over the mechanical fastening method, and monitoring for delamination is an essential factor in the field of composite materials. In this study, a non-destructive method known as the electromechanical impedance method is used to monitor multiple areas by assigning a specific frequency range to each test specimen. Experiments are conducted using various numbers of stacks created by attaching glass fiber epoxy composite plates onto one another, and two different debonding damage types are introduced to evaluate the performance of the multiple monitoring electromechanical impedance method. PMID:28629194

  13. ASCS online fault detection and isolation based on an improved MPCA

    NASA Astrophysics Data System (ADS)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

    Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, inefficient subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the amount of subspace information that must be stored. The MPCA model and the knowledge base are built on the new subspace. Then, fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling T2 statistic are also realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of different variables. For subspace fault isolation based on the T2 statistic, the relationship between the statistic indicator and state variables is constructed, and constraint conditions are presented to check the validity of fault isolation. Then, to improve the robustness of fault isolation to unexpected disturbances, a statistical method is adopted to relate single and multiple subspaces and increase the rate of correct fault isolation. Finally, fault detection and isolation based on the improved MPCA are used to monitor the automatic shift control system (ASCS) to demonstrate the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method to reduce the required storage capacity and improve the robustness of the principal component model, and establishes the relationship between state variables and fault detection indicators for fault isolation.
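    The paper's batch unfolding and kernel-density subspace construction are not reproduced here, but the two monitoring statistics it relies on are standard; a plain PCA-based sketch of the Hotelling T2 and SPE (Q) calculations, on synthetic data, is shown below.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(200, 10))      # normal-operation reference data (synthetic)
      x_new = rng.normal(size=10) + 0.5         # new observation to monitor

      mean = X_train.mean(axis=0)
      pca = PCA(n_components=3).fit(X_train - mean)

      def monitoring_statistics(x):
          """Hotelling T2 in the retained subspace and SPE in the residual space."""
          t = pca.transform((x - mean).reshape(1, -1)).ravel()      # component scores
          t2 = float(np.sum(t**2 / pca.explained_variance_))        # Hotelling T2
          x_hat = pca.inverse_transform(t.reshape(1, -1)).ravel() + mean
          spe = float(np.sum((x - x_hat) ** 2))                     # squared prediction error
          return t2, spe

      # Per-variable residual contributions (x - x_hat)**2 would then point to the
      # variables driving an SPE alarm, as in the residual contribution analysis above.
      print(monitoring_statistics(x_new))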

  14. Nonconventional MRI biomarkers for in vivo monitoring of pathogenesis in multiple sclerosis.

    PubMed

    Londoño, Ana C; Mora, Carlos A

    2014-12-01

    To date, biomarkers based on nonconventional MRI have not been standardized for diagnosis and follow-up of patients with multiple sclerosis (MS). The sequential monitoring of pathogenesis in MS by imaging of the normal appearing brain tissue is an important research tool in understanding the early stages of MS. In this review, we focus on the importance of deciphering the physiopathogenesis of the disease cascade in vivo based on imaging biomarkers that allow a correlation with immunohistochemistry and molecular biology findings in order to provide earlier clinical diagnosis and better individualization of treatment and follow-up in patients with MS. Among the nonconventional imaging techniques available, we remark on the importance of proton magnetic resonance spectroscopy imaging because of its ability to assist in the simultaneous evaluation of different events in the pathogenesis of MS that cannot be determined by conventional MRI. Nonconventional MRI and the use of novel contrast agents are expected to elucidate the process of neuroinflammation and excitotoxicity in vivo that characterizes MS, thus leading to more specific neuroprotective and immunomodulatory therapies and reducing progression toward disability.

  15. Simultaneous Profiling of Lysoglycerophospholipids in Rice (Oryza sativa L.) Using Direct Infusion-Tandem Mass Spectrometry with Multiple Reaction Monitoring.

    PubMed

    Lim, Dong Kyu; Mo, Changyeun; Long, Nguyen Phuoc; Kim, Giyoung; Kwon, Sung Won

    2017-03-29

    White rice is the final product after the hull and bran layers have been removed during the milling process. Although lysoglycerophospholipids (lysoGPLs) only occupy a small proportion in white rice, they are essential for evaluating rice authenticity and quality. In this study, we developed a high-throughput and targeted lipidomics approach that involved direct infusion-tandem mass spectrometry with multiple reaction monitoring to simultaneously profile lysoGPLs in white rice. The method is capable of characterizing 17 lysoGPLs within 1 min. In addition, unsupervised and supervised analyses exhibited a considerably large diversity of lysoGPL concentrations in white rice from different origins. In particular, a classification model was built using identified lysoGPLs that can differentiate white rice from Korea, China, and Japan. Among the discriminatory lysoGPLs, for the lysoPE(16:0) and lysoPE(18:2) compositions, there were relatively small within-group variations, and they were considerably different among the three countries. In conclusion, our proposed method provides a rapid, high-throughput, and comprehensive format for profiling lysoGPLs in rice samples.

  16. A new approach to power quality and electricity reliability monitoring-case study illustrations of the capabilities of the I-GridTM system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Divan, Deepak; Brumsickle, William; Eto, Joseph

    2003-04-01

    This report describes a new approach for collecting information on power quality and reliability and making it available in the public domain. Making this information readily available in a form that is meaningful to electricity consumers is necessary for enabling more informed private and public decisions regarding electricity reliability. The system dramatically reduces the cost (and expertise) needed for customers to obtain information on the most significant power quality events, called voltage sags and interruptions. The system also offers widespread access to information on power quality collected from multiple sites and the potential for capturing information on the impacts of power quality problems, together enabling a wide variety of analysis and benchmarking to improve system reliability. Six case studies demonstrate selected functionality and capabilities of the system, including: linking measured power quality events to process interruption and downtime; demonstrating the ability to correlate events recorded by multiple monitors to narrow and confirm the causes of power quality events; and benchmarking power quality and reliability on a firm and regional basis.

  17. Design and Weighting Methods for a Nationally Representative Sample of HIV-infected Adults Receiving Medical Care in the United States-Medical Monitoring Project

    PubMed Central

    Iachan, Ronaldo; H. Johnson, Christopher; L. Harding, Richard; Kyle, Tonja; Saavedra, Pedro; L. Frazier, Emma; Beer, Linda; L. Mattson, Christine; Skarbinski, Jacek

    2016-01-01

    Background: Health surveys of the general US population are inadequate for monitoring human immunodeficiency virus (HIV) infection because the relatively low prevalence of the disease (<0.5%) leads to small subpopulation sample sizes. Objective: To collect a nationally and locally representative probability sample of HIV-infected adults receiving medical care to monitor clinical and behavioral outcomes, supplementing the data in the National HIV Surveillance System. This paper describes the sample design and weighting methods for the Medical Monitoring Project (MMP) and provides estimates of the size and characteristics of this population. Methods: To develop a method for obtaining valid, representative estimates of the in-care population, we implemented a cross-sectional, three-stage design that sampled 23 jurisdictions, then 691 facilities, then 9,344 HIV patients receiving medical care, using probability-proportional-to-size methods. The data weighting process followed standard methods, accounting for the probabilities of selection at each stage and adjusting for nonresponse and multiplicity. Nonresponse adjustments accounted for differing response at both facility and patient levels. Multiplicity adjustments accounted for visits to more than one HIV care facility. Results: MMP used a multistage stratified probability sampling design that was approximately self-weighting in each of the 23 project areas and nationally. The probability sample represents the estimated 421,186 HIV-infected adults receiving medical care during January through April 2009. Methods were efficient (i.e., induced small, unequal weighting effects and small standard errors for a range of weighted estimates). Conclusion: The information collected through MMP allows monitoring trends in clinical and behavioral outcomes and informs resource allocation for treatment and prevention activities. PMID:27651851
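    A minimal numerical sketch of the weighting steps described (base weight from the three-stage selection probabilities, a nonresponse adjustment within a weighting class, and a multiplicity adjustment for patients attending more than one facility); all probabilities and counts are invented.

      import numpy as np

      # Stage selection probabilities for three responding patients (illustrative).
      p_jurisdiction = np.array([0.60, 0.60, 0.45])
      p_facility     = np.array([0.20, 0.35, 0.25])
      p_patient      = np.array([0.10, 0.08, 0.12])

      # Base weight = inverse of the overall probability of selection.
      base_weight = 1.0 / (p_jurisdiction * p_facility * p_patient)

      # Nonresponse adjustment within a weighting class (a single class here):
      # respondent weights are inflated by the inverse of the class response rate.
      response_rate = 0.76
      nr_adjusted = base_weight / response_rate

      # Multiplicity adjustment: down-weight patients who could have been sampled
      # through more than one HIV care facility during the reference period.
      n_facilities_visited = np.array([1, 2, 1])
      final_weight = nr_adjusted / n_facilities_visited

      print(np.round(final_weight, 1))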

  18. Fiber-Optic Surface Temperature Sensor Based on Modal Interference.

    PubMed

    Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc

    2016-07-28

    Spatially-integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation, by means of cost-effective image-processing of few-mode interference patterns. New developments in the field of sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on the image processing of interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system, consisting of an infrared camera and precision temperature probe. As a result, the optical technique is in line with the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: low heat capacity temperature measurement errors, easier spatial deployment, and replacement of multiple angle infrared camera shooting and the continuous monitoring of surfaces that are not visually accessible.

  19. Use of structured decision making to identify monitoring variables and management priorities for salt marsh ecosystems

    USGS Publications Warehouse

    Neckles, Hilary A.; Lyons, James E.; Guntenspergen, Glenn R.; Shriver, W. Gregory; Adamowicz, Susan C.

    2015-01-01

    Most salt marshes in the USA have been degraded by human activities, and coastal managers are faced with complex choices among possible actions to restore or enhance ecosystem integrity. We applied structured decision making (SDM) to guide selection of monitoring variables and management priorities for salt marshes within the National Wildlife Refuge System in the northeastern USA. In general, SDM is a systematic process for decomposing a decision into its essential elements. We first engaged stakeholders in clarifying regional salt marsh decision problems, defining objectives and attributes to evaluate whether objectives are achieved, and developing a pool of alternative management actions for achieving objectives. Through this process, we identified salt marsh attributes that were applicable to monitoring National Wildlife Refuges on a regional scale and that targeted management needs. We then analyzed management decisions within three salt marsh units at Prime Hook National Wildlife Refuge, coastal Delaware, as a case example of prioritizing management alternatives. Values for salt marsh attributes were estimated from 2 years of baseline monitoring data and expert opinion. We used linear value modeling to aggregate multiple attributes into a single performance score for each alternative, constrained optimization to identify alternatives that maximized total management benefits subject to refuge-wide cost constraints, and used graphical analysis to identify the optimal set of alternatives for the refuge. SDM offers an efficient, transparent approach for integrating monitoring into management practice and improving the quality of management decisions.
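    A small sketch of the decision-analytic core described above: a linear value model aggregates weighted attribute scores into one performance score per alternative, and a budget-constrained search picks the best-scoring set. The attributes, weights, costs and budget are invented for illustration.

      from itertools import combinations

      # Candidate management alternatives: (name, cost, normalized attribute scores).
      weights = {"vegetation": 0.4, "nekton": 0.3, "marsh_birds": 0.3}
      alternatives = [
          ("restore tidal flow",    120, {"vegetation": 0.9, "nekton": 0.8, "marsh_birds": 0.6}),
          ("plug drainage ditches",  60, {"vegetation": 0.5, "nekton": 0.4, "marsh_birds": 0.7}),
          ("control invasives",      40, {"vegetation": 0.6, "nekton": 0.1, "marsh_birds": 0.3}),
      ]
      budget = 150

      def performance(scores):
          """Linear value model: weighted sum of normalized attribute scores."""
          return sum(weights[k] * v for k, v in scores.items())

      # Exhaustive search over subsets (fine for small decision problems): maximize
      # total benefit subject to the refuge-wide cost constraint.
      feasible = (s for r in range(len(alternatives) + 1)
                  for s in combinations(alternatives, r)
                  if sum(a[1] for a in s) <= budget)
      best = max(feasible, key=lambda s: sum(performance(a[2]) for a in s))
      print([a[0] for a in best], "total cost:", sum(a[1] for a in best))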

  20. A microfluidic device integrating dual CMOS polysilicon nanowire sensors for on-chip whole blood processing and simultaneous detection of multiple analytes.

    PubMed

    Kuan, Da-Han; Wang, I-Shun; Lin, Jiun-Rue; Yang, Chao-Han; Huang, Chi-Hsien; Lin, Yen-Hung; Lin, Chih-Ting; Huang, Nien-Tsu

    2016-08-02

    The hemoglobin-A1c test, measuring the ratio of glycated hemoglobin (HbA1c) to hemoglobin (Hb) levels, has been a standard assay in diabetes diagnosis that removes the day-to-day glucose level variation. Currently, the HbA1c test is restricted to hospitals and central laboratories due to the laborious, time-consuming whole blood processing and bulky instruments. In this paper, we have developed a microfluidic device integrating dual CMOS polysilicon nanowire sensors (MINS) for on-chip whole blood processing and simultaneous detection of multiple analytes. The micromachined polymethylmethacrylate (PMMA) microfluidic device consisted of a serpentine microchannel with multiple dam structures designed for non-lysed cells or debris trapping, uniform plasma/buffer mixing and dilution. The CMOS-fabricated polysilicon nanowire sensors integrated with the microfluidic device were designed for the simultaneous, label-free electrical detection of multiple analytes. Our study first measured the Hb and HbA1c levels in 11 clinical samples via these nanowire sensors. The results were compared with those of standard Hb and HbA1c measurement methods (Hb: the sodium lauryl sulfate hemoglobin detection method; HbA1c: cation-exchange high-performance liquid chromatography) and showed comparable outcomes. Finally, we successfully demonstrated the efficacy of the MINS device's on-chip whole blood processing followed by simultaneous Hb and HbA1c measurement in a clinical sample. Compared to current Hb and HbA1c sensing instruments, the MINS platform is compact and can simultaneously detect two analytes with only 5 μL of whole blood, which corresponds to a 300-fold blood volume reduction. The total assay time, including the in situ sample processing and analyte detection, was just 30 minutes. Based on its on-chip whole blood processing and simultaneous multiple analyte detection functionalities with a lower sample volume requirement and shorter process time, the MINS device can be effectively applied to real-time diabetes diagnostics and monitoring in point-of-care settings.

  1. Construction of EGFP-labeling system for visualizing the infection process of Xanthomonas axonopodis pv. citri in planta.

    PubMed

    Liu, Li-Ping; Deng, Zi-Niu; Qu, Jin-Wang; Yan, Jia-Wen; Catara, Vittoria; Li, Da-Zhi; Long, Gui-You; Li, Na

    2012-09-01

    Xanthomonas axonopodis pv. citri (Xac) is the causal agent of citrus bacterial canker, an economically important disease to the world citrus industry. To monitor the infection process of Xac in different citrus plants, an enhanced green fluorescent protein (EGFP) labeling system was constructed to visualize the propagation and localization of the pathogen in planta. First, the wild-type Xac was isolated from diseased leaves of the susceptible 'Bingtang' sweet orange, and the isolate was then labeled with EGFP by triparental mating. After PCR identification, the growth kinetics and pathogenicity of the transformants were analyzed in comparison with the wild-type Xac. The EGFP-labeled bacteria were inoculated by spraying onto the surface and by infiltration into the mesophyll of 'Bingtang' sweet orange leaves. Bacterial cell multiplication and diffusion were observed directly under a confocal laser scanning microscope at different intervals after inoculation. The results indicated that the EGFP-labeled Xac, which emitted clear green fluorescence under the fluorescence microscope, revealed the infection process and had the same pathogenicity to citrus as the wild type. Consequently, the labeled Xac can serve as an efficient tool for monitoring infection by this pathogen.

  2. An all-optronic synthetic aperture lidar

    NASA Astrophysics Data System (ADS)

    Turbide, Simon; Marchese, Linda; Terroux, Marc; Babin, François; Bergeron, Alain

    2012-09-01

    Synthetic Aperture Radar (SAR) is a mature technology that overcomes the diffraction limit of an imaging system's real aperture by taking advantage of the platform motion to coherently sample multiple sections of an aperture much larger than the physical one. Synthetic Aperture Lidar (SAL) is the extension of SAR to much shorter wavelengths (1.5 μm vs 5 cm). This new technology can offer higher resolution images in day or night time as well as in certain adverse conditions. It could be a powerful tool for Earth monitoring (ship detection, frontier surveillance, ocean monitoring) from aircraft, unattended aerial vehicle (UAV) or spatial platforms. A continuous flow of high-resolution images covering large areas would however produce a large amount of data involving a high cost in terms of post-processing computational time. This paper presents a laboratory demonstration of a SAL system complete with image reconstruction based on optronic processing. This differs from the more traditional digital approach by its real-time processing capability. The SAL system is discussed and images obtained from a non-metallic diffuse target at ranges up to 3 m are shown, these images being processed by a real-time optronic SAR processor originally designed to reconstruct SAR images from ENVISAT/ASAR data.

  3. In-Line Monitoring of a Pharmaceutical Pan Coating Process by Optical Coherence Tomography.

    PubMed

    Markl, Daniel; Hannesschläger, Günther; Sacher, Stephan; Leitner, Michael; Buchsbaum, Andreas; Pescod, Russel; Baele, Thomas; Khinast, Johannes G

    2015-08-01

    This work demonstrates a new in-line measurement technique for monitoring the coating growth of randomly moving tablets in a pan coating process. In-line quality control is performed by an optical coherence tomography (OCT) sensor allowing nondestructive and contact-free acquisition of cross-section images of film coatings in real time. The coating thickness can be determined directly from these OCT images and no chemometric calibration models are required for quantification. Coating thickness measurements are extracted from the images by a fully automated algorithm. Results of the in-line measurements are validated using off-line OCT images, thickness calculations from tablet dimension measurements, and weight gain measurements. Validation measurements are performed on sample tablets periodically removed from the process during production. Reproducibility of the results is demonstrated by three batches produced under the same process conditions. OCT enables a multiple direct measurement of the coating thickness on individual tablets rather than providing the average coating thickness of a large number of tablets. This gives substantially more information about the coating quality, that is, intra- and intertablet coating variability, than standard quality control methods. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
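    A simplified, hedged sketch of one step implied by the abstract: extracting coating thickness from a single OCT depth profile (A-scan) by locating the air-coating and coating-core interface peaks and converting the optical path difference with the coating refractive index. The A-scan is synthetic and the refractive index is an assumed value.

      import numpy as np
      from scipy.signal import find_peaks

      # Synthetic A-scan with two interface reflections plus noise.
      z = np.arange(0, 400.0, 0.5)                          # optical depth axis (micrometres)
      rng = np.random.default_rng(1)
      a_scan = (np.exp(-((z - 100.0) / 3.0) ** 2)           # air/coating surface reflection
                + 0.6 * np.exp(-((z - 175.0) / 3.0) ** 2)   # coating/core interface reflection
                + 0.02 * rng.normal(size=z.size))

      # Interface detection; a production algorithm would run this on every A-scan
      # of the in-line image stream and aggregate over the tablet surface.
      peaks, _ = find_peaks(a_scan, height=0.3, distance=20)
      optical_thickness = z[peaks[1]] - z[peaks[0]]          # optical path difference (um)

      n_coating = 1.5                                        # assumed group refractive index of the film
      physical_thickness = optical_thickness / n_coating
      print(round(physical_thickness, 1), "um")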

  4. Distributed torsion sensor based on cascaded coaxial cable Fabry-Perot interferometers

    NASA Astrophysics Data System (ADS)

    Cheng, Baokai; Zhu, Wenge; Hua, Liwei; Liu, Jie; Li, Yurong; Nygaard, Runar; Xiao, Hai

    2016-07-01

    Cascaded coaxial cable Fabry-Perot interferometers (FPI) are studied and demonstrated for distributed torsion measurement. Multiple weak reflectors are implemented on a coaxial cable so that any two consecutive reflectors form a Fabry-Perot cavity. By fixing the cable sensor in a helical form on a shaft, the distributed torsion of the shaft can be measured by the cascaded Fabry-Perot cavities. A test on a single section shows that the sensor has a linear response with a sensitivity of 1.834 MHz per rad/m over a twist rate range of 0 to 8.726 rad/m. The distributed torsion sensing capability is useful in drilling process monitoring, structural health monitoring and machine failure detection.

  5. Getting the Bigger Picture With Digital Surveillance

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Through a Space Act Agreement, Diebold, Inc., acquired the exclusive rights to Glenn Research Center's patented video observation technology, originally designed to accelerate video image analysis for various ongoing and future space applications. Diebold implemented the technology into its AccuTrack digital, color video recorder, a state-of- the-art surveillance product that uses motion detection for around-the- clock monitoring. AccuTrack captures digitally signed images and transaction data in real-time. This process replaces the onerous tasks involved in operating a VCR-based surveillance system, and subsequently eliminates the need for central viewing and tape archiving locations altogether. AccuTrack can monitor an entire bank facility, including four automated teller machines, multiple teller lines, and new account areas, all from one central location.

  6. Imaging of dynamic ion signaling during root gravitropism.

    PubMed

    Monshausen, Gabriele B

    2015-01-01

    Gravitropic signaling is a complex process that requires the coordinated action of multiple cell types and tissues. Ca(2+) and pH signaling are key components of gravitropic signaling cascades and can serve as useful markers to dissect the molecular machinery mediating plant gravitropism. To monitor dynamic ion signaling, imaging approaches combining fluorescent ion sensors and confocal fluorescence microscopy are employed, which allow the visualization of pH and Ca(2+) changes at the level of entire tissues, while also providing high spatiotemporal resolution. Here, I describe procedures to prepare Arabidopsis seedlings for live cell imaging and to convert a microscope for vertical stage fluorescence microscopy. With this imaging system, ion signaling can be monitored during all phases of the root gravitropic response.

  7. Multitasking: Effects of processing multiple auditory feature patterns

    PubMed Central

    Miller, Tova; Chen, Sufen; Lee, Wei Wei; Sussman, Elyse S.

    2016-01-01

    ERPs and behavioral responses were measured to assess how task-irrelevant sounds interact with task processing demands and affect the ability to monitor and track multiple sound events. Participants listened to four-tone sequential frequency patterns, and responded to frequency pattern deviants (reversals of the pattern). Irrelevant tone feature patterns (duration and intensity) and their respective pattern deviants were presented together with frequency patterns and frequency pattern deviants in separate conditions. Responses to task-relevant and task-irrelevant feature pattern deviants were used to test processing demands for irrelevant sound input. Behavioral performance was significantly better when there were no distracting feature patterns. Errors primarily occurred in response to the to-be-ignored feature pattern deviants. Task-irrelevant elicitation of ERP components was consistent with the error analysis, indicating a level of processing for the irrelevant features. Task-relevant elicitation of ERP components was consistent with behavioral performance, demonstrating a "cost" of performance when there were two feature patterns presented simultaneously. These results provide evidence that the brain tracked the irrelevant duration and intensity feature patterns, affecting behavioral performance. Overall, our results demonstrate that irrelevant informational streams are processed at a cost, which may be considered a type of multitasking: an ongoing, automatic processing of task-irrelevant sensory events. PMID:25939456

  8. Analysing the impact of multiple stressors in aquatic biomonitoring data: A 'cookbook' with applications in R.

    PubMed

    Feld, Christian K; Segurado, Pedro; Gutiérrez-Cánovas, Cayetano

    2016-12-15

    Multiple stressors threaten biodiversity and ecosystem integrity, imposing new challenges to ecosystem management and restoration. Ecosystem managers are required to address and mitigate the impact of multiple stressors, yet the knowledge required to disentangle multiple-stressor effects is still incomplete. Experimental studies have advanced the understanding of single and combined stressor effects, but there is a lack of a robust analytical framework to address the impact of multiple stressors based on monitoring data. Since 2000, the monitoring of Europe's waters has resulted in a vast amount of biological and environmental (stressor) data on about 120,000 water bodies. For many reasons, this data is rarely exploited in the multiple-stressor context, probably because of its rather heterogeneous nature: stressors vary and are mixed with broad-scale proxies of environmental stress (e.g. land cover), missing values and zero-inflated data limit the application of statistical methods, and biological indicators are often aggregated (e.g. taxon richness) and do not respond in a stressor-specific way. Here, we present a 'cookbook' to analyse the biological response to multiple stressors using data from biomonitoring schemes. Our 'cookbook' includes guidance for the analytical process and the interpretation of results. The 'cookbook' is accompanied by scripts, which allow the user to run a stepwise analysis based on his/her own data in R, an open-source language and environment for statistical computing and graphics. Using simulated and real data, we show that the recommended procedure is capable of identifying stressor hierarchy (importance) and interaction in large datasets. We recommend a minimum of 150 independent observations and a minimum stressor gradient length of 75% (of the most relevant stressor's gradient in nature) to be able to reliably rank the stressors' importance, detect relevant interactions and estimate their standardised effect size. We conclude with a brief discussion of the advantages and limitations of this protocol. Copyright © 2016 Elsevier B.V. All rights reserved.
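    The cookbook's scripts are written in R; the fragment below is only a rough Python analogue of one step in that workflow, ranking stressor importance and testing a pairwise interaction with standardized predictors, using invented variable names and simulated data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(42)
      n = 150  # the protocol's recommended minimum number of independent observations

      # Simulated monitoring data: two stressors and one biological response metric.
      df = pd.DataFrame({
          "nutrients": rng.normal(size=n),
          "fine_sediment": rng.normal(size=n),
      })
      df["taxon_richness"] = (10 - 2.0 * df["nutrients"] - 1.0 * df["fine_sediment"]
                              - 0.8 * df["nutrients"] * df["fine_sediment"]
                              + rng.normal(scale=1.5, size=n))

      # Standardize predictors so coefficients are comparable effect sizes, then fit
      # main effects plus their interaction.
      for col in ("nutrients", "fine_sediment"):
          df[col] = (df[col] - df[col].mean()) / df[col].std()

      model = smf.ols("taxon_richness ~ nutrients * fine_sediment", data=df).fit()
      print(model.params.sort_values())                    # larger |coefficient| = higher rank
      print(model.pvalues["nutrients:fine_sediment"])      # interaction term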

  9. Investigation of hair dye deposition, hair color loss, and hair damage during multiple oxidative dyeing and shampooing cycles.

    PubMed

    Zhang, Guojin; McMullen, Roger L; Kulcsar, Lidia

    2016-01-01

    Color fastness is a major concern for consumers and manufacturers of oxidative hair dye products. Hair dye loss results from multiple wash cycles in which the hair dye is dissolved by water and leaches from the hair shaft. In this study, we carried out a series of measurements to help us better understand the kinetics of the leaching process and pathways associated with its escape from the fiber. Hair dye leaching kinetics was measured by suspending hair in a dissolution apparatus and monitoring the dye concentration in solution (leached dye) with an ultraviolet-visible spectrophotometer. The physical state of dye deposited in hair fibers was evaluated by a reflectance light microscopy technique, based on image stacking, allowing enhanced depth of field imaging. The dye distribution within the fiber was monitored by infrared spectroscopic imaging of hair fiber cross sections. Damage to the ultrafine structure of the hair cuticle (surface, endocuticle, and cell membrane complex) and cortex (cell membrane complex) was determined in hair cross sections and on the hair fiber surface with atomic force microscopy. Using differential scanning calorimetry, we investigated how consecutive coloring and leaching processes affect the internal proteins of hair. Further, to probe the surface properties of hair we utilized contact angle measurements. This study was conducted on both pigmented and nonpigmented hair to gain insight into the influence of melanin on the hair dye deposition and leaching processes. Both types of hair were colored utilizing a commercial oxidative hair dye product based on pyrazole chemistry.
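    The abstract does not commit to a kinetic model for dye leaching; assuming simple first-order release, the dissolution time course could be fit as sketched below (the absorbance values, time points and fitted constants are invented).

      import numpy as np
      from scipy.optimize import curve_fit

      # Leached-dye absorbance versus soak time (minutes); invented data.
      t = np.array([0, 5, 10, 20, 40, 60, 90, 120], dtype=float)
      released = np.array([0.00, 0.08, 0.15, 0.26, 0.40, 0.48, 0.55, 0.58])

      def first_order(t, c_max, k):
          """Assumed first-order release: C(t) = C_max * (1 - exp(-k t))."""
          return c_max * (1.0 - np.exp(-k * t))

      (c_max, k), _ = curve_fit(first_order, t, released, p0=(0.6, 0.02))
      half_life = np.log(2) / k
      print(round(c_max, 3), round(k, 4), round(half_life, 1))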

  10. Neonatal non-contact respiratory monitoring based on real-time infrared thermography

    PubMed Central

    2011-01-01

    Background Monitoring of vital parameters is an important topic in neonatal daily care. Progress in computational intelligence and medical sensors has facilitated the development of smart bedside monitors that can integrate multiple parameters into a single monitoring system. This paper describes non-contact monitoring of neonatal vital signals based on infrared thermography as a new biomedical engineering application. One signal of clinical interest is the spontaneous respiration rate of the neonate. It will be shown that the respiration rate of neonates can be monitored by analysing the temperature profile of the anterior naris (nostrils) associated with the successive inspiration and expiration phases. Objective The aim of this study is to develop and investigate a new non-contact respiration monitoring modality for the neonatal intensive care unit (NICU) using infrared thermography imaging. This development includes the associated image processing (region of interest (ROI) detection) and further optimization so that this non-contact respiration monitoring can be considered a physiological measurement inside NICU wards. Results Continuous wavelet transformation based on a Daubechies wavelet function was applied to detect the breathing signal within an image stream. Respiration was successfully monitored based on a 0.3°C to 0.5°C temperature difference between the inspiration and expiration phases. Conclusions Although this method has been applied to adults before, this is the first time it has been used in a newborn infant population inside the neonatal intensive care unit (NICU). The promising results suggest including this technology in advanced NICU monitors. PMID:22243660
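
    The following sketch illustrates the general idea of extracting a respiration rate from a nostril-temperature trace with a continuous wavelet transform. The study used a Daubechies-based CWT; PyWavelets' CWT offers the Morlet wavelet, which is used here as a stand-in, and the frame rate, temperature amplitude and signal are synthetic assumptions.

```python
# Rough sketch: respiration rate from a nostril-temperature trace via a CWT.
# Morlet wavelet used as a stand-in for the Daubechies wavelet named in the study.
import numpy as np
import pywt

fs = 10.0                                   # thermal camera frame rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)                # one minute of data
resp_hz = 0.75                              # ~45 breaths/min, plausible for a neonate
temperature = 34.0 + 0.2 * np.sin(2 * np.pi * resp_hz * t) + 0.05 * np.random.randn(t.size)

scales = np.arange(2, 128)
coeffs, freqs = pywt.cwt(temperature - temperature.mean(), scales, "morl",
                         sampling_period=1 / fs)
power = (np.abs(coeffs) ** 2).mean(axis=1)  # average wavelet power per scale
dominant = freqs[np.argmax(power)]
print(f"estimated respiration rate: {dominant * 60:.1f} breaths/min")
```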

  11. Distributed Monitoring Infrastructure for Worldwide LHC Computing Grid

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Babik, M.; Bhatt, K.; Chand, P.; Collados, D.; Duggal, V.; Fuente, P.; Hayashi, S.; Imamagic, E.; Joshi, P.; Kalmady, R.; Karnani, U.; Kumar, V.; Lapka, W.; Quick, R.; Tarragon, J.; Teige, S.; Triantafyllidis, C.

    2012-12-01

    The journey of a monitoring probe from its development phase to the moment its execution result is presented in an availability report is a complex process. It goes through multiple phases such as development, testing, integration, release, deployment, execution, data aggregation, computation, and reporting. Further, it involves people with different roles (developers, site managers, VO[1] managers, service managers, management), from different middleware providers (ARC[2], dCache[3], gLite[4], UNICORE[5] and VDT[6]), consortiums (WLCG[7], EMI[11], EGI[15], OSG[13]), and operational teams (GOC[16], OMB[8], OTAG[9], CSIRT[10]). The seamless harmonization of these distributed actors is in daily use for monitoring of the WLCG infrastructure. In this paper we describe the monitoring of the WLCG infrastructure from the operational perspective. We explain the complexity of the journey of a monitoring probe from its execution on a grid node to the visualization on the MyWLCG[27] portal where it is exposed to other clients. This monitoring workflow profits from the interoperability established between the SAM[19] and RSV[20] frameworks. We show how these two distributed structures are capable of uniting technologies and hiding the complexity around them, making them easy for the community to use. Finally, the different supported deployment strategies, tailored not only for monitoring the entire infrastructure but also for monitoring sites and virtual organizations, are presented and the associated operational benefits highlighted.

  12. Monitoring Replication Protein A (RPA) dynamics in homologous recombination through site-specific incorporation of non-canonical amino acids

    PubMed Central

    Pokhrel, Nilisha; Origanti, Sofia; Davenport, Eric Parker; Gandhi, Disha; Kaniecki, Kyle; Mehl, Ryan A.; Greene, Eric C.; Dockendorff, Chris

    2017-01-01

    Abstract An essential coordinator of all DNA metabolic processes is Replication Protein A (RPA). RPA orchestrates these processes by binding to single-stranded DNA (ssDNA) and interacting with several other DNA binding proteins. Determining the real-time kinetics of single players such as RPA in the presence of multiple DNA processors to better understand the associated mechanistic events is technically challenging. To overcome this hurdle, we utilized non-canonical amino acids and bio-orthogonal chemistry to site-specifically incorporate a chemical fluorophore onto a single subunit of heterotrimeric RPA. Upon binding to ssDNA, this fluorescent RPA (RPAf) generates a quantifiable change in fluorescence, thus serving as a reporter of its dynamics on DNA in the presence of multiple other DNA binding proteins. Using RPAf, we describe the kinetics of facilitated self-exchange and exchange by Rad51 and mediator proteins during various stages in homologous recombination. RPAf is widely applicable for investigating the mechanism of action of RPA in processes such as DNA replication, repair and telomere maintenance. PMID:28934470

  13. Children's Comprehension Monitoring of Multiple Situational Dimensions of a Narrative

    ERIC Educational Resources Information Center

    Wassenburg, Stephanie I.; Beker, Katinka; van den Broek, Paul; van der Schoot, Menno

    2015-01-01

    Narratives typically consist of information on multiple aspects of a situation. In order to successfully create a coherent representation of the described situation, readers are required to monitor all these situational dimensions during reading. However, little is known about whether these dimensions differ in the ease with which they can be…

  14. Processing of meteorological data with ultrasonic thermoanemometers

    NASA Astrophysics Data System (ADS)

    Telminov, A. E.; Bogushevich, A. Ya.; Korolkov, V. A.; Botygin, I. A.

    2017-11-01

    The article describes a software system intended to support scientific research on the atmosphere by processing data gathered by multi-level ultrasonic complexes for automated monitoring of meteorological and turbulent parameters in the ground layer of the atmosphere. The system can process files containing data sets of instantaneous values of temperature, the three orthogonal components of wind speed, humidity and pressure. Processing is executed in multiple stages. During the first stage, the system executes the researcher's query for meteorological parameters. At the second stage, the system computes a series of standard statistical properties of the meteorological fields, such as averages, variance (dispersion), standard deviation, skewness (asymmetry coefficient), excess kurtosis, correlation, etc. The third stage prepares for computing the parameters of atmospheric turbulence. The computation results are displayed to the user and stored on the hard drive.
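
    A minimal sketch of that second processing stage, computing the listed statistics for synthetic sonic-temperature and vertical-wind series with NumPy/SciPy; the sampling rate and values are invented.

```python
# Minimal sketch of the second processing stage: standard statistics of
# synthetic meteorological time series (sonic temperature and vertical wind).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
temperature = 15.0 + 0.8 * rng.standard_normal(20_000)                  # deg C, 10 Hz data
w_wind = 0.1 * temperature - 1.4 + 0.5 * rng.standard_normal(20_000)    # m/s

for name, x in (("T", temperature), ("w", w_wind)):
    print(f"{name}: mean={x.mean():.3f}  var={x.var(ddof=1):.3f}  "
          f"std={x.std(ddof=1):.3f}  skew={stats.skew(x):.3f}  "
          f"excess kurtosis={stats.kurtosis(x):.3f}")

# correlation between temperature and vertical wind (related to the kinematic heat flux)
r = np.corrcoef(temperature, w_wind)[0, 1]
print(f"corr(T, w) = {r:.3f}")
```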

  15. Developments in Earth observation for the assessment and monitoring of inland, transitional, coastal and shelf-sea waters.

    PubMed

    Tyler, Andrew N; Hunter, Peter D; Spyrakos, Evangelos; Groom, Steve; Constantinescu, Adriana Maria; Kitchen, Jonathan

    2016-12-01

    The Earth's surface waters are a fundamental resource and encompass a broad range of ecosystems that are core to global biogeochemical cycling and food and energy production. Despite this, the Earth's surface waters are impacted by multiple natural and anthropogenic pressures and drivers of environmental change. The complex interaction between physical, chemical and biological processes in surface waters poses significant challenges for in situ monitoring and assessment and often limits our ability to adequately capture the dynamics of aquatic systems and our understanding of their status, functioning and response to pressures. Here we explore the opportunities that Earth observation (EO) has to offer to basin-scale monitoring of water quality over the surface water continuum comprising inland, transitional and coastal water bodies, with a particular focus on the Danube and Black Sea region. This review summarises the technological advances in EO and the opportunities that the next-generation satellites offer for water quality monitoring. We provide an overview of algorithms for the retrieval of water quality parameters and demonstrate how such models have been used for the assessment and monitoring of inland, transitional, coastal and shelf-sea systems. Further, we argue that very few studies have investigated the connectivity between these systems, especially in large river-sea systems such as the Danube-Black Sea. Subsequently, we describe current capability in operational processing of archive and near real-time satellite data. We conclude that the operational use of satellites for the assessment and monitoring of surface waters is still developing for inland and coastal waters, and that more work is required on the development and validation of remote sensing algorithms for these optically complex waters. Nevertheless, the potential that these data streams offer for developing an improved, potentially paradigm-shifting understanding of physical and biogeochemical processes across large-scale river-sea systems, including the Danube-Black Sea, is considerable. Copyright © 2016. Published by Elsevier B.V.

  16. Monitoring safety in a phase III real‐world effectiveness trial: use of novel methodology in the Salford Lung Study

    PubMed Central

    Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F.; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P.; McCrae, John; McCorkindale, Sheila; Leather, David

    2016-01-01

    Abstract Background The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once‐daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. Objective The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in‐depth exploration of the safety results will be the subject of future publications. Achievements The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Conclusion Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. PMID:27804174

  17. Monitoring safety in a phase III real-world effectiveness trial: use of novel methodology in the Salford Lung Study.

    PubMed

    Collier, Sue; Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P; McCrae, John; McCorkindale, Sheila; Leather, David

    2017-03-01

    The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd.

  18. Systems Analyze Water Quality in Real Time

    NASA Technical Reports Server (NTRS)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  19. Coupling functions for lead and lead-free neutron monitors from the latitudinal measurements performed in 1982 in the research station Academician Kurchatov

    NASA Technical Reports Server (NTRS)

    Alekanyan, T. M.; Dorman, L. I.; Yanke, V. G.; Korotkov, V. K.

    1985-01-01

    The latitudinal behavior of intensities and multiplicities was registered by the neutron monitor 2 NM and the lead-free neutron monitor 3 SND (slow-neutron detector) along the equator-Kaliningrad line in the Atlantic Ocean. The coupling coefficients for the 3 SND show that this detector is sensitive to primary cosmic ray particles with, on average, lower energies than those detected by the 2 NM. As multiplicities increase, the coupling coefficients shift towards higher energies.

  20. Parallel excitation-emission multiplexed fluorescence lifetime confocal microscopy for live cell imaging

    PubMed Central

    Zhao, Ming; Li, Yu; Peng, Leilei

    2014-01-01

    We present a novel excitation-emission multiplexed fluorescence lifetime microscopy (FLIM) method that surpasses current FLIM techniques in multiplexing capability. The method employs Fourier multiplexing to simultaneously acquire confocal fluorescence lifetime images of multiple excitation wavelength and emission color combinations at 44,000 pixels/sec. The system is built with low-cost CW laser sources and standard PMTs with versatile spectral configuration, which can be implemented as an add-on to commercial confocal microscopes. The Fourier lifetime confocal method allows fast multiplexed FLIM imaging, which makes it possible to monitor multiple biological processes in live cells. The low cost and compatibility with commercial systems could also make multiplexed FLIM more accessible to the biological research community. PMID:24921725

  1. Multiple channel optical data acquisition system

    DOEpatents

    Fasching, G.E.; Goff, D.R.

    1985-02-22

    A multiple channel optical data acquisition system is provided in which a plurality of remote sensors monitoring specific process variables are interrogated by means of a single optical fiber connecting the remote station/sensors to a base station. The remote station/sensors derive all power from light transmitted through the fiber from the base station. Each station/sensor is individually accessed by means of a light modulated address code sent over the fiber. The remote station/sensors use a single light emitting diode to both send and receive light signals to communicate with the base station and provide power for the remote station. The system described can power at least 100 remote station/sensors over an optical fiber one mile in length.

  2. Identification of delaminations in composite: structural health monitoring software based on spectral estimation and hierarchical genetic algorithm

    NASA Astrophysics Data System (ADS)

    Nag, A.; Mahapatra, D. Roy; Gopalakrishnan, S.

    2003-10-01

    A hierarchical Genetic Algorithm (GA) is implemented in a high-performance spectral finite element software package for identification of delaminations in laminated composite beams. In smart structural health monitoring, the number of delaminations (or any other modes of damage) as well as their locations and sizes are never completely known. Only the healthy structural configuration (mass, stiffness and damping matrices updated from previous phases of monitoring), sensor measurements and some information about the load environment are known. To handle such enormous complexity, a hierarchical GA is used to represent a heterogeneous population consisting of damaged structures with different numbers of delaminations, and their evolution process is used to identify the correct damage configuration in the structures under monitoring. We exploit this similarity with the evolution of heterogeneous populations of species in nature to develop an automated procedure that decides which possible damaged configuration might have produced the deviation in the measured signals. Computational efficiency of the identification task is demonstrated by considering a single delamination. The behavior of the fitness function in the GA, which is an important factor for fast convergence, is studied for single and multiple delaminations. Several advantages of the approach in terms of computational cost are discussed. Besides tackling various other types of damage configurations, further scope of research for the development of hybrid soft-computing modules is highlighted.
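
    For illustration, the sketch below runs a plain (non-hierarchical) GA that identifies a delamination parameterised by location and size; a toy analytic function stands in for the spectral finite element forward model, and all parameter ranges and GA settings are assumptions.

```python
# Toy sketch of GA-based damage identification (not the authors' spectral FE code):
# a delamination is parameterised by (location, size); a placeholder forward model
# stands in for the finite element solver, and the GA searches for the parameters
# whose simulated response best matches the "measured" one.
import numpy as np

rng = np.random.default_rng(1)

def forward_model(location, size, freqs=np.linspace(1, 50, 64)):
    """Placeholder response; in reality this would be the spectral FE simulation."""
    return np.sin(freqs * location) * np.exp(-size * freqs / 10.0)

true_params = (0.62, 0.08)                       # "unknown" delamination
measured = forward_model(*true_params) + 0.01 * rng.standard_normal(64)

def fitness(individual):
    loc, size = individual
    residual = measured - forward_model(loc, size)
    return -np.sum(residual ** 2)                # higher is better

pop = np.column_stack([rng.uniform(0, 1, 60), rng.uniform(0.01, 0.2, 60)])
for _ in range(80):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:20]]              # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(0, 20, 2)]
        child = np.where(rng.random(2) < 0.5, a, b)           # uniform crossover
        child = child + rng.normal(0, [0.02, 0.005])          # Gaussian mutation
        children.append(np.clip(child, [0, 0.01], [1, 0.2]))
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(f"identified location={best[0]:.3f}, size={best[1]:.3f} (true: {true_params})")
```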

  3. Dynamic subnanosecond time-of-flight detection for ultra-precise diffusion monitoring and optimization of biomarker preservation

    NASA Astrophysics Data System (ADS)

    Bauer, Daniel R.; Stevens, Benjamin; Taft, Jefferson; Chafin, David; Petre, Vinnie; Theiss, Abbey P.; Otter, Michael

    2014-03-01

    Recently, it has been demonstrated that the preservation of cancer biomarkers, such as phosphorylated protein epitopes, in formalin-fixed paraffin-embedded tissue is highly dependent on the localized concentration of the crosslinking agent. This study details a real-time diffusion monitoring system based on the acoustic time-of-flight (TOF) between pairs of 4 MHz focused transducers. Diffusion affects TOF because of the distinct acoustic velocities of formalin and interstitial fluid. Tissue is placed between the transducers and vertically translated to obtain TOF values at multiple locations with a spatial resolution of approximately 1 mm. Imaging is repeated for several hours until osmotic equilibrium is reached. A post-processing technique, analogous to digital acoustic interferometry, enables detection of subnanosecond TOF differences. Reference subtraction is used to compensate for environmental effects. Diffusion measurements with TOF monitoring of ex vivo human tonsil tissue are well correlated with a single exponential curve (R2>0.98) with a magnitude of up to 50 ns, depending on the tissue size (2-6 mm). The average exponential decay constants of the 2 and 6 mm diameter samples are 20 and 315 minutes, respectively, although times varied significantly throughout the tissue (σmax=174 min). This technique can precisely monitor diffusion progression and could be used to mitigate effects from tissue heterogeneity and intersample variability, enabling improved preservation of cancer biomarkers distinctly sensitive to degradation during preanalytical tissue processing.
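
    The sketch below illustrates the kind of post-processing described: estimating a sub-sample (sub-nanosecond) TOF shift between two received bursts by cross-correlation with parabolic peak interpolation. The sampling rate, burst shape and injected shift are invented.

```python
# Sketch of estimating a sub-sample time-of-flight shift between two received
# ultrasound bursts via cross-correlation with parabolic peak interpolation,
# in the spirit of the "digital acoustic interferometry" post-processing described.
import numpy as np

fs = 100e6                                   # 100 MS/s digitiser (assumed)
t = np.arange(0, 20e-6, 1 / fs)
burst = np.sin(2 * np.pi * 4e6 * t) * np.exp(-((t - 10e-6) / 2e-6) ** 2)

true_shift = 0.7e-9                          # 0.7 ns, i.e. a sub-sample delay
shifted = np.interp(t - true_shift, t, burst)

xcorr = np.correlate(shifted, burst, mode="full")
k = np.argmax(xcorr)
# parabolic interpolation around the correlation peak for sub-sample precision
y0, y1, y2 = xcorr[k - 1], xcorr[k], xcorr[k + 1]
delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
lag_samples = (k - (len(burst) - 1)) + delta
print(f"estimated shift: {lag_samples / fs * 1e9:.2f} ns (true: {true_shift * 1e9:.2f} ns)")
```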

  4. Event-related potentials dissociate facilitation and interference effects in the numerical Stroop paradigm.

    PubMed

    Szucs, Dénes; Soltész, Fruzsina

    2007-11-05

    In the numerical Stroop paradigm (NSP) participants compare simultaneously presented Arabic digits based on either their numerical or on their physical size dimension. Responses are faster when the numerical and size dimensions are congruent with each other (facilitation), and responses are slower when the numerical and size dimensions are incongruent with each other (interference). We aimed to find out whether facilitation and interference appears during the course of perceptual or response processing. To this end, facilitation and interference effects in the amplitude of event-related brain potentials (ERPs) were examined. The onset of motor preparation was determined by monitoring the lateralized readiness potential. In numerical comparison one facilitation effect was related to perceptual processing at the level of the magnitude representation. A second facilitation effect and interference effects appeared during response processing. In size comparison facilitation and interference appeared exclusively during response processing. In both tasks, ERP interference effects were probably related to contextual analysis and to the conflict monitoring and selection for action activity of the anterior cingulate cortex. The results demonstrate that facilitation and interference effects in the NSP appear during multiple stages of processing, and that they are related to different cognitive processes. Therefore these effects should be clearly separated in studies of the NSP. A model of the processes involved in the NSP is provided and implications for studies of the NSP are drawn.

  5. Multi-allergen Quantitation and the Impact of Thermal Treatment in Industry-Processed Baked Goods by ELISA and Liquid Chromatography-Tandem Mass Spectrometry.

    PubMed

    Parker, Christine H; Khuda, Sefat E; Pereira, Marion; Ross, Mark M; Fu, Tong-Jen; Fan, Xuebin; Wu, Yan; Williams, Kristina M; DeVries, Jonathan; Pulvermacher, Brian; Bedford, Binaifer; Zhang, Xi; Jackson, Lauren S

    2015-12-16

    Undeclared food allergens account for 30-40% of food recalls in the United States. Compliance with ingredient labeling regulations and the implementation of effective manufacturing allergen control plans require the use of reliable methods for allergen detection and quantitation in complex food products. The objectives of this work were to (1) produce industry-processed model foods incurred with egg, milk, and peanut allergens, (2) compare analytical method performance for allergen quantitation in thermally processed bakery products, and (3) determine the effects of thermal treatment on allergen detection. Control and allergen-incurred cereal bars and muffins were formulated in a pilot-scale industry processing facility. Quantitation of egg, milk, and peanut in incurred baked goods was compared at various processing stages using commercial enzyme-linked immunosorbent assay (ELISA) kits and a novel multi-allergen liquid chromatography (LC)-tandem mass spectrometry (MS/MS) multiple-reaction monitoring (MRM) method. Thermal processing was determined to negatively affect the recovery and quantitation of egg, milk, and peanut to different extents depending on the allergen, matrix, and analytical test method. The Morinaga ELISA and LC-MS/MS quantitative methods reported the highest recovery across all monitored allergens, whereas the ELISA Systems, Neogen BioKits, Neogen Veratox, and R-Biopharm ELISA Kits underperformed in the determination of allergen content of industry-processed bakery products.

  6. 40 CFR 75.16 - Special provisions for monitoring emissions from common, bypass, and multiple stacks for SO2...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... maintain an SO2 continuous emission monitoring system and flow monitoring system in the duct to the common... emission monitoring system and flow monitoring system in the common stack and combine emissions for the... continuous emission monitoring system and flow monitoring system in the duct to the common stack from each...

  7. Role of data aggregation in biosurveillance detection strategies with applications from ESSENCE.

    PubMed

    Burkom, Howard S; Elbert, Y; Feldman, A; Lin, J

    2004-09-24

    Syndromic surveillance systems are used to monitor daily electronic data streams for anomalous counts of features of varying specificity. The monitored quantities might be counts of clinical diagnoses, sales of over-the-counter influenza remedies, school absenteeism among a given age group, and so forth. Basic data-aggregation decisions for these systems include determining which records to count and how to group them in space and time. This paper discusses the application of spatial and temporal data-aggregation strategies for multiple data streams to alerting algorithms appropriate to the surveillance region and public health threat of interest. Such a strategy was applied and evaluated for a complex, authentic, multisource, multiregion environment, including >2 years of data records from a system-evaluation exercise for the Defense Advanced Research Project Agency (DARPA). Multivariate and multiple univariate statistical process control methods were adapted and applied to the DARPA data collection. Comparative parametric analyses based on temporal aggregation were used to optimize the performance of these algorithms for timely detection of a set of outbreaks identified in the data by a team of epidemiologists. The sensitivity and timeliness of the most promising detection methods were tested at empirically calculated thresholds corresponding to multiple practical false-alert rates. Even at the strictest false-alert rate, all but one of the outbreaks were detected by the best method, and the best methods achieved a 1-day median time before alert over the set of test outbreaks. These results indicate that a biosurveillance system can provide a substantial alerting-timeliness advantage over traditional public health monitoring for certain outbreaks. Comparative analyses of individual algorithm results indicate further achievable improvement in sensitivity and specificity.
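
    As one example of the "multiple univariate" statistical process control methods mentioned, the sketch below applies an EWMA control chart to a simulated daily syndromic count series with an injected outbreak; the smoothing weight, training period and threshold are illustrative choices, not those used in the DARPA evaluation.

```python
# Sketch of a univariate EWMA control chart on a daily syndromic count series.
# Parameters (lambda, training length, 3-sigma limit) are illustrative only.
import numpy as np

rng = np.random.default_rng(7)
counts = rng.poisson(lam=30, size=120).astype(float)       # daily visit counts
counts[100:110] += np.linspace(5, 25, 10)                   # injected outbreak

lam = 0.3                                                   # EWMA smoothing weight
mu, sigma = counts[:60].mean(), counts[:60].std(ddof=1)     # training period
limit = mu + 3 * sigma * np.sqrt(lam / (2 - lam))           # steady-state 3-sigma limit

ewma = mu
for day, count in enumerate(counts):
    ewma = lam * count + (1 - lam) * ewma
    if day >= 60 and ewma > limit:
        print(f"day {day}: EWMA={ewma:.1f} exceeds limit {limit:.1f} -> alert")
        break
```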

  8. Software for Remote Monitoring of Space-Station Payloads

    NASA Technical Reports Server (NTRS)

    Schneider, Michelle; Lippincott, Jeff; Chubb, Steve; Whitaker, Jimmy; Gillis, Robert; Sellers, Donna; Sims, Chris; Rice, James

    2003-01-01

    Telescience Resource Kit (TReK) is a suite of application programs that enable geographically dispersed users to monitor scientific payloads aboard the International Space Station (ISS). TReK provides local ground support services that can simultaneously receive, process, record, playback, and display data from multiple sources. TReK also provides interfaces to use the remote services provided by the Payload Operations Integration Center which manages all ISS payloads. An application programming interface (API) allows for payload users to gain access to all data processed by TReK and allows payload-specific tools and programs to be built or integrated with TReK. Used in conjunction with other ISS-provided tools, TReK provides the ability to integrate payloads with the operational ground system early in the lifecycle. This reduces the potential for operational problems and provides "cradle-to-grave" end-to-end operations. TReK contains user guides and self-paced tutorials along with training applications to allow the user to become familiar with the system.

  9. Visualization of multiple influences on ocellar flight control in giant honeybees with the data-mining tool Viscovery SOMine.

    PubMed

    Kastberger, G; Kranner, G

    2000-02-01

    Viscovery SOMine is a software tool for advanced analysis and monitoring of numerical data sets. It was developed for professional use in business, industry, and science and to support dependency analysis, deviation detection, unsupervised clustering, nonlinear regression, data association, pattern recognition, and animated monitoring. Based on the concept of self-organizing maps (SOMs), it employs a robust variant of unsupervised neural networks--namely, Kohonen's Batch-SOM, which is further enhanced with a new scaling technique for speeding up the learning process. This tool provides a powerful means by which to analyze complex data sets without prior statistical knowledge. The data representation contained in the trained SOM is systematically converted to be used in a spectrum of visualization techniques, such as evaluating dependencies between components, investigating geometric properties of the data distribution, searching for clusters, or monitoring new data. We have used this software tool to analyze and visualize multiple influences of the ocellar system on free-flight behavior in giant honeybees. Occlusion of ocelli will affect orienting reactivities in relation to flight target, level of disturbance, and position of the bee in the flight chamber; it will induce phototaxis and make orienting imprecise and dependent on motivational settings. Ocelli permit the adjustment of orienting strategies to environmental demands by enforcing abilities such as centering or flight kinetics and by providing independent control of posture and flight course.
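
    The sketch below is a from-scratch batch-SOM in the spirit of the Kohonen Batch-SOM underlying Viscovery SOMine (it is not the product's algorithm), applied to synthetic three-dimensional feature vectors; the grid size, neighbourhood schedule and data are assumptions.

```python
# Minimal batch-SOM sketch (a from-scratch stand-in, not Viscovery SOMine's code),
# clustering synthetic behavioural feature vectors on an 8x8 map.
import numpy as np

rng = np.random.default_rng(3)
data = np.vstack([rng.normal(loc, 0.15, size=(100, 3))
                  for loc in ([0.2, 0.2, 0.8], [0.8, 0.5, 0.2], [0.5, 0.9, 0.5])])

rows, cols = 8, 8
grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
weights = rng.random((rows * cols, data.shape[1]))

for radius in np.linspace(4.0, 0.5, 20):        # shrinking neighbourhood radius
    # 1) assign every sample to its best-matching unit (BMU)
    dists = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    bmu = dists.argmin(axis=1)
    # 2) batch update: each unit becomes the neighbourhood-weighted mean of the data
    grid_d2 = ((grid[:, None, :] - grid[bmu][None, :, :]) ** 2).sum(axis=2)
    h = np.exp(-grid_d2 / (2 * radius ** 2))    # (units, samples) neighbourhood weights
    weights = (h @ data) / h.sum(axis=1, keepdims=True)

print("samples per map unit:", np.bincount(bmu, minlength=rows * cols))
```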

  10. Monitoring and analysis of data from complex systems

    NASA Technical Reports Server (NTRS)

    Dollman, Thomas; Webster, Kenneth

    1991-01-01

    Some of the methods, systems, and prototypes that have been tested for monitoring and analyzing the data from several spacecraft and vehicles at the Marshall Space Flight Center are introduced. For the Huntsville Operations Support Center (HOSC) infrastructure, the Marshall Integrated Support System (MISS) provides a migration path to the state-of-the-art workstation environment. Its modular design makes it possible to implement the system in stages on multiple platforms without the need for all components to be in place at once. The MISS provides a flexible, user-friendly environment for monitoring and controlling orbital payloads. In addition, new capabilities and technology may be incorporated into MISS with greater ease. The use of information systems technology in advanced prototype phases, as adjuncts to mainline activities, is used to evaluate new computational techniques for monitoring and analysis of complex systems. Much of the software described (specifically, HSTORESIS (Hubble Space Telescope Operational Readiness Expert Safemode Investigation System), DRS (Device Reasoning Shell), DART (Design Alternatives Rational Tool), elements of the DRA (Document Retrieval Assistant), and software for the PPS (Peripheral Processing System) and the HSPP (High-Speed Peripheral Processor)) is available with supporting documentation, and may be applicable to other system monitoring and analysis applications.

  11. Great Basin Integrated Landscape Monitoring Pilot Summary Report

    USGS Publications Warehouse

    Finn, Sean P.; Kitchell, Kate; Baer, Lori Anne; Bedford, David R.; Brooks, Matthew L.; Flint, Alan L.; Flint, Lorraine E.; Matchett, J.R.; Mathie, Amy; Miller, David M.; Pilliod, David S.; Torregrosa, Alicia; Woodward, Andrea

    2010-01-01

    The Great Basin Integrated Landscape Monitoring Pilot project (GBILM) was one of four regional pilots to implement the U.S. Geological Survey (USGS) Science Thrust on Integrated Landscape Monitoring (ILM), whose goal was to observe, understand, and predict landscape change and its implications for natural resources at multiple spatial and temporal scales and address priority natural resource management and policy issues. The Great Basin is undergoing rapid environmental change stemming from interactions among global climate trends, increasing human populations, expanding and accelerating land and water uses, invasive species, and altered fire regimes. GBILM tested concepts and developed tools to store and analyze monitoring data, understand change at multiple scales, and forecast landscape change. The GBILM endeavored to develop and test a landscape-level monitoring approach in the Great Basin that integrates USGS disciplines, addresses priority management questions, catalogs and uses existing monitoring data, evaluates change at multiple scales, and contributes to development of regional monitoring strategies. GBILM functioned as an integrative team from 2005 to 2010, producing more than 35 science and data management products that addressed pressing ecosystem drivers and resource management agency needs in the region. This report summarizes the approaches and methods of this interdisciplinary effort, identifies and describes the products generated, and provides lessons learned during the project.

  12. Software for Storage and Management of Microclimatic Data for Preventive Conservation of Cultural Heritage

    PubMed Central

    Fernández-Navajas, Ángel; Merello, Paloma; Beltrán, Pedro; García-Diego, Fernando-Juan

    2013-01-01

    Cultural Heritage preventive conservation requires the monitoring of the parameters involved in the process of deterioration of artworks. Thus, both long-term monitoring of the environmental parameters as well as further analysis of the recorded data are necessary. The long-term monitoring at frequencies higher than 1 data point/day generates large volumes of data that are difficult to store, manage and analyze. This paper presents software which uses a free, open-source database engine that allows managing and interacting with huge amounts of data from environmental monitoring of cultural heritage sites. It is simple to operate and offers multiple capabilities, such as detection of anomalous data, inquiries, graph plotting and mean trajectories. It is also possible to export the data to a spreadsheet for analyses with more advanced statistical methods (principal component analysis, ANOVA, linear regression, etc.). This paper also deals with a practical application developed for the Renaissance frescoes of the Cathedral of Valencia. The results suggest infiltration of rainwater in the vault and weekly relative humidity changes related to the religious service schedules. PMID:23447005

  13. Method and apparatus of prefetching streams of varying prefetch depth

    DOEpatents

    Gara, Alan [Mount Kisco, NY]; Ohmacht, Martin [Yorktown Heights, NY]; Salapura, Valentina [Chappaqua, NY]; Sugavanam, Krishnan [Mahopac, NY]; Hoenicke, Dirk [Seebruck-Seeon, DE]

    2012-01-24

    A method and apparatus for prefetching streams of varying prefetch depth dynamically change the depth of prefetching so that both the number of concurrent streams and the hit rate of a single stream are optimized. In one aspect, the method and apparatus monitor a plurality of load requests from a processing unit for data in a prefetch buffer, determine an access pattern associated with the plurality of load requests, and adjust the prefetch depth according to the access pattern.

  14. Localization of Gunfire from Multiple Shooters (ARO Research Topic 5.2, Information Processing and Fusion; STIR Program)

    DTIC Science & Technology

    2016-03-03

    for each shot, as well as "raw" data that includes time-of-arrival (TOA) and direction-of-arrival (DOA) of the muzzle blast (MB) produced by the weapon and the shock wave (SW) produced by the supersonic bullet. The localization accuracy is improved... Keywords: angle of arrival, muzzle blast, shock wave, bullet deceleration, fusion.

  15. Multiple-Targeted Graphene-based Nanocarrier for Intracellular Imaging of mRNAs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ying; Li, Zhaohui; Liu, Misha

    Simultaneous detection and imaging of multiple intracellular messenger RNAs (mRNAs) hold great significance for early cancer diagnostics and preventive medicine development. Herein, we propose a multiple-targeted graphene oxide (GO) nanocarrier that can simultaneously detect and image different types of mRNAs in living cells. First, in vitro detection of multiple targets has been realized successfully based on the multiple-targeted GO nanocarrier, with a linear relationship ranging from 3 nM to 200 nM and sensitive detection limits of 1.84 nM for manganese superoxide dismutase (Mn-SOD) mRNA and 2.45 nM for β-actin mRNA. Additionally, this nanosensing platform, composed of fluorescently labeled single-stranded DNA probes and the GO nanocarrier, can identify Mn-SOD mRNA and endogenous β-actin mRNA in living cancer cells, showing rapid response, high specificity, nuclease stability, and good biocompatibility during cell imaging. Third, changes in mRNA expression levels in living cells before and after drug treatment can be monitored successfully. By using multiple ssDNA probes and the GO nanocarrier as the cellular delivery cargo, the proposed simultaneous multiple-targeted sensing platform has great potential as a powerful tool for studying intracellular trafficking processes, from basic research to clinical diagnosis.

  16. Self-Monitoring Checklists for Inquiry Problem-Solving: Functional Problem-Solving Methods for Students with Intellectual Disability

    ERIC Educational Resources Information Center

    Miller, Bridget; Taber-Doughty, Teresa

    2014-01-01

    Three students with mild to moderate intellectual and multiple disability, enrolled in a self-contained functional curriculum class were taught to use a self-monitoring checklist and science notebook to increase independence in inquiry problem-solving skills. Using a single-subject multiple-probe design, all students acquired inquiry…

  17. Applications of the BIOPHYS Algorithm for Physically-Based Retrieval of Biophysical, Structural and Forest Disturbance Information

    NASA Technical Reports Server (NTRS)

    Peddle, Derek R.; Huemmrich, K. Fred; Hall, Forrest G.; Masek, Jeffrey G.; Soenen, Scott A.; Jackson, Chris D.

    2011-01-01

    Canopy reflectance model inversion using look-up table approaches provides powerful and flexible options for deriving improved forest biophysical structural information (BSI) compared with traditional statistical empirical methods. The BIOPHYS algorithm is an improved, physically-based inversion approach for deriving BSI for independent use and validation and for monitoring, inventory and quantifying forest disturbance as well as input to ecosystem, climate and carbon models. Based on the multiple-forward mode (MFM) inversion approach, BIOPHYS results were summarized from different studies (Minnesota/NASA COVER; Virginia/LEDAPS; Saskatchewan/BOREAS), sensors (airborne MMR; Landsat; MODIS) and models (GeoSail; GOMS). Applications output included forest density, height, crown dimension, branch and green leaf area, canopy cover, disturbance estimates based on multi-temporal chronosequences, and structural change following recovery from forest fires over the last century. Good correspondences with validation field data were obtained. Integrated analyses of multiple solar and view angle imagery further improved retrievals compared with single pass data. Quantifying ecosystem dynamics such as the area and percent of forest disturbance, early regrowth and succession provide essential inputs to process-driven models of carbon flux. BIOPHYS is well suited for large-area, multi-temporal applications involving multiple image sets and mosaics for assessing vegetation disturbance and quantifying biophysical structural dynamics and change. It is also suitable for integration with forest inventory, monitoring, updating, and other programs.
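
    For illustration, the sketch below shows the core of a look-up-table inversion: a toy two-parameter "canopy model" stands in for GeoSail/GOMS, the model is run forward over a parameter grid, and an observed spectrum is matched to its nearest LUT entry; all coefficients are invented.

```python
# Sketch of look-up-table (LUT) inversion in the spirit of multiple-forward-mode
# approaches: run a toy canopy model forward over a parameter grid, then match an
# observed reflectance to its nearest LUT entry. The model is a placeholder.
import numpy as np

def toy_canopy_model(lai, cover):
    """Placeholder for GeoSail/GOMS: returns [red, NIR] reflectance."""
    red = 0.25 * (1 - cover) + 0.04 * cover * np.exp(-0.4 * lai)
    nir = 0.20 * (1 - cover) + cover * (0.25 + 0.08 * lai)
    return np.array([red, nir])

# Build the LUT by running the model forward over a parameter grid
lai_grid = np.linspace(0.5, 6.0, 56)
cover_grid = np.linspace(0.1, 1.0, 46)
params = np.array([(l, c) for l in lai_grid for c in cover_grid])
lut = np.array([toy_canopy_model(l, c) for l, c in params])

# Invert an "observed" spectrum by nearest-neighbour search in spectral space
observed = toy_canopy_model(3.2, 0.7) + np.random.normal(0, 0.005, 2)
best = np.argmin(((lut - observed) ** 2).sum(axis=1))
lai_hat, cover_hat = params[best]
print(f"retrieved LAI={lai_hat:.2f}, canopy cover={cover_hat:.2f} (true: 3.2, 0.70)")
```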

  18. Global change and terrestrial plant community dynamics

    DOE PAGES

    Franklin, Janet; Serra-Diaz, Josep M.; Syphard, Alexandra D.; ...

    2016-02-29

    Anthropogenic drivers of global change include rising atmospheric concentrations of carbon dioxide and other greenhouse gasses and resulting changes in the climate, as well as nitrogen deposition, biotic invasions, altered disturbance regimes, and land-use change. Predicting the effects of global change on terrestrial plant communities is crucial because of the ecosystem services vegetation provides, from climate regulation to forest products. In this article, we present a framework for detecting vegetation changes and attributing them to global change drivers that incorporates multiple lines of evidence from spatially extensive monitoring networks, distributed experiments, remotely sensed data, and historical records. Based on a literature review, we summarize observed changes and then describe modeling tools that can forecast the impacts of multiple drivers on plant communities in an era of rapid change. Observed responses to changes in temperature, water, nutrients, land use, and disturbance show strong sensitivity of ecosystem productivity and plant population dynamics to water balance and long-lasting effects of disturbance on plant community dynamics. Persistent effects of land-use change and human-altered fire regimes on vegetation can overshadow or interact with climate change impacts. Models forecasting plant community responses to global change incorporate shifting ecological niches, population dynamics, species interactions, spatially explicit disturbance, ecosystem processes, and plant functional responses. Lastly, monitoring, experiments, and models evaluating multiple change drivers are needed to detect and predict vegetation changes in response to 21st century global change.

  19. Global change and terrestrial plant community dynamics

    PubMed Central

    Franklin, Janet; Serra-Diaz, Josep M.; Syphard, Alexandra D.; Regan, Helen M.

    2016-01-01

    Anthropogenic drivers of global change include rising atmospheric concentrations of carbon dioxide and other greenhouse gasses and resulting changes in the climate, as well as nitrogen deposition, biotic invasions, altered disturbance regimes, and land-use change. Predicting the effects of global change on terrestrial plant communities is crucial because of the ecosystem services vegetation provides, from climate regulation to forest products. In this paper, we present a framework for detecting vegetation changes and attributing them to global change drivers that incorporates multiple lines of evidence from spatially extensive monitoring networks, distributed experiments, remotely sensed data, and historical records. Based on a literature review, we summarize observed changes and then describe modeling tools that can forecast the impacts of multiple drivers on plant communities in an era of rapid change. Observed responses to changes in temperature, water, nutrients, land use, and disturbance show strong sensitivity of ecosystem productivity and plant population dynamics to water balance and long-lasting effects of disturbance on plant community dynamics. Persistent effects of land-use change and human-altered fire regimes on vegetation can overshadow or interact with climate change impacts. Models forecasting plant community responses to global change incorporate shifting ecological niches, population dynamics, species interactions, spatially explicit disturbance, ecosystem processes, and plant functional responses. Monitoring, experiments, and models evaluating multiple change drivers are needed to detect and predict vegetation changes in response to 21st century global change. PMID:26929338

  1. Multiple Sensing Application on Wireless Sensor Network Simulation using NS3

    NASA Astrophysics Data System (ADS)

    Kurniawan, I. F.; Bisma, R.

    2018-01-01

    Hardware enhancements make it possible to install various sensor devices on a single monitoring node, which enables users to acquire multiple data streams simultaneously. Constructing a multiple-sensing application in NS3 is a challenging task, since a number of aspects such as wireless communication, packet transmission patterns, and the energy model must be taken into account. Of the numerous types of monitoring data available, this study considers only two: periodic and event-based data. Periodic sensing generates monitoring data at a configured interval, while event-based sensing transmits data when a predetermined condition is met. This study therefore attempts to cover the aspects mentioned above in NS3. Several simulations are performed with different numbers of nodes using an arbitrary communication scheme.

  2. Monitoring Natural World Heritage Sites: optimization of the monitoring system in Bogda with GIS-based multi-criteria decision analysis.

    PubMed

    Wang, Zhaoguo; Du, Xishihui

    2016-07-01

    Natural World Heritage Sites (NWHSs) are invaluable treasures due to the uniqueness of each site. Proper monitoring and management can guarantee their protection from multiple threats. In this study, geographic information system (GIS)-based multi-criteria decision analysis (GIS-MCDA) was used to assess criteria layers acquired from the data available in the literature. A conceptual model for determining the priority area for monitoring in Bogda, China, was created based on outstanding universal values (OUV) and expert knowledge. Weights were assigned to each layer using the analytic hierarchy process (AHP) based on group decisions, encompassing three experts: a heritage site expert, a forest ranger, and a heritage site manager. Subsequently, evaluation layers and constraint layers were used to generate a priority map and to determine the feasibility of monitoring in Bogda. Finally, a monitoring suitability map of Bogda was obtained by referencing the priority and feasibility maps. The high-priority monitoring area is located in the montane forest belt, which exhibits high biodiversity and is the main tourist area of Bogda. The northern buffer zone of Bogda comprises the concentrated feasible monitoring areas, and the area closest to roads and monitoring facilities is highly feasible for NWHS monitoring. The suitability of an area in terms of monitoring is largely determined by the monitoring priority in that particular area. The majority of planned monitoring facilities are well distributed in both suitable and less suitable areas. The analysis results indicate that the protection of Bogda will be more scientifically grounded thanks to the effective, comprehensively planned monitoring system proposed in the declaration text of Xinjiang Tianshan, the essential file submitted to the World Heritage Centre for inscription as a NWHS.
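
    A minimal sketch of the AHP step: weights are derived from a pairwise comparison matrix as its principal eigenvector and checked with the consistency ratio. The criteria and comparison values below are hypothetical, not those elicited from the three Bogda experts.

```python
# Sketch of AHP weight derivation: principal eigenvector of a pairwise
# comparison matrix plus Saaty's consistency ratio. Values are hypothetical.
import numpy as np

# criteria: biodiversity, tourist pressure, accessibility (invented for illustration)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalised priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print("weights:", np.round(weights, 3), " consistency ratio:", round(ci / ri, 3))
```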

  3. Application of open-path Fourier transform infrared spectroscopy for atmospheric monitoring of a CO2 back-production experiment at the Ketzin pilot site (Germany).

    PubMed

    Sauer, Uta; Borsdorf, H; Dietrich, P; Liebscher, A; Möller, I; Martens, S; Möller, F; Schlömer, S; Schütze, C

    2018-02-03

    During a controlled "back-production experiment" in October 2014 at the Ketzin pilot site, formerly injected CO 2 was retrieved from the storage formation and directly released to the atmosphere via a vent-off stack. Open-path Fourier transform infrared (OP FTIR) spectrometers, on-site meteorological parameter acquisition systems, and distributed CO 2 point sensors monitored gas dispersion processes in the near-surface part of the atmospheric boundary layer. The test site provides a complex and challenging mosaic-like surface setting for atmospheric monitoring which can also be found at other storage sites. The main aims of the atmospheric monitoring of this experiment were (1) to quantify temporal and spatial variations in atmospheric CO 2 concentrations around the emitting vent-off stack and (2) to test if and how atmospheric monitoring can cope with typical environmental and operational challenges. A low environmental risk was encountered during the whole CO 2 back-production experiment. The study confirms that turbulent wind conditions favor atmospheric mixing processes and are responsible for rapid dilution of the released CO 2 leading to decreased detectability at all sensors. In contrast, calm and extremely stable wind conditions (especially occurring during the night) caused an accumulation of gases in the near-ground atmospheric layer with the highest amplitudes in measured gas concentration. As an important benefit of OP FTIR spectroscopic measurements and their ability to detect multiple gas species simultaneously, emission sources could be identified to a much higher certainty. Moreover, even simulation models using simplified assumptions help to find suitable monitoring network designs and support data analysis for certain wind conditions in such a complex environment.

  4. A Hands-on Physical Analog Demonstration of Real-Time Volcano Deformation Monitoring with GNSS/GPS

    NASA Astrophysics Data System (ADS)

    Jones, J. R.; Schobelock, J.; Nguyen, T. T.; Rajaonarison, T. A.; Malloy, S.; Njinju, E. A.; Guerra, L.; Stamps, D. S.; Glesener, G. B.

    2017-12-01

    Teaching about volcano deformation and how scientists study these processes using GNSS/GPS may present some challenges, since volcanoes and/or GNSS/GPS equipment are not readily accessible to most teachers. Educators and curriculum materials specialists have developed and shared a number of activities and demonstrations to help students visualize volcanic processes and ways scientists use GNSS/GPS in their research. From resources provided by MEDL (the Modeling and Educational Demonstrations Laboratory) in the Department of Geosciences at Virginia Tech, we combined multiple materials and techniques from these previous works to produce a hands-on physical analog model from which students can learn about GNSS/GPS studies of volcano deformation. The model functions as both a qualitative and quantitative learning tool with good analogical affordances. In our presentation, we will describe multiple ways of teaching with the model, what kinds of materials can be used to build it, and ways we think the model could be enhanced with the addition of Vernier sensors for data collection.

  5. A non-contact method based on multiple signal classification algorithm to reduce the measurement time for accurately heart rate detection

    NASA Astrophysics Data System (ADS)

    Bechet, P.; Mitran, R.; Munteanu, M.

    2013-08-01

    Non-contact methods for the assessment of vital signs are of great interest to specialists due to the benefits obtained in both medical and special applications, such as those for surveillance, monitoring, and search and rescue. This paper investigates the possibility of implementing a digital processing algorithm based on MUSIC (Multiple Signal Classification) parametric spectral estimation in order to reduce the observation time needed to accurately measure the heart rate. It demonstrates that, by properly dimensioning the signal subspace, the MUSIC algorithm can be optimized to accurately assess the heart rate during an 8-28 s time interval. The performance of the processing algorithm was validated by minimizing the mean heart rate error in simultaneous comparative measurements on several subjects. To calculate the error, the reference heart rate was measured using a classic contact-based measurement system.
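
    The sketch below illustrates MUSIC-based frequency estimation on a synthetic pulse-like signal: eigendecompose a sample covariance matrix, take the noise subspace, and scan a pseudospectrum over candidate heart-rate frequencies. The sampling rate, subspace dimension and search range are illustrative assumptions, not the paper's settings.

```python
# Rough sketch of MUSIC-based heart-rate estimation on a synthetic noisy signal.
import numpy as np

fs, duration = 20.0, 15.0                          # Hz, seconds (short observation)
t = np.arange(0, duration, 1 / fs)
hr_hz = 1.2                                        # 72 beats per minute
x = np.sin(2 * np.pi * hr_hz * t) + 0.8 * np.random.randn(t.size)

m = 40                                             # snapshot (covariance) dimension
snapshots = np.array([x[i:i + m] for i in range(len(x) - m)])
R = snapshots.T @ snapshots / snapshots.shape[0]   # sample covariance matrix

eigvals, eigvecs = np.linalg.eigh(R)               # eigenvalues in ascending order
p = 2                                              # signal subspace dim (one real sinusoid)
noise_subspace = eigvecs[:, : m - p]

freqs = np.linspace(0.6, 3.0, 500)                 # 36-180 bpm search range
pseudo = np.empty_like(freqs)
for i, f in enumerate(freqs):
    steering = np.exp(-2j * np.pi * f / fs * np.arange(m))
    proj = noise_subspace.conj().T @ steering
    pseudo[i] = 1.0 / np.real(proj.conj() @ proj)  # MUSIC pseudospectrum

print(f"estimated heart rate: {freqs[np.argmax(pseudo)] * 60:.1f} bpm")
```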

  6. Detecting abandoned objects using interacting multiple models

    NASA Astrophysics Data System (ADS)

    Becker, Stefan; Münch, David; Kieritz, Hilke; Hübner, Wolfgang; Arens, Michael

    2015-10-01

    In recent years, the wide use of video surveillance systems has caused an enormous increase in the amount of data that has to be stored, monitored, and processed. As a consequence, it is crucial to support human operators with automated surveillance applications. Towards this end an intelligent video analysis module for real-time alerting in case of abandoned objects in public spaces is proposed. The overall processing pipeline consists of two major parts. First, person motion is modeled using an Interacting Multiple Model (IMM) filter. The IMM filter estimates the state of a person according to a finite-state, discrete-time Markov chain. Second, the location of persons that stay at a fixed position defines a region of interest, in which a nonparametric background model with dynamic per-pixel state variables identifies abandoned objects. In case of a detected abandoned object, an alarm event is triggered. The effectiveness of the proposed system is evaluated on the PETS 2006 dataset and the i-Lids dataset, both reflecting prototypical surveillance scenarios.

  7. System Safety in an IT Service Organization

    NASA Astrophysics Data System (ADS)

    Parsons, Mike; Scutt, Simon

    Within Logica UK, over 30 IT service projects are considered safety-related. These include operational IT services for airports, railway infrastructure asset management, nationwide radiation monitoring and hospital medical records services. A recent internal audit examined the processes and documents used to manage system safety on these services and made a series of recommendations for improvement. This paper looks at the changes and the challenges to introducing them, especially where the service is provided by multiple units supporting both safety and non-safety related services from multiple locations around the world. The recommendations include improvements to service agreements, improved process definitions, routine safety assessment of changes, enhanced call logging, improved staff competency and training, and increased safety awareness. Progress is reported as of today, together with a road map for implementation of the improvements to the service safety management system. A proposal for service assurance levels (SALs) is discussed as a way forward to cover the wide variety of services and associated safety risks.

  8. Improving medical records filing in a municipal hospital in Ghana.

    PubMed

    Teviu, E A A; Aikins, M; Abdulai, T I; Sackey, S; Boni, P; Afari, E; Wurapa, F

    2012-09-01

    Medical records are kept in the interest of both the patient and clinician. Proper filing of patients' medical records ensures easy retrieval and contributes to decreased patient waiting time at the hospital and continuity of care. This paper reports on an intervention study to address the issue of misfiling and multiple patient folders in a health facility. Intervention study. Municipal Hospital, Goaso, Asunafo North District, Brong Ahafo Region, Ghana. Methods employed for data collection were records review, direct observation and tracking of folders. Interventions instituted were staff durbars, advocacy and communication, consultations, in-service trainings, procurement and monitoring. Factors contributing to issuance of multiple folders and misfiling were determined. Proportion of multiple folders was estimated. Results revealed direct and indirect factors contributing to issuance of multiple patient folders and misfiling. Interventions and monitoring reduced the acquisition of numerous medical folders per patient and misfiling. After the intervention, there was significant reduction in the use of multiple folders (i.e., overall 97% reduction) and a high usage of single patient medical folders (i.e., 99%). In conclusion, a defined medical records filing system with adequate training, logistics and regular monitoring and supervision minimises issuance of multiple folders and misfiling.

  9. Bed occupancy monitoring: data processing and clinician user interface design.

    PubMed

    Pouliot, Melanie; Joshi, Vilas; Goubran, Rafik; Knoefel, Frank

    2012-01-01

    Unobtrusive and continuous monitoring of patients, especially at their place of residence, is becoming a significant part of the healthcare model. A variety of sensors are being used to monitor different patient conditions. Bed occupancy monitoring provides clinicians a quantitative measure of bed entry/exit patterns and may provide information relating to sleep quality. This paper presents a bed occupancy monitoring system using a bed pressure mat sensor. A clinical trial was performed involving 8 patients to collect bed occupancy data. The trial period for each patient ranged from 5 to 10 weeks. This data was analyzed using a participatory design methodology incorporating clinician feedback to obtain bed occupancy parameters. The parameters extracted include the number of bed exits per night, the bed exit weekly average (including minimum and maximum), the time of day of a particular exit, and the amount of uninterrupted bed occupancy per night. The design of a clinical user interface plays a significant role in the acceptance of such patient monitoring systems by clinicians. The clinician user interface proposed in this paper was designed to be intuitive, easy to navigate and not cause information overload. An iterative design methodology was used for the interface design. The interface design is extendible to incorporate data from multiple sensors. This allows the interface to be part of a comprehensive remote patient monitoring system.
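
    The bed-occupancy parameters listed above can be illustrated with a short sketch; the binary occupancy series, sampling rate, and the minimum gap treated as a true bed exit are assumptions for illustration, not details taken from the paper.

        import numpy as np

        def bed_exit_summary(occupied, fs_hz=1.0, min_exit_s=60):
            """Summarize a binary bed-occupancy series (1 = in bed, 0 = out of bed).

            occupied   : array of 0/1 samples for one night
            fs_hz      : sampling rate of the pressure-mat signal (assumed)
            min_exit_s : ignore out-of-bed gaps shorter than this (e.g. rolling over)
            """
            occupied = np.asarray(occupied).astype(int)
            # Transitions 1 -> 0 mark exits, 0 -> 1 mark re-entries.
            d = np.diff(occupied)
            exits = np.where(d == -1)[0] + 1
            entries = np.where(d == 1)[0] + 1
            n_exits = 0
            for e in exits:
                nxt = entries[entries > e]
                gap = (nxt[0] - e) if nxt.size else (occupied.size - e)
                if gap / fs_hz >= min_exit_s:
                    n_exits += 1
            # Longest uninterrupted occupancy, in hours.
            runs, run = [], 0
            for v in occupied:
                run = run + 1 if v else 0
                runs.append(run)
            longest_h = max(runs) / fs_hz / 3600.0
            return {"bed_exits": n_exits, "longest_occupancy_h": round(longest_h, 2)}

        # Example: one simulated night at 1 Hz with a single 5-minute bathroom break.
        night = [1] * 3 * 3600 + [0] * 300 + [1] * 4 * 3600
        print(bed_exit_summary(night))   # {'bed_exits': 1, 'longest_occupancy_h': 4.0}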

  10. Satellite mapping of Nile Delta coastal changes

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Taylor, P. T.; Roark, J. H.

    1989-01-01

    Multitemporal, multispectral scanner (MSS) Landsat data have been used to monitor erosion and sedimentation along the Rosetta Promontory of the Nile Delta. These processes have accelerated significantly since the completion of the Aswan High Dam in 1964. Digital differencing of four MSS data sets, using standard algorithms, shows that changes observed over a single-year period generally occur as strings of single mixed pixels along the coast. Therefore, these can only be used qualitatively to indicate areas where changes occur. Areas of change recorded over a multi-year period are generally larger and thus identified by clusters of pixels; this reduces errors introduced by mixed pixels. Satellites provide a synoptic perspective utilizing data acquired at frequent time intervals. This permits multiple-year monitoring of delta evolution on a regional scale.
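
    A minimal sketch of the kind of digital differencing and cluster filtering described above is given here; the standard-deviation threshold and the 4-neighbour cluster test are illustrative assumptions, not the authors' standard algorithms, and both scenes are assumed to be co-registered.

        import numpy as np

        def change_mask(band_t1, band_t2, k=2.0):
            """Flag pixels whose brightness difference between two co-registered
            scenes exceeds k standard deviations of the scene-wide difference."""
            d = band_t2.astype(float) - band_t1.astype(float)
            return np.abs(d - d.mean()) > k * d.std()

        def keep_clusters(mask):
            """Suppress isolated flagged pixels (likely mixed pixels along the coast)
            by keeping only pixels that have at least one flagged 4-neighbour."""
            m = mask.astype(int)
            neighbours = (np.roll(m, 1, 0) + np.roll(m, -1, 0) +
                          np.roll(m, 1, 1) + np.roll(m, -1, 1))
            return mask & (neighbours > 0)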

  11. Bone Health Monitoring in Astronauts: Recommended Use of Quantitative Computed Tomography [QCT] for Clinical and Operational Decisions

    NASA Technical Reports Server (NTRS)

    Sibonga, J. D.; Truskowski, P.

    2010-01-01

    This slide presentation reviews the concern that astronauts on long-duration flights might have a greater risk of bone fracture as they age than the general population. A panel of experts was convened to review the information and recommend mechanisms to monitor the health of bones in astronauts. The use of Quantitative Computed Tomography (QCT) scans for risk surveillance to detect the clinical trigger and to inform countermeasure evaluation is reviewed. An added benefit of QCT is that it facilitates an individualized estimation of bone strength by Finite Element Modeling (FEM), which can inform approaches for bone rehabilitation. The use of FEM is reviewed as a process that arrives at a composite number to estimate bone strength, because it integrates multiple factors.

  12. [The design of a cardiac monitoring and analysing system with low power consumption].

    PubMed

    Chen, Zhen-cheng; Ni, Li-li; Zhu, Yan-gao; Wang, Hong-yan; Ma, Yan

    2002-07-01

    The paper deals with a portable analyzing monitor system with a liquid crystal display (LCD), which has low power consumption and is suited to China's specific conditions. Apart from the development of the overall scheme of the system, the paper introduces the design of the hardware and the software. The 80196 single-chip microcomputer is used as the central microprocessor to process electrocardiac signal data in real time. The system has the following functions: five types of arrhythmia analysis, alarm, freeze, and recording with automatic paper feeding. The portable system can be operated on alternating current (AC) or direct current (DC). Its hardware circuit is simplified and its software structure is optimized. Multiple low-power-consumption components and an LCD unit are adopted in its modular design.

  13. Physical Layer Secret-Key Generation Scheme for Transportation Security Sensor Network

    PubMed Central

    Yang, Bin; Zhang, Jianfeng

    2017-01-01

    Wireless Sensor Networks (WSNs) are widely used in different disciplines, including transportation systems, agricultural field environment monitoring, healthcare systems, and industrial monitoring. The security challenge of the wireless communication link between sensor nodes is critical in WSNs. In this paper, we propose a new physical layer secret-key generation scheme for a transportation security sensor network. The scheme is based on the cooperation of all the sensor nodes, thus avoiding the key distribution process, which increases the security of the system. Different passive and active attack models are analyzed in this paper. We also prove that when the cooperative node number is large enough, even when the eavesdropper is equipped with multiple antennas, the secret-key is still secure. Numerical results demonstrate the efficiency of the proposed scheme. PMID:28657588

  14. Physical Layer Secret-Key Generation Scheme for Transportation Security Sensor Network.

    PubMed

    Yang, Bin; Zhang, Jianfeng

    2017-06-28

    Wireless Sensor Networks (WSNs) are widely used in different disciplines, including transportation systems, agricultural field environment monitoring, healthcare systems, and industrial monitoring. The security challenge of the wireless communication link between sensor nodes is critical in WSNs. In this paper, we propose a new physical layer secret-key generation scheme for a transportation security sensor network. The scheme is based on the cooperation of all the sensor nodes, thus avoiding the key distribution process, which increases the security of the system. Different passive and active attack models are analyzed in this paper. We also prove that when the cooperative node number is large enough, even when the eavesdropper is equipped with multiple antennas, the secret-key is still secure. Numerical results demonstrate the efficiency of the proposed scheme.

  15. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    2001-01-01

    A method and apparatus for monitoring a source of data for determining an operating state of a working system. The method includes determining a sensor (or source of data) arrangement associated with monitoring the source of data for a system; activating a first method for performing a sequential probability ratio test if the data source includes a single data (sensor) source; activating a second method for performing a regression sequential probability ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals which are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the sensor arrangement includes multiple sensors; and utilizing at least one of the first, second and third methods to accumulate sensor signals and determine the operating state of the system.
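
    For orientation, the following is a minimal Wald sequential probability ratio test for a shift in the mean of a Gaussian sensor signal; the hypothesized means, noise level, and error rates are assumptions, and this sketch is not the patented regression or bounded-angle variant.

        import math

        def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
            """Wald SPRT for H0: mean = mu0 vs H1: mean = mu1 on Gaussian samples.
            Returns 'H0', 'H1', or 'continue' after consuming the samples."""
            A = math.log((1 - beta) / alpha)     # accept-H1 threshold
            B = math.log(beta / (1 - alpha))     # accept-H0 threshold
            llr = 0.0
            for x in samples:
                # Log-likelihood ratio increment for a Gaussian mean shift.
                llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
                if llr >= A:
                    return "H1"   # degraded / faulted operating state
                if llr <= B:
                    return "H0"   # nominal operating state
            return "continue"     # not enough evidence yet; keep sampling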

  16. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    1999-01-01

    A method and apparatus for monitoring a source of data for determining an operating state of a working system. The method includes determining a sensor (or source of data) arrangement associated with monitoring the source of data for a system, activating a method for performing a sequential probability ratio test if the data source includes a single data (sensor) source, activating a second method for performing a regression sequential possibility ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals which are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the sensor arrangement includes multiple sensors and utilizing at least one of the first, second and third methods to accumulate sensor signals and determining the operating state of the system.

  17. Framework and implementation of a continuous network-wide health monitoring system for roadways

    NASA Astrophysics Data System (ADS)

    Wang, Ming; Birken, Ralf; Shahini Shamsabadi, Salar

    2014-03-01

    According to the 2013 ASCE report card, America's infrastructure scores only a D+. There are more than four million miles of roads (grade D) in the U.S. requiring a broad range of maintenance activities. The nation faces a monumental problem of infrastructure management in the scheduling and implementation of maintenance and repair operations, and in the prioritization of expenditures within budgetary constraints. The efficient and effective performance of these operations, however, is crucial to ensuring roadway safety, preventing catastrophic failures, and promoting economic growth. There is a critical need for technology that can cost-effectively monitor the condition of a network-wide road system and provide accurate, up-to-date information for maintenance activity prioritization. The Versatile Onboard Traffic Embedded Roaming Sensors (VOTERS) project provides a framework and the sensing capability to complement periodic localized inspections with continuous network-wide health monitoring. Research focused on the development of a cost-effective, lightweight package of multi-modal sensor systems compatible with this framework. An innovative software infrastructure is created that collects, processes, and evaluates these large time-lapse multi-modal data streams. A GIS-based control center manages multiple inspection vehicles and the data for further analysis, visualization, and decision making. VOTERS' technology can monitor road conditions at both the surface and sub-surface levels while the vehicle is navigating through daily traffic going about its normal business, thereby allowing for network-wide frequent assessment of roadways. This deterioration process monitoring at unprecedented time and spatial scales provides unique experimental data that can be used to improve life-cycle cost analysis models.

  18. Multiscale ecosystem monitoring: an application of scaling data to answer multiple ecological questions

    USDA-ARS?s Scientific Manuscript database

    Standardized monitoring data collection efforts using a probabilistic sample design, such as in the Bureau of Land Management’s (BLM) Assessment, Inventory, and Monitoring (AIM) Strategy, provide a core suite of ecological indicators, maximize data collection efficiency, and promote reuse of monitor...

  19. The topography of demyelination and neurodegeneration in the multiple sclerosis brain.

    PubMed

    Haider, Lukas; Zrzavy, Tobias; Hametner, Simon; Höftberger, Romana; Bagnato, Francesca; Grabner, Günther; Trattnig, Siegfried; Pfeifenbring, Sabine; Brück, Wolfgang; Lassmann, Hans

    2016-03-01

    Multiple sclerosis is a chronic inflammatory disease with primary demyelination and neurodegeneration in the central nervous system. In our study we analysed demyelination and neurodegeneration in a large series of multiple sclerosis brains and provide a map that displays how frequently different brain areas are affected by these processes. Demyelination in the cerebral cortex was related to inflammatory infiltrates in the meninges, which was pronounced in invaginations of the brain surface (sulci) and possibly promoted by low flow of the cerebrospinal fluid in these areas. Focal demyelinated lesions in the white matter occurred at sites with high venous density and additionally accumulated in watershed areas of low arterial blood supply. Two different patterns of neurodegeneration in the cortex were identified: oxidative injury of cortical neurons and retrograde neurodegeneration due to axonal injury in the white matter. While oxidative injury was related to the inflammatory process in the meninges and pronounced in actively demyelinating cortical lesions, retrograde degeneration was mainly related to demyelinated lesions and axonal loss in the white matter. Our data show that the accumulation of lesions and neurodegeneration in the multiple sclerosis brain does not affect all brain regions equally, and they provide the pathological basis for selecting brain areas for monitoring regional injury and atrophy development in future magnetic resonance imaging studies.

  20. Camera-Based Microswitch Technology to Monitor Mouth, Eyebrow, and Eyelid Responses of Children with Profound Multiple Disabilities

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N.; O'Reilly, Mark F.; Lang, Russell; Didden, Robert

    2011-01-01

    A camera-based microswitch technology was recently used to successfully monitor small eyelid and mouth responses of two adults with profound multiple disabilities (Lancioni et al., Res Dev Disab 31:1509-1514, 2010a). This technology, in contrast with the traditional optic microswitches used for those responses, did not require support frames on…

  1. Geologic, hydrologic, and water-quality data from multiple-well monitoring sites in the Central and West Coast basins, Los Angeles County, California, 1995-2000

    USGS Publications Warehouse

    Land, Michael; Everett, R.R.; Crawford, S.M.

    2002-01-01

    In 1995, the U.S. Geological Survey (USGS), in cooperation with the Water Replenishment District of Southern California (WRDSC; http://wrd.org), began a study to examine ground-water resources in the Central and West Coast Basins in Los Angeles County, California. The study characterizes the geohydrology and geochemistry of the regional ground-water flow system and provides extensive data for evaluating ground-water management issues. This report is a compilation of geologic, hydrologic, and water-quality data collected from 24 recently constructed multiple-well monitoring sites for the period 1995-2000. Descriptions of the collected drill cuttings were compiled into lithologic logs, which are summarized along with geophysical logs (including gamma-ray, spontaneous potential, resistivity, electromagnetic induction, and temperature tool logs) for each monitoring site. At selected sites, cores were analyzed for magnetic orientation, physical and thermal properties, and mineralogy. Field and laboratory estimates of hydraulic conductivity are presented for most multiple-well monitoring sites. Periodic water-level measurements are also reported. Water-quality information for major ions, nutrients, trace elements, deuterium and oxygen-18, and tritium is presented for the multiple-well monitoring locations and for selected existing production and observation wells. In addition, boron-11, carbon-13, carbon-14, sulfur-34, and strontium-87/86 data are presented for selected wells.

  2. Accelerated Aging Experiments for Capacitor Health Monitoring and Prognostics

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose Ramon; Biswas, Gautam; Goebel, Kai

    2012-01-01

    This paper discusses experimental setups for health monitoring and prognostics of electrolytic capacitors under nominal operation and accelerated aging conditions. Electrolytic capacitors have higher failure rates than other components in electronic systems such as power drives, power converters, etc. Our current work focuses on developing first-principles-based degradation models for electrolytic capacitors under varying electrical and thermal stress conditions. Prognostics and health management for electronic systems aims to predict the onset of faults, study causes for system degradation, and accurately compute remaining useful life. Accelerated life test methods are often used in prognostics research as a way to model multiple causes and assess the effects of the degradation process through time. They also allow for the identification and study of different failure mechanisms and their relationships under different operating conditions. Experiments are designed for aging of the capacitors such that the degradation pattern induced by the aging can be monitored and analyzed. Experimental setups and data collection methods are presented to demonstrate this approach.

  3. 4-D High-Resolution Seismic Reflection Monitoring of Miscible CO2 Injected into a Carbonate Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard D. Miller; Abdelmoneam E. Raef; Alan P. Byrnes

    2005-09-01

    The objective of this research project is to acquire, process, and interpret multiple high-resolution 3-D compressional wave and 2-D, 2-C shear wave seismic data to observe changes in fluid characteristics in an oil field before, during, and after the miscible carbon dioxide (CO2) flood that began around December 1, 2003, as part of the DOE-sponsored Class Revisit Project (DOE DE-AC26-00BC15124). Unique and key to this imaging activity are the high-resolution nature of the seismic data, the minimal deployment design, and the temporal sampling throughout the flood. The 900-m-deep test reservoir is located in central Kansas oomoldic limestones of the Lansing-Kansas City Group, deposited on a shallow marine shelf in Pennsylvanian time. After 18 months of seismic monitoring, one baseline and six monitor surveys clearly imaged changes that appear consistent with movement of CO2 as modeled with fluid simulators.

  4. A wirelessly programmable actuation and sensing system for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Long, James; Büyüköztürk, Oral

    2016-04-01

    Wireless sensor networks promise to deliver low cost, low power and massively distributed systems for structural health monitoring. A key component of these systems, particularly when sampling rates are high, is the capability to process data within the network. Although progress has been made towards this vision, it remains a difficult task to develop and program 'smart' wireless sensing applications. In this paper we present a system which allows data acquisition and computational tasks to be specified in Python, a high level programming language, and executed within the sensor network. Key features of this system include the ability to execute custom application code without firmware updates, to run multiple users' requests concurrently and to conserve power through adjustable sleep settings. Specific examples of sensor node tasks are given to demonstrate the features of this system in the context of structural health monitoring. The system comprises individual firmware for nodes in the wireless sensor network, and a gateway server and web application through which users can remotely submit their requests.
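
    A hypothetical on-node task of the kind described above might acquire a short burst, reduce it locally, and return only summary statistics; the acquisition routine below is a stand-in, since the node's actual Python API is not given in the abstract.

        import math
        import random

        def sample_burst(n=1000):
            # Stand-in for the node's acquisition call (the real driver interface is
            # not described in the abstract); returns synthetic acceleration samples.
            return [random.gauss(0.0, 0.05) for _ in range(n)]

        def burst_summary(n=1000):
            # Reduce the burst on the node so only a few numbers need to be transmitted.
            x = sample_burst(n)
            mean = sum(x) / len(x)
            rms = math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))
            peak = max(abs(v - mean) for v in x)
            return {"n": len(x), "rms": rms, "peak": peak}

        print(burst_summary())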

  5. Coordinated Monitoring Systems for Early Care and Education. OPRE Research Brief 2016-19

    ERIC Educational Resources Information Center

    Maxwell, Kelly; Sosinsky, Laura; Tout, Kathryn; Hegseth, Danielle

    2016-01-01

    Early care and education providers are subject to monitoring by multiple agencies and organizations. This brief provides an overview of monitoring and the major early care and education monitoring systems. It then offers possible goals for a coordinated monitoring system and describes some approaches to addressing those goals. Eleven dimensions…

  6. The Joint Experiment for Crop Assessment and Monitoring (JECAM) Initiative: Developing methods and best practices for global agricultural monitoring

    NASA Astrophysics Data System (ADS)

    Champagne, C.; Jarvis, I.; Defourny, P.; Davidson, A.

    2014-12-01

    Agricultural systems differ significantly throughout the world, making a 'one size fits all' approach to remote sensing and monitoring of agricultural landscapes problematic. The Joint Experiment for Crop Assessment and Monitoring (JECAM) was established in 2009 to bring together the global scientific community to work towards a set of best practices and recommendations for using earth observation data to map, monitor and report on agricultural productivity globally across an array of diverse agricultural systems. These methods form the research and development component of the Group on Earth Observation Global Agricultural Monitoring (GEOGLAM) initiative to harmonize global monitoring efforts and increase market transparency. The JECAM initiative brings together researchers from a large number of globally distributed, well monitored agricultural test sites that cover a range of crop types, cropping systems and climate regimes. Each test site works independently as well as together across multiple sites to test methods, sensors and field data collection techniques to derive key agricultural parameters, including crop type, crop condition, crop yield and soil moisture. The outcome of this project will be a set of best practices that cover the range of remote sensing monitoring and reporting needs, including satellite data acquisition, pre-processing techniques, information retrieval and ground data validation. These outcomes provide the research and development foundation for GEOGLAM and will help to inform the development of the GEOGLAM "system of systems" for global agricultural monitoring. The outcomes of the 2014 JECAM science meeting will be discussed as well as examples of methods being developed by JECAM scientists.

  7. Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa

    EPA Science Inventory

    Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...

  8. Photogrammetric discharge monitoring of small tropical mountain rivers - A case study at Rivière des Pluies, Réunion island

    NASA Astrophysics Data System (ADS)

    Stumpf, André; Augereau, Emmanuel; Delacourt, Christophe; Bonnier, Julien

    2016-04-01

    Reliable discharge measurements are indispensable for an effective management of natural water resources and floods. Limitations of classical current meter profiling and stage-discharge ratings have stimulated the development of more accurate and efficient gauging techniques. While new discharge measurement technologies such as acoustic Doppler current profilers and large-scale particle image velocimetry (LSPIV) have been developed and tested in numerous studies, the continuous monitoring of small mountain rivers and discharge dynamics during strong meteorological events remains challenging. More specifically, LSPIV studies are often focused on short-term measurements during flood events, and there are still very few studies that address its use for long-term monitoring of small mountain rivers. To fill this gap, this study targets the development and testing of a largely autonomous photogrammetric discharge measurement system, with a special focus on its application to a small tropical mountain river with high discharge variability and a mobile riverbed. It proposes several enhancements over previous LSPIV methods regarding camera calibration, more efficient processing in image geometry, automatic detection of the water level, and statistical calibration and estimation of the discharge from multiple profiles. To account for changes in the bed topography, the riverbed is surveyed repeatedly during the dry seasons using multi-view photogrammetry or terrestrial laser scanners. The presented case study comprises the analysis of several thousand videos spanning two and a half years (2013-2015) to test the robustness and accuracy of the different processing steps. An analysis of the obtained results suggests that the camera calibration reaches sub-pixel accuracy. The median accuracy of the water-mask detections is F1 = 0.82, whereas the precision is systematically higher than the recall. The resulting underestimation of the water surface area and level leads to a systematic underestimation of the discharge, with error rates of up to 25%. However, the bias can be effectively removed using a least-squares cross-calibration, which reduces the error to an MAE of 6.39% and a maximum error of 16.18%. Those error rates are significantly lower than the uncertainties among multiple profiles (30%) and illustrate the importance of spatial averaging over multiple measurements. The study suggests that LSPIV can already be considered a valuable tool for the monitoring of torrential flows, whereas further research is still needed to fully integrate night-time observation and stereo-photogrammetric capabilities.
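
    The least-squares cross-calibration mentioned above can be sketched as a single scale factor fitted between LSPIV discharges and reference gaugings; this is an assumed minimal formulation for illustration, not the authors' exact procedure.

        import numpy as np

        def calibrate_discharge(q_lspiv, q_ref):
            """Fit a scale factor k that removes the systematic underestimation of
            LSPIV discharge relative to reference gaugings, then report the
            remaining mean absolute error in percent."""
            q_lspiv = np.asarray(q_lspiv, dtype=float)
            q_ref = np.asarray(q_ref, dtype=float)
            # k minimises || k * q_lspiv - q_ref ||^2 (ordinary least squares).
            k = np.sum(q_lspiv * q_ref) / np.sum(q_lspiv ** 2)
            corrected = k * q_lspiv
            mae_pct = 100.0 * np.mean(np.abs(corrected - q_ref) / q_ref)
            return k, mae_pct

        # Example with a synthetic 20% underestimation plus noise.
        q_ref = np.array([2.0, 3.5, 5.0, 8.0, 12.0])
        q_lspiv = 0.8 * q_ref + np.array([0.05, -0.1, 0.2, -0.15, 0.1])
        print(calibrate_discharge(q_lspiv, q_ref))   # k should come out near 1.25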

  9. Unattended Multiplicity Shift Register

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newell, Matt; Jones, David C.

    2017-01-16

    The Unattended Multiplicity Shift Register (UMSR) is a specialized pulse counter used primarily to count neutron events originating in neutron detection instruments. While the counter can be used to count any TTL input pulses, its unique ability to record time-correlated events and the multiplicity distributions of these events makes it an ideal instrument for counting neutron events in the nuclear fields of material safeguards, waste assay and process monitoring and control. The UMSR combines the simple and robust Los Alamos National Laboratory (LANL) shift register design with a Commercial-Off-The-Shelf (COTS) processor and Ethernet communications. The UMSR is fully compatible with existing International Atomic Energy Agency (IAEA) neutron data acquisition instruments such as the Advanced Multiplicity Shift Register (AMSR) and JSR-15. The UMSR has three input channels: a multiplicity shift register input and two auxiliary inputs. The UMSR provides 0V to 2kV of programmable High Voltage (HV) bias and both a 12V and a 5V detector power supply output. A serial over USB communication line to the UMSR allows the use of existing versions of INCC or MIC software while the Ethernet port is compatible with the new IAEA RAINSTORM communication protocol.

  10. Designing monitoring programs in an adaptive management context for regional multiple species conservation plans

    USGS Publications Warehouse

    Atkinson, A.J.; Trenham, P.C.; Fisher, R.N.; Hathaway, S.A.; Johnson, B.S.; Torres, S.G.; Moore, Y.C.

    2004-01-01

    critical management uncertainties; and 3) implementing long-term monitoring and adaptive management. Ultimately, the success of regional conservation planning depends on the ability of monitoring programs to confront the challenges of adaptively managing and monitoring complex ecosystems and diverse arrays of sensitive species.

  11. A western state perspective on monitoring and managing neotropical migratory birds

    Treesearch

    Frank Howe

    1993-01-01

    Neotropical migratory bird monitoring programs can contribute greatly to a more holistic and proactive management approach for state agencies. It is, however, imperative that these monitoring programs be scientifically designed and clearly communicated to managers. Information from monitoring programs can be used to develop multiple-species habitat management...

  12. Multi-scale functional mapping of tidal marsh vegetation for restoration monitoring

    NASA Astrophysics Data System (ADS)

    Tuxen Bettman, Karin

    2007-12-01

    Nearly half of the world's natural wetlands have been destroyed or degraded, and in recent years, there have been significant endeavors to restore wetland habitat throughout the world. Detailed mapping of restoring wetlands can offer valuable information about changes in vegetation and geomorphology, which can inform the restoration process and ultimately help to improve chances of restoration success. I studied six tidal marshes in the San Francisco Estuary, CA, US, between 2003 and 2004 in order to develop techniques for mapping tidal marshes at multiple scales by incorporating specific restoration objectives for improved longer term monitoring. I explored a "pixel-based" remote sensing image analysis method for mapping vegetation in restored and natural tidal marshes, describing the benefits and limitations of this type of approach (Chapter 2). I also performed a multi-scale analysis of vegetation pattern metrics for a recently restored tidal marsh in order to target the metrics that are consistent across scales and will be robust measures of marsh vegetation change (Chapter 3). Finally, I performed an "object-based" image analysis using the same remotely sensed imagery, which maps vegetation type and specific wetland functions at multiple scales (Chapter 4). The combined results of my work highlight important trends and management implications for monitoring wetland restoration using remote sensing, and will better enable restoration ecologists to use remote sensing for tidal marsh monitoring. Several findings important for tidal marsh restoration monitoring were made. Overall results showed that pixel-based methods are effective at quantifying landscape changes in composition and diversity in recently restored marshes, but are limited in their use for quantifying smaller, more fine-scale changes. While pattern metrics can highlight small but important changes in vegetation composition and configuration across years, scientists should exercise caution when using metrics in their studies or to validate restoration management decisions, and multi-scale analyses should be performed before metrics are used in restoration science for important management decisions. Lastly, restoration objectives, ecosystem function, and scale can each be integrated into monitoring techniques using remote sensing for improved restoration monitoring.

  13. Open Source Platform Application to Groundwater Characterization and Monitoring

    NASA Astrophysics Data System (ADS)

    Ntarlagiannis, D.; Day-Lewis, F. D.; Falzone, S.; Lane, J. W., Jr.; Slater, L. D.; Robinson, J.; Hammett, S.

    2017-12-01

    Groundwater characterization and monitoring commonly rely on the use of multiple point sensors and human labor. Due to the number of sensors, labor, and other resources needed, establishing and maintaining an adequate groundwater monitoring network can be both labor intensive and expensive. To improve and optimize the monitoring network design, open source software and hardware components could potentially provide the platform to control robust and efficient sensors, thereby reducing costs and labor. This work presents early attempts to create a groundwater monitoring system incorporating open-source software and hardware that will control the remote operation of multiple sensors along with data management and file transfer functions. The system is built around a Raspberry Pi 3, which controls multiple sensors in order to perform on-demand, continuous, or 'smart decision' measurements while providing flexibility to incorporate additional sensors to meet the demands of different projects. The current objective of our technology is to monitor exchange of ionic tracers between mobile and immobile porosity using a combination of fluid and bulk electrical-conductivity measurements. To meet this objective, our configuration uses four sensors (pH, specific conductance, pressure, temperature) that can monitor the fluid electrical properties of interest and guide the bulk electrical measurement. This system highlights the potential of using open source software and hardware components for earth sciences applications. The versatility of the system makes it ideal for use in a large number of applications, and the low cost allows for high resolution (spatially and temporally) monitoring.
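
    A skeleton of the kind of polling loop such a Raspberry Pi node might run is sketched below; the sensor read functions, trigger threshold, and logging format are all assumptions, with the 'smart decision' reduced to a simple drift check on the fluid conductivity.

        import csv
        import random
        import time

        def read_fluid_ec():
            # Placeholder for the specific-conductance probe driver (assumed), in microS/cm.
            return 500.0 + random.gauss(0.0, 5.0)

        def read_aux():
            # Placeholders for the pH, pressure and temperature probes (assumed).
            return {"pH": 7.0, "pressure_kPa": 101.3, "temp_C": 12.5}

        def monitor(step_s=60, trigger_pct=10.0, logfile="groundwater_log.csv"):
            baseline = read_fluid_ec()
            with open(logfile, "a", newline="") as f:
                writer = csv.writer(f)
                while True:
                    ec = read_fluid_ec()
                    row = [time.time(), ec] + list(read_aux().values())
                    # 'Smart decision': flag a bulk-EC measurement when the fluid EC
                    # drifts more than trigger_pct from the running baseline.
                    if abs(ec - baseline) / baseline * 100.0 > trigger_pct:
                        row.append("TRIGGER_BULK_EC")
                        baseline = ec
                    writer.writerow(row)
                    f.flush()
                    time.sleep(step_s)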

  14. A Space Weather Forecasting System with Multiple Satellites Based on a Self-Recognizing Network

    PubMed Central

    Tokumitsu, Masahiro; Ishida, Yoshiteru

    2014-01-01

    This paper proposes a space weather forecasting system at geostationary orbit for high-energy electron flux (>2 MeV). The forecasting model involves multiple sensors on multiple satellites. The sensors interconnect and evaluate each other to predict future conditions at geostationary orbit. The proposed forecasting model is constructed using a dynamic relational network for sensor diagnosis and event monitoring. The sensors of the proposed model are located at different positions in space. The satellites for solar monitoring are equipped with monitoring devices for the interplanetary magnetic field and solar wind speed. The satellites orbiting near the Earth monitor high-energy electron flux. We investigate forecasting for two typical examples by comparing the performance of two models with different numbers of sensors. We demonstrate the prediction by the proposed model against coronal mass ejections and a coronal hole. This paper aims to investigate the possibility of space weather forecasting based on a satellite network with in-situ sensing. PMID:24803190

  15. A space weather forecasting system with multiple satellites based on a self-recognizing network.

    PubMed

    Tokumitsu, Masahiro; Ishida, Yoshiteru

    2014-05-05

    This paper proposes a space weather forecasting system at geostationary orbit for high-energy electron flux (>2 MeV). The forecasting model involves multiple sensors on multiple satellites. The sensors interconnect and evaluate each other to predict future conditions at geostationary orbit. The proposed forecasting model is constructed using a dynamic relational network for sensor diagnosis and event monitoring. The sensors of the proposed model are located at different positions in space. The satellites for solar monitoring are equipped with monitoring devices for the interplanetary magnetic field and solar wind speed. The satellites orbiting near the Earth monitor high-energy electron flux. We investigate forecasting for two typical examples by comparing the performance of two models with different numbers of sensors. We demonstrate the prediction by the proposed model against coronal mass ejections and a coronal hole. This paper aims to investigate the possibility of space weather forecasting based on a satellite network with in-situ sensing.

  16. Miniaturized Planar Room Temperature Ionic Liquid Electrochemical Gas Sensor for Rapid Multiple Gas Pollutants Monitoring.

    PubMed

    Wan, Hao; Yin, Heyu; Lin, Lu; Zeng, Xiangqun; Mason, Andrew J

    2018-02-01

    The growing impact of airborne pollutants and explosive gases on human health and occupational safety has escalated the demand for sensors to monitor hazardous gases. This paper presents a new miniaturized planar electrochemical gas sensor for rapid measurement of multiple gaseous hazards. The gas sensor features a porous polytetrafluoroethylene substrate that enables fast gas diffusion and a room temperature ionic liquid as the electrolyte. Metal sputtering was utilized for platinum electrode fabrication to enhance adhesion between the electrodes and the substrate. Together with carefully selected electrochemical methods, the miniaturized gas sensor is capable of measuring multiple gases, including oxygen, methane, ozone and sulfur dioxide, that are important to human health and safety. Compared to its manually-assembled Clark-cell predecessor, this sensor provides better sensitivity, linearity and repeatability, as validated for oxygen monitoring. With solid performance, fast response and miniaturized size, this sensor is promising for deployment in wearable devices for real-time point-of-exposure gas pollutant monitoring.

  17. Multiplexed 3D FRET imaging in deep tissue of live embryos

    PubMed Central

    Zhao, Ming; Wan, Xiaoyang; Li, Yu; Zhou, Weibin; Peng, Leilei

    2015-01-01

    Current deep tissue microscopy techniques are mostly restricted to intensity mapping of fluorophores, which significantly limit their applications in investigating biochemical processes in vivo. We present a deep tissue multiplexed functional imaging method that probes multiple Förster resonant energy transfer (FRET) sensors in live embryos with high spatial resolution. The method simultaneously images fluorescence lifetimes in 3D with multiple excitation lasers. Through quantitative analysis of triple-channel intensity and lifetime images, we demonstrated that Ca2+ and cAMP levels of live embryos expressing dual FRET sensors can be monitored simultaneously at microscopic resolution. The method is compatible with a broad range of FRET sensors currently available for probing various cellular biochemical functions. It opens the door to imaging complex cellular circuitries in whole live organisms. PMID:26387920

  18. Method for determining and displaying the spacial distribution of a spectral pattern of received light

    DOEpatents

    Bennett, C.L.

    1996-07-23

    An imaging Fourier transform spectrometer is described having a Fourier transform infrared spectrometer providing a series of images to a focal plane array camera. The focal plane array camera is clocked to a multiple of zero crossing occurrences as caused by a moving mirror of the Fourier transform infrared spectrometer and as detected by a laser detector such that the frame capture rate of the focal plane array camera corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer. The images are transmitted to a computer for processing such that representations of the images as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor or otherwise stored and manipulated by the computer. 2 figs.
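
    The per-pixel processing implied by this arrangement can be sketched as a Fourier transform along the frame axis followed by projection onto a spectral "fingerprint"; the array shapes and the magnitude-spectrum simplification are assumptions for illustration, not the patent's processing chain.

        import numpy as np

        def interferogram_to_spectra(cube):
            """cube : (n_frames, ny, nx) stack of focal-plane frames sampled at the
            interferometer's zero crossings. Returns a magnitude spectrum per pixel."""
            cube = cube - cube.mean(axis=0, keepdims=True)   # remove the DC offset
            return np.abs(np.fft.rfft(cube, axis=0))         # FFT along the scan axis -> (n_freq, ny, nx)

        def fingerprint_image(spectra, pattern):
            """Project each pixel's spectrum onto a target spectral 'fingerprint' vector
            of length n_freq, producing a single (ny, nx) display image."""
            pattern = np.asarray(pattern, dtype=float)
            pattern = pattern / np.linalg.norm(pattern)
            return np.tensordot(pattern, spectra, axes=([0], [0]))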

  19. Wireless structural monitoring for homeland security applications

    NASA Astrophysics Data System (ADS)

    Kiremidjian, Garo K.; Kiremidjian, Anne S.; Lynch, Jerome P.

    2004-07-01

    This paper addresses the development of a robust, low-cost, low power, and high performance autonomous wireless monitoring system for civil assets such as large facilities, new construction, bridges, dams, commercial buildings, etc. The role of the system is to identify the onset, development, location and severity of structural vulnerability and damage. The proposed system represents an enabling infrastructure for addressing structural vulnerabilities specifically associated with homeland security. The system concept is based on dense networks of "intelligent" wireless sensing units. The fundamental properties of a wireless sensing unit include: (a) interfaces to multiple sensors for measuring structural and environmental data (such as acceleration, displacements, pressure, strain, material degradation, temperature, gas agents, biological agents, humidity, corrosion, etc.); (b) processing of sensor data with embedded algorithms for assessing damage and environmental conditions; (c) peer-to-peer wireless communications for information exchange among units (thus enabling joint "intelligent" processing coordination) and storage of data and processed information in servers for information fusion; (d) ultra low power operation; (e) cost-effectiveness and compact size through the use of low-cost small-size off-the-shelf components. An integral component of the overall system concept is a decision support environment for interpretation and dissemination of information to various decision makers.

  20. Mass spectrometric directed system for the continuous-flow synthesis and purification of diphenhydramine.

    PubMed

    Loren, Bradley P; Wleklinski, Michael; Koswara, Andy; Yammine, Kathryn; Hu, Yanyang; Nagy, Zoltan K; Thompson, David H; Cooks, R Graham

    2017-06-01

    A highly integrated approach to the development of a process for the continuous synthesis and purification of diphenhydramine is reported. Mass spectrometry (MS) is utilized throughout the system for on-line reaction monitoring, off-line yield quantitation, and as a reaction screening module that exploits reaction acceleration in charged microdroplets for high throughput route screening. This effort has enabled the discovery and optimization of multiple routes to diphenhydramine in glass microreactors using MS as a process analytical tool (PAT). The ability to rapidly screen conditions in charged microdroplets was used to guide optimization of the process in a microfluidic reactor. A quantitative MS method was developed and used to measure the reaction kinetics. Integration of the continuous-flow reactor/on-line MS methodology with a miniaturized crystallization platform for continuous reaction monitoring and controlled crystallization of diphenhydramine was also achieved. Our findings suggest a robust approach for the continuous manufacture of pharmaceutical drug products, exemplified here by diphenhydramine, optimized for efficiency and crystal size, and guided by real-time analytics to produce the agent in a form that is readily adapted to continuous synthesis.

  1. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high value and high rigor product. The data has been used historically to support root cause analysis when anomalies are detected in down-stream processes. The opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole is challenged at utilizing collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously so that our personnel resources can more efficiently and accurately direct their attention to suspect processes or features. TDM's capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives including expectations of the NNSA and broader corporate goals that center around data-based quality controls on production.
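
    As one illustrative example of the kind of SPC check such a monitor could run continuously (the abstract does not disclose TDM's actual algorithms), the sketch below flags 3-sigma excursions and long one-sided runs in a streaming measurement; the centre line, sigma, and run length are assumptions.

        from collections import deque

        class SPCMonitor:
            """Streaming SPC check: flags a point beyond 3 sigma, or run_len
            consecutive points on one side of the centre line (illustrative rules)."""

            def __init__(self, mean, sigma, run_len=8):
                self.mean, self.sigma, self.run_len = mean, sigma, run_len
                self.recent = deque(maxlen=run_len)

            def update(self, x):
                alarms = []
                if abs(x - self.mean) > 3 * self.sigma:
                    alarms.append("beyond_3_sigma")
                self.recent.append(x > self.mean)
                if len(self.recent) == self.run_len and len(set(self.recent)) == 1:
                    alarms.append("run_of_%d_one_side" % self.run_len)
                return alarms

        # Example: the third reading trips the 3-sigma rule.
        mon = SPCMonitor(mean=10.0, sigma=0.2)
        for reading in [10.1, 10.05, 10.9, 10.02]:
            print(reading, mon.update(reading))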

  2. Radio monitoring of protoplanetary discs

    NASA Astrophysics Data System (ADS)

    Ubach, C.; Maddison, S. T.; Wright, C. M.; Wilner, D. J.; Lommen, D. J. P.; Koribalski, B.

    2017-04-01

    Protoplanetary disc systems observed at radio wavelengths often show excess emission above that expected from a simple extrapolation of thermal dust emission observed at short millimetre wavelengths. Monitoring the emission at radio wavelengths can be used to help disentangle the physical mechanisms responsible for this excess, including free-free emission from a wind or jet, and chromospheric emission associated with stellar activity. We present new results from a radio monitoring survey conducted with Australia Telescope Compact Array over the course of several years with observation intervals spanning days, months and years, where the flux variability of 11 T Tauri stars in the Chamaeleon and Lupus star-forming regions was measured at 7 and 15 mm, and 3 and 6 cm. Results show that most sources are variable to some degree at 7 mm, indicating the presence of emission mechanisms other than thermal dust in some sources. Additionally, evidence of grain growth to centimetre-sized pebbles was found for some sources that also have signs of variable flux at 7 mm. We conclude that multiple processes contributing to the emission are common in T Tauri stars at 7 mm and beyond, and that a detection at a single epoch at radio wavelengths should not be used to determine all processes contributing to the emission.

  3. Feasibility study design and methods for a home-based, square-stepping exercise program among older adults with multiple sclerosis: The SSE-MS project.

    PubMed

    Sebastião, Emerson; McAuley, Edward; Shigematsu, Ryosuke; Motl, Robert W

    2017-09-01

    We propose a randomized controlled trial (RCT) examining the feasibility of square-stepping exercise (SSE) delivered as a home-based program for older adults with multiple sclerosis (MS). We will assess feasibility in the four domains of process, resources, management and scientific outcomes. The trial will recruit older adults (aged 60 years and older) with mild-to-moderate MS-related disability who will be randomized into intervention or attention control conditions. Participants will complete assessments before and after completion of the conditions delivered over a 12-week period. Participants in the intervention group will have biweekly meetings with an exercise trainer in the Exercise Neuroscience Research Laboratory and receive verbal and visual instruction on step patterns for the SSE program. Participants will receive a mat for home-based practice of the step patterns, an instruction manual, and a logbook and pedometer for monitoring compliance. Compliance will be further monitored through weekly scheduled Skype calls. This feasibility study will inform future phase II and III RCTs that determine the actual efficacy and effectiveness of a home-based exercise program for older adults with MS.

  4. A Hierarchical and Dynamic Seascape Framework for Scaling and Comparing Ocean Biodiversity Observations

    NASA Astrophysics Data System (ADS)

    Kavanaugh, M.; Muller-Karger, F. E.; Montes, E.; Santora, J. A.; Chavez, F.; Messié, M.; Doney, S. C.

    2016-02-01

    The pelagic ocean is a complex system in which physical, chemical and biological processes interact to shape patterns on multiple spatial and temporal scales and levels of ecological organization. Monitoring and management of marine seascapes must consider a hierarchical and dynamic mosaic, where the boundaries, extent, and location of features change with time. As part of a Marine Biodiversity Observing Network demonstration project, we conducted a multiscale classification of dynamic coastal seascapes in the northeastern Pacific and Gulf of Mexico using multivariate satellite and modeled data. Synoptic patterns were validated using mooring and ship-based observations that spanned multiple trophic levels and were collected as part of several long-term monitoring programs, including the Monterey Bay and Florida Keys National Marine Sanctuaries. Seascape extent and habitat diversity varied as a function of both seasonal and interannual forcing. We discuss the patterns of in situ observations in the context of seascape dynamics and the effect on rarefaction, spatial patchiness, and tracking and comparing ecosystems through time. A seascape framework presents an effective means to translate local biodiversity measurements to broader spatiotemporal scales, scales relevant for modeling the effects of global change and enabling whole-ecosystem management in the dynamic ocean.

  5. Comparison of spectroscopy technologies for improved monitoring of cell culture processes in miniature bioreactors

    PubMed Central

    van den Berg, Frans; Racher, Andrew J.; Martin, Elaine B.; Jaques, Colin

    2017-01-01

    Cell culture process development requires the screening of large numbers of cell lines and process conditions. The development of miniature bioreactor systems has increased the throughput of such studies; however, there are limitations with their use. One important constraint is the limited number of offline samples that can be taken compared to those taken for monitoring cultures in large‐scale bioreactors. The small volume of miniature bioreactor cultures (15 mL) is incompatible with the large sample volume (600 µL) required for bioanalysers routinely used. Spectroscopy technologies may be used to resolve this limitation. The purpose of this study was to compare the use of NIR, Raman, and 2D‐fluorescence to measure multiple analytes simultaneously in volumes suitable for daily monitoring of a miniature bioreactor system. A novel design‐of‐experiment approach is described that utilizes previously analyzed cell culture supernatant to assess metabolite concentrations under various conditions while providing optimal coverage of the desired design space. Multivariate data analysis techniques were used to develop predictive models. Model performance was compared to determine which technology is more suitable for this application. 2D‐fluorescence could more accurately measure ammonium concentration (RMSECV 0.031 g L−1) than Raman and NIR. Raman spectroscopy, however, was more robust at measuring lactate and glucose concentrations (RMSECV 1.11 and 0.92 g L−1, respectively) than the other two techniques. The findings suggest that Raman spectroscopy is more suited for this application than NIR and 2D‐fluorescence. The implementation of Raman spectroscopy increases at‐line measuring capabilities, enabling daily monitoring of key cell culture components within miniature bioreactor cultures. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:337–346, 2017 PMID:28271638
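
    The abstract does not state which multivariate method produced the RMSECV figures; as a hedged illustration, partial least squares regression with cross-validation is a common choice for Raman and NIR calibrations and can be sketched as follows.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        def rmsecv(spectra, concentrations, n_components=5, cv=10):
            """Cross-validated RMSE (RMSECV) for a PLS calibration relating spectra
            (n_samples x n_wavelengths) to a measured concentration vector."""
            pls = PLSRegression(n_components=n_components)
            y_pred = cross_val_predict(pls, spectra, concentrations, cv=cv).ravel()
            return float(np.sqrt(np.mean((y_pred - np.asarray(concentrations)) ** 2)))

        # Synthetic check: 60 'spectra' with 200 wavelengths and a linear analyte signal.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 200))
        y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=60)
        print(rmsecv(X, y, n_components=5))

    Scanning n_components and keeping the value that minimizes the RMSECV is a common guard against overfitting the calibration model.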

  6. LC-MS Data Processing with MAVEN: A Metabolomic Analysis and Visualization Engine

    PubMed Central

    Clasquin, Michelle F.; Melamud, Eugene; Rabinowitz, Joshua D.

    2014-01-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis. PMID:22389014

  7. LC-MS data processing with MAVEN: a metabolomic analysis and visualization engine.

    PubMed

    Clasquin, Michelle F; Melamud, Eugene; Rabinowitz, Joshua D

    2012-03-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis.

  8. An electrophysiological investigation of memory encoding, depth of processing, and word frequency in humans.

    PubMed

    Guo, Chunyan; Zhu, Ying; Ding, Jinhong; Fan, Silu; Paller, Ken A

    2004-02-12

    Memory encoding can be studied by monitoring brain activity correlated with subsequent remembering. To understand brain potentials associated with encoding, we compared multiple factors known to affect encoding. Depth of processing was manipulated by requiring subjects to detect animal names (deep encoding) or boldface (shallow encoding) in a series of Chinese words. Recognition was more accurate with deep than shallow encoding, and for low- compared to high-frequency words. Potentials were generally more positive for subsequently recognized versus forgotten words; for deep compared to shallow processing; and, for remembered words only, for low- than for high-frequency words. Latency and topographic differences between these potentials suggested that several factors influence the effectiveness of encoding and can be distinguished using these methods, even with Chinese logographic symbols.

  9. Monitoring Geothermal Features in Yellowstone National Park with ATLAS Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Berglund, Judith

    2000-01-01

    The National Park Service (NPS) must produce an Environmental Impact Statement for each proposed development in the vicinity of known geothermal resource areas (KGRAs) in Yellowstone National Park. In addition, the NPS monitors indicator KGRAs for environmental quality and is still in the process of mapping many geothermal areas. The NPS currently maps geothermal features with field survey techniques. High resolution aerial multispectral remote sensing in the visible, NIR, SWIR, and thermal spectral regions could enable YNP geothermal features to be mapped more quickly and in greater detail. In response, Yellowstone Ecosystems Studies, in partnership with NASA's Commercial Remote Sensing Program, is conducting a study on the use of Airborne Terrestrial Applications Sensor (ATLAS) multispectral data for monitoring geothermal features in the Upper Geyser Basin. ATLAS data were acquired at 2.5 meter resolution on August 17, 2000. These data were processed into land cover classifications and relative temperature maps. For sufficiently large features, the ATLAS data can map geothermal areas in terms of geyser pools and hot springs, plus multiple categories of geothermal runoff that are apparently indicative of temperature gradients and microbial matting communities. In addition, the ATLAS maps clearly identify geyserite areas. The thermal bands contributed to classification success and to the computation of relative temperature. With masking techniques, one can assess the influence of geothermal features on the Firehole River. Preliminary results appear to confirm ATLAS data utility for mapping and monitoring geothermal features. Future work will include classification refinement and additional validation.

  10. Optical fiber network sensor system for monitoring methane concentration

    NASA Astrophysics Data System (ADS)

    Zhang, Zhi-wei; Zhang, Ji-long

    2011-08-01

    For a high-accuracy optical fiber sensor for monitoring methane concentration, the choice of light source depends on the methane absorption peaks. The mine environment must also be considered; that is, other gases such as water vapor, CO, and CO2 should have no absorption spectrum at the selected wavelength. It has been reported that water vapor, CO, and CO2 have no obvious absorption in the 0.85 μm, 1.3 μm, and 1.66 μm regions, while CH4 has no obvious absorption in the 0.85 μm region. A diode laser with a peak wavelength of 1.3 μm or 1.66 μm is therefore chosen as the light source of the optical fiber sensor for detecting methane concentration. Based on the principle that optical absorption varies with methane concentration at its characteristic absorption wavelength, the advantages of optical fiber sensor technology, and the environmental characteristics of coal mines, an optical fiber sensor system for monitoring methane concentration is presented. Space Division Multiple Access Technology (SDMAT) and long-optical-path absorption cell technology are combined in the study. Considering the conditions in coal mines, the optical fiber network sensors detect methane concentration in a gas mixture of water vapor, CO, CH4, and CO2. The paper introduces the principle of the optical fiber sensor system for monitoring methane concentration in coal mines and gives the block diagram of the monitoring system. The system mainly consists of a diode laser for monitoring methane concentration, a Y-shaped photo-coupler with a 50:50 coupling ratio, a 1×2 optical switch, a gas absorption cell, a computer data processing and control system, and a photoelectric transformer. To reduce the influence of photodiode dark current, light source intensity fluctuations, and temperature drift of the processing circuit on measurement accuracy, the light is split into two beams by the Y-shaped coupler: one serves as the reference optical path and the other as the sensing optical path. Experimental results show that, with a diode laser at 1654.141 nm taken as the optical source for detecting methane concentration, the detection limit of the sensor is below 4.274 mg/m3 when the optical path of the absorption cell is 20 cm, and the precision and stability can satisfy practical applications. The instrument can also perform on-line measurement at multiple points at different locations.
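
    The reference/sensing beam arrangement described above lends itself to a simple Beer-Lambert inversion: the ratio of the two detected intensities cancels source-intensity drift, and the logarithm of that ratio is proportional to the gas density. The sketch below illustrates the calculation; the cross-section value is an assumed, illustrative number, not the instrument's calibration constant.

```python
import math

# Illustrative constants (not the instrument's calibration values).
SIGMA = 1.6e-20   # assumed effective absorption cross-section of CH4 near 1654 nm, cm^2
PATH_CM = 20.0    # absorption-cell optical path length, cm (20 cm as in the abstract)

def methane_number_density(i_sensing, i_reference):
    """Beer-Lambert inversion using the reference beam to cancel source drift.

    i_sensing   -- detected intensity after the gas absorption cell
    i_reference -- detected intensity of the reference beam (no absorption)
    Returns an estimated number density in molecules per cm^3.
    """
    transmittance = i_sensing / i_reference
    return -math.log(transmittance) / (SIGMA * PATH_CM)

# Example: a 0.5% dip in the sensing beam relative to the reference beam.
n = methane_number_density(0.995, 1.000)
print(f"estimated CH4 number density: {n:.3e} molecules/cm^3")
```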

  11. A mission-oriented orbit design method of remote sensing satellite for region monitoring mission based on evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Zhang, Jing; Yao, Huang

    2015-12-01

    Remote sensing satellites play an increasingly prominent role in environmental monitoring and disaster rescue. Taking advantage of nearly identical illumination conditions over the same location and of global coverage, most of these satellites operate in sun-synchronous orbits. However, this inevitably brings some problems; the most significant is that the temporal resolution of a sun-synchronous satellite cannot satisfy the demands of specific region-monitoring missions. To overcome these disadvantages, two approaches are used: the first is to build a satellite constellation containing multiple sun-synchronous satellites, as the CHARTER mechanism has done; the second is to design a non-predetermined orbit based on the concrete mission demand. An effective method for remote sensing satellite orbit design based on a multi-objective evolutionary algorithm is presented in this paper. The orbit design problem is converted into a multi-objective optimization problem, and a fast and elitist multi-objective genetic algorithm is utilized to solve it. First, the mission demand is transformed into multiple objective functions, and the six orbital elements of the satellite are taken as genes in the design space; a simulated evolution process is then performed. An optimal solution can be obtained after a specified number of generations via the evolution operations (selection, crossover, and mutation). To examine the validity of the proposed method, a case study is introduced: the orbit design of an optical satellite for regional disaster monitoring, where the mission demand includes minimizing the average revisit time intervals of both objective regions. The simulation results show that the solution obtained by our method meets the users' demand. We conclude that the method presented in this paper is efficient for remote sensing orbit design.
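
    As a rough sketch of the evolutionary encoding described above (six orbital elements as genes, evolved by selection, crossover and mutation), the following toy example optimizes a single placeholder objective; the paper's actual method is multi-objective (Pareto-based) and evaluates revisit time by orbit propagation, which is omitted here.

```python
import random

# Gene bounds for the six classical orbital elements:
# semi-major axis (km), eccentricity, inclination, RAAN, arg. of perigee, mean anomaly (deg).
BOUNDS = [(6878.0, 7378.0), (0.0, 0.05), (0.0, 120.0),
          (0.0, 360.0), (0.0, 360.0), (0.0, 360.0)]

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def fitness(ind):
    # Placeholder cost standing in for "average revisit time over the target
    # regions", which in the real method comes from an orbit-propagation run.
    a, e, inc, raan, argp, ma = ind
    return abs(inc - 55.0) + 100.0 * e + abs(a - 7000.0) / 100.0

def crossover(p1, p2):
    # Uniform crossover: each gene taken from one of the two parents.
    return [random.choice(pair) for pair in zip(p1, p2)]

def mutate(ind, rate=0.1):
    out = []
    for value, (lo, hi) in zip(ind, BOUNDS):
        if random.random() < rate:
            value = random.uniform(lo, hi)   # re-draw the gene within its bounds
        out.append(value)
    return out

def evolve(pop_size=40, generations=100):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 4]         # elitism: keep the best quarter
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print("best orbit elements:", [round(x, 2) for x in best])
```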

  12. Monitoring forest cover loss using multiple data streams, a case study of a tropical dry forest in Bolivia

    NASA Astrophysics Data System (ADS)

    Dutrieux, Loïc Paul; Verbesselt, Jan; Kooistra, Lammert; Herold, Martin

    2015-09-01

    Automatically detecting forest disturbances as they occur can be extremely challenging for certain types of environments, particularly those presenting strong natural variations. Here, we use a generic structural break detection framework (BFAST) to improve the monitoring of forest cover loss by combining multiple data streams. Forest change monitoring is performed using Landsat data in combination with MODIS or rainfall data to further improve the modelling and monitoring. We tested the use of the Normalized Difference Vegetation Index (NDVI) from the Moderate Resolution Imaging Spectroradiometer (MODIS) with varying spatial aggregation window sizes, as well as a rainfall-derived index, as external regressors. The method was evaluated on a dry tropical forest area in lowland Bolivia where forest cover loss is known to occur, and we validated the results against a set of ground truth samples manually interpreted using the TimeSync environment. We found that the addition of an external regressor makes it possible to take advantage of the difference in spatial extent between human-induced and naturally induced variations and to detect only the processes of interest. Of all configurations, we found the 13 by 13 km MODIS NDVI window to be the most successful, with an overall accuracy of 87%. Compared with a single-pixel approach, the proposed method produced better time-series model fits, resulting in an increase in overall accuracy (from 82% to 87%) and decreases in omission and commission errors (from 33% to 24% and from 3% to 0%, respectively). The presented approach seems particularly relevant for areas with high inter-annual natural variability, such as forests regularly experiencing exceptional drought events.
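
    A highly simplified sketch of the idea of adding an external regressor to a season-trend model and flagging monitoring-period observations that depart from it is given below; it uses synthetic data and a crude residual threshold rather than BFAST's actual moving-sum test.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Landsat-like NDVI series: seasonal cycle plus noise, with an abrupt
# drop (simulated forest cover loss) two-thirds of the way in.
t = np.arange(200)
ndvi = 0.7 + 0.1 * np.sin(2 * np.pi * t / 23) + rng.normal(0, 0.02, t.size)
ndvi[140:] -= 0.25
# External regressor: e.g. a spatially aggregated MODIS NDVI capturing the
# natural (climate-driven) variability, here simulated as the seasonal part only.
regressor = 0.7 + 0.1 * np.sin(2 * np.pi * t / 23) + rng.normal(0, 0.01, t.size)

history = slice(0, 100)           # stable history period used to fit the model
X = np.column_stack([np.ones(t.size),
                     np.sin(2 * np.pi * t / 23),
                     np.cos(2 * np.pi * t / 23),
                     regressor])
coef, *_ = np.linalg.lstsq(X[history], ndvi[history], rcond=None)

residuals = ndvi - X @ coef
sigma = residuals[history].std()
# Flag monitoring-period observations far below the model as potential breaks
# (a crude stand-in for BFAST's moving-sum test).
breaks = np.where((t >= 100) & (residuals < -4 * sigma))[0]
print("first flagged observation:", breaks[0] if breaks.size else None)
```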

  13. Applications of open-path Fourier transform infrared for identification of volatile organic compound pollution sources and characterization of source emission behaviors.

    PubMed

    Lin, Chitsan; Liou, Naiwei; Sun, Endy

    2008-06-01

    An open-path Fourier transform infrared spectroscopy (OP-FTIR) system was set up for 3-day continuous line-averaged volatile organic compound (VOC) monitoring in a paint manufacturing plant. Seven VOCs (toluene, m-xylene, p-xylene, styrene, methanol, acetone, and 2-butanone) were identified in the ambient environment. The daytime-only batch operation mode was well explained by the time-series concentration plots. Major sources of methanol, m-xylene, acetone, and 2-butanone were identified in the southeast direction, where the paint solvent manufacturing processes are located. However, an attempt to uncover sources of styrene was not successful because the method detection limit (MDL) of the OP-FTIR system was not sensitive enough to produce conclusive data. In the second scenario, the OP-FTIR system was set up in an industrial complex to distinguish the origins of several VOCs. Eight major VOCs were identified in the ambient environment. The pollutant-detection wind-rose percentage plots clearly showed that ethylene, propylene, 2-butanone, and toluene mainly originated from the tank storage area, whereas n-butane came mainly from the butadiene manufacturing processes of the refinery plant, and ammonia was identified as an accompanying reduction product of the gasoline desulfurization process. Advantages of OP-FTIR include its ability to simultaneously and continuously analyze many compounds, and its long-path-length monitoring has also shown advantages in obtaining more comprehensive data than traditional multiple single-point monitoring methods.

  14. The design of a real-time formative evaluation of the implementation process of lifestyle interventions at two worksites using a 7-step strategy (BRAVO@Work).

    PubMed

    Wierenga, Debbie; Engbers, Luuk H; van Empelen, Pepijn; Hildebrandt, Vincent H; van Mechelen, Willem

    2012-08-07

    Worksite health promotion programs (WHPPs) offer an attractive opportunity to improve the lifestyle of employees. Nevertheless, broad-scale and successful implementation of WHPPs in daily practice often fails. In the present study, called BRAVO@Work, a 7-step implementation strategy was used to develop, implement and embed a WHPP in two different worksites with a focus on multiple lifestyle interventions. This article describes the design and framework for the formative evaluation of this 7-step strategy under real-time conditions by an embedded scientist, with the purpose of gaining insight into whether the 7-step strategy is a useful and effective implementation strategy. Furthermore, we aim to gain insight into factors that either facilitate or hamper the implementation process, the quality of the implemented lifestyle interventions and the degree of adoption, implementation and continuation of these interventions. This study is a formative evaluation within two different worksites with an embedded scientist on site to continuously monitor the implementation process. Each worksite (i.e. a University of Applied Sciences and an Academic Hospital) will assign a participating faculty or department to implement a WHPP focusing on lifestyle interventions using the 7-step strategy. The primary focus will be to describe the natural course of development, implementation and maintenance of a WHPP by studying [a] the use of and adherence to the 7-step strategy, [b] barriers and facilitators that influence the natural course of adoption, implementation and maintenance, and [c] the implementation process of the lifestyle interventions. All data will be collected using qualitative (i.e. real-time monitoring and semi-structured interviews) and quantitative methods (i.e. process evaluation questionnaires), applying data triangulation. Except for the real-time monitoring, data collection will take place at baseline and after 6, 12 and 18 months. This is one of the few studies to extensively and continuously monitor the natural course of the implementation process of a WHPP through a formative evaluation using a mix of quantitative and qualitative methods on different organizational levels (i.e. management, project group, employees) with an embedded scientist on site. Trial registration: NTR2861.

  15. Application of a mechanistic model as a tool for on-line monitoring of pilot scale filamentous fungal fermentation processes-The importance of evaporation effects.

    PubMed

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-03-01

    A mechanistic model-based soft sensor is developed and validated for 550 L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor comprises a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed, using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves for the current states of biomass, product, substrate, dissolved oxygen and mass, as well as other process parameters including kLa, viscosity and the partial pressure of CO2. State estimation at this scale requires a robust mass model including evaporation, a factor not often considered at smaller scales of operation. The model is developed using a historical data set of 11 batches from the fermentation pilot plant (550 L) at Novozymes A/S. The model is then implemented on-line in 550 L fermentation processes operated at Novozymes A/S in order to validate the state estimator on 14 new batches utilizing a new strain. The product concentration in the validation batches was predicted with an average root mean sum of squared error (RMSSE) of 16.6%. In addition, calculation of the Janus coefficient for the validation batches shows a suitably calibrated model. The robustness of the model prediction is assessed with respect to the accuracy of the input data, and a parameter estimation uncertainty analysis is also carried out. The application of this on-line state estimator allows for on-line monitoring of pilot-scale batches, including real-time estimates of multiple parameters that cannot otherwise be monitored on-line. Successful application of a soft sensor at this scale allows for improved process monitoring and opens up further possibilities for on-line control algorithms utilizing these on-line model outputs. Biotechnol. Bioeng. 2017;114: 589-599. © 2016 Wiley Periodicals, Inc.
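
    The structure described above, a stoichiometric rate estimator feeding a dynamic mass balance that includes evaporation, can be sketched in a few lines; all yields and rates below are illustrative placeholders, not the parameters of the published model.

```python
# Minimal soft-sensor sketch: estimate the biomass growth rate from the measured
# oxygen uptake rate via an assumed yield, then integrate a broth mass balance
# that includes feed addition and evaporative loss.

DT = 1.0 / 60.0          # integration step, h (1-minute updates)
Y_XO = 1.2               # assumed biomass-per-oxygen yield, g biomass / g O2
EVAP_RATE = 0.15         # assumed evaporation rate, kg/h at process conditions

def step(state, our_g_per_h, feed_kg_per_h):
    """Advance the state one step using on-line measurements.

    state          -- dict with broth 'mass' (kg) and 'biomass' (g)
    our_g_per_h    -- measured oxygen uptake rate, g O2/h (from off-gas analysis)
    feed_kg_per_h  -- measured feed rate, kg/h
    """
    growth = Y_XO * our_g_per_h                        # stoichiometric rate estimate
    state["biomass"] += growth * DT
    state["mass"] += (feed_kg_per_h - EVAP_RATE) * DT  # evaporation matters at 550 L
    return state

state = {"mass": 400.0, "biomass": 500.0}
for _ in range(24 * 60):                               # one day of 1-minute updates
    state = step(state, our_g_per_h=30.0, feed_kg_per_h=2.0)
print(f"broth mass: {state['mass']:.1f} kg, biomass: {state['biomass'] / 1000:.2f} kg")
```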

  16. Incidence of multiple sclerosis among European Economic Area populations, 1985-2009: the framework for monitoring

    PubMed Central

    2013-01-01

    Background A debate surrounding multiple sclerosis epidemiology has centred on time-related incidence increases and the need for monitoring. The purpose of this study is to reassess multiple sclerosis incidence in the European Economic Area. Methods We conducted a systematic review of literature from 1965 onwards and integrated elements of original research, including data requested from or completed by the survey authors, and specific analyses. Results The review of 5323 documents yielded ten studies for age- and sex-specific analyses, and 21 studies for time-trend analysis of single data sets. After 1985, the incidence of multiple sclerosis ranged from 1.12 to 6.96 per 100,000 population, was higher in females, tripled with latitude, and doubled with study midpoint year. The north registered increasing trends from the 1960s and 1970s, with a historic drop in the Faroe Islands, and fairly stable data in the period 1980-2000; incidence rose in Italian and French populations in the period 1970-2000, in Evros (Greece) in the 1980s, and in the French West Indies in around 2000. Conclusions We conclude that the increase in multiple sclerosis incidence is only apparent, and that it is not specific to women. Monitoring of multiple sclerosis incidence might be appropriate for the European Economic Area. PMID:23758972

  17. [A wireless mobile monitoring system based on bluetooth technology].

    PubMed

    Sun, Shou-jun; Wu, Kai; Wu, Xiao-Ming

    2006-09-01

    This paper presents a wireless mobile monitoring system based on Bluetooth technology. The system realizes remote mobile monitoring of multiple physiological parameters and offers ease of use, low cost, good reliability and strong anti-jamming capability.

  18. Efficient species-level monitoring at the landscape scale

    Treesearch

    Barry R. Noon; Larissa L. Bailey; Thomas D. Sisk; Kevin S. McKelvey

    2012-01-01

    Monitoring the population trends of multiple animal species at a landscape scale is prohibitively expensive. However, advances in survey design, statistical methods, and the ability to estimate species presence on the basis of detection-nondetection data have greatly increased the feasibility of species-level monitoring. For example, recent advances in monitoring make...

  19. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

    Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.
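
    The binomial N-mixture structure underlying this analysis (site abundance drawn from a Poisson distribution, repeated counts drawn as Binomial given that abundance and a detection probability) can be sketched as follows; the study used a Bayesian analysis, whereas this toy example marginalizes over the latent abundances and fits the two parameters by maximum likelihood on simulated counts.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom, poisson

rng = np.random.default_rng(1)

# Simulated repeated counts: 50 sites, 3 visits, true abundance ~ Poisson(4),
# individual detection probability 0.7 (values chosen only for illustration).
TRUE_LAMBDA, TRUE_P, N_SITES, N_VISITS = 4.0, 0.7, 50, 3
N_true = rng.poisson(TRUE_LAMBDA, N_SITES)
counts = rng.binomial(N_true[:, None], TRUE_P, (N_SITES, N_VISITS))

K = 50  # upper bound for the summation over latent abundance

def neg_log_likelihood(params):
    lam, p = np.exp(params[0]), 1 / (1 + np.exp(-params[1]))
    n = np.arange(K + 1)
    prior = poisson.pmf(n, lam)                          # P(N_i = n)
    ll = 0.0
    for y in counts:                                     # marginalize over N_i
        lik_given_n = np.prod(binom.pmf(y[:, None], n[None, :], p), axis=0)
        ll += np.log(np.sum(lik_given_n * prior))
    return -ll

fit = minimize(neg_log_likelihood, x0=[np.log(2.0), 0.0], method="Nelder-Mead")
lam_hat, p_hat = np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1]))
print(f"estimated lambda = {lam_hat:.2f}, detection probability = {p_hat:.2f}")
```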

  20. Multiple imputation for assessment of exposures to drinking water contaminants: evaluation with the Atrazine Monitoring Program.

    PubMed

    Jones, Rachael M; Stayner, Leslie T; Demirtas, Hakan

    2014-10-01

    Drinking water may contain pollutants that harm human health. Pollutant monitoring may occur quarterly, annually, or less frequently, depending upon the pollutant, the pollutant concentration, and the community water system. However, birth and other health outcomes are associated with narrow time-windows of exposure. Infrequent monitoring impedes linkage between water quality and health outcomes for epidemiological analyses. The objective of this study was to evaluate the performance of multiple imputation for filling in water quality values between measurements in community water systems (CWSs). The multiple imputation method was implemented in a simulated setting using data from the Atrazine Monitoring Program (AMP, 2006-2009 in five Midwestern states). Values were deleted from the AMP data to leave one measurement per month. Four patterns reflecting drinking water monitoring regulations were used to delete months of data in each CWS: three patterns were missing at random and one pattern was missing not at random. Synthetic health outcome data were created using a linear and a Poisson exposure-response relationship with five levels of hypothesized association, respectively. The multiple imputation method was evaluated by comparing the exposure-response relationships estimated from the multiply imputed data with the hypothesized association. The four patterns deleted 65-92% of the months of atrazine observations in the AMP data. Even with these high rates of missing information, our procedure was able to recover most of the missing information when the synthetic health outcome was included for missing at random patterns and for missing not at random patterns with low-to-moderate exposure-response relationships. Multiple imputation appears to be an effective method for filling in water quality values between measurements. Copyright © 2014 Elsevier Inc. All rights reserved.
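
    A minimal sketch of "proper" multiple imputation for a sparsely measured monthly concentration series is shown below: a seasonal regression is fitted to the observed months, missing months are drawn from the predictive distribution, and the analysis is repeated and pooled across imputations. The data, model and missingness pattern are synthetic stand-ins, not the AMP procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly atrazine-like concentrations with a seasonal pattern.
months = np.arange(48)
conc = 2.0 + 1.5 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.4, months.size)
observed = rng.random(months.size) > 0.7          # roughly 70% of months unmeasured

def impute_once(y, mask, t):
    """Fit a seasonal regression to observed months, then impute missing months
    by drawing from the predictive distribution (residual noise is added so that
    repeated calls give distinct, 'proper' imputations)."""
    X = np.column_stack([np.ones(t.size),
                         np.sin(2 * np.pi * t / 12),
                         np.cos(2 * np.pi * t / 12)])
    coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    resid_sd = (y[mask] - X[mask] @ coef).std()
    filled = y.copy()
    filled[~mask] = X[~mask] @ coef + rng.normal(0, resid_sd, (~mask).sum())
    return filled

# Multiple imputation: repeat, analyze each completed series, pool the results.
M = 20
means = [impute_once(conc, observed, months).mean() for _ in range(M)]
print(f"pooled mean concentration: {np.mean(means):.2f} "
      f"(between-imputation SD {np.std(means):.3f})")
```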

  1. Utilizing Time Domain Reflectometry on monitoring bedload in a mountain stream

    NASA Astrophysics Data System (ADS)

    Miyata, S.; Fujita, M.

    2015-12-01

    Understanding bedload transport processes in steep mountain streams is essential for disaster mitigation as well as for predicting reservoir capacity and restoring river ecosystems. Despite the various monitoring methods proposed previously, precise bedload monitoring in steep streams remains difficult. This study aimed to develop a bedload monitoring system based on continuous measurement of the thickness and porosity of sediment under water that can be applied to retention basins and pools in steep streams. When the probe of a TDR (Time Domain Reflectometry) measurement system is inserted so as to penetrate two adjacent layers with different dielectric constants, analysis of the TDR waveform enables us to determine the position of the layer boundary and the ratio of materials in the layer. A methodology for analyzing the observed TDR waveforms was established based on the results of a series of column experiments, in which a single TDR probe 40 cm in length was installed in a column filled with water and sand was then supplied gradually. A flume experiment was performed to apply the TDR system to monitoring sediment volume under flowing-water conditions. Eight probes 27 cm in length were distributed evenly in a model retention basin (i.e., a container), into which water and bedload flowed from a connected flume. The model retention basin was weighed by a load cell and the sediment volume was calculated. A semi-automatic waveform analysis was developed to continuously calculate the thickness and porosity of the sediment at the eight probes. Relative errors of sediment volume and bedload (the time differential of the volume) were 13% at maximum, suggesting that the TDR system proposed in this study, with multiple probes, is applicable to bedload monitoring in retention basins of steep streams. Combining this system with other indirect bedload monitoring methods (e.g., geophones) could potentially provide a breakthrough in understanding sediment transport processes in steep mountain streams.

  2. Fiber-Optic Based Compact Gas Leak Detection System

    NASA Technical Reports Server (NTRS)

    deGroot, Wim A.

    1995-01-01

    A propellant leak detection system based on Raman scattering principles is introduced. The proposed system is flexible and versatile as the result of the use of optical fibers. It is shown that multiple species can be monitored simultaneously. In this paper oxygen, nitrogen, carbon monoxide, and hydrogen are detected and monitored. The current detection sensitivity for both hydrogen and carbon monoxide is 1% partial pressure at ambient conditions. The sensitivity for oxygen and nitrogen is 0.5% partial pressure. The response time to changes in species concentration is three minutes. This system can be used to monitor multiple species at several locations.

  3. Design and research of built-in sample cell with multiple optical reflections

    NASA Astrophysics Data System (ADS)

    Liu, Jianhui; Wang, Shuyao; Lv, Jinwei; Liu, Shuyang; Zhou, Tao; Jia, Xiaodong

    2017-10-01

    In the field of trace gas measurement, tunable diode laser absorption spectroscopy (TDLAS) is widely used in industrial process and trace gas pollution monitoring because of its high sensitivity, high selectivity, and rapid detection. The Herriott cell is a common form of multiple-reflection sample cell; its structure is relatively simple, and it is widely applied in trace gas absorption spectroscopy. In practical situations the gas composition is complicated, and long-term continuous testing can lead to varying degrees of contamination and corrosion of the reflectors in the sample cell. If the mirrors are not cleaned in time, detection accuracy is strongly affected. To solve this problem in harsh detection environments, this paper presents a design of a built-in sample cell that avoids contact between the gas and the mirrors, thereby effectively reducing corrosion and contamination. If optical contamination does occur, the built-in optical sample cell can be directly replaced and is easy to disassemble and clean. The advantages of this design include a long optical path, high precision, and cost savings.

  4. A multiple process solution to the logical problem of language acquisition*

    PubMed Central

    MACWHINNEY, BRIAN

    2006-01-01

    Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750

  5. Monitoring Replication Protein A (RPA) dynamics in homologous recombination through site-specific incorporation of non-canonical amino acids.

    PubMed

    Pokhrel, Nilisha; Origanti, Sofia; Davenport, Eric Parker; Gandhi, Disha; Kaniecki, Kyle; Mehl, Ryan A; Greene, Eric C; Dockendorff, Chris; Antony, Edwin

    2017-09-19

    An essential coordinator of all DNA metabolic processes is Replication Protein A (RPA). RPA orchestrates these processes by binding to single-stranded DNA (ssDNA) and interacting with several other DNA binding proteins. Determining the real-time kinetics of single players such as RPA in the presence of multiple DNA processors to better understand the associated mechanistic events is technically challenging. To overcome this hurdle, we utilized non-canonical amino acids and bio-orthogonal chemistry to site-specifically incorporate a chemical fluorophore onto a single subunit of heterotrimeric RPA. Upon binding to ssDNA, this fluorescent RPA (RPAf) generates a quantifiable change in fluorescence, thus serving as a reporter of its dynamics on DNA in the presence of multiple other DNA binding proteins. Using RPAf, we describe the kinetics of facilitated self-exchange and exchange by Rad51 and mediator proteins during various stages in homologous recombination. RPAf is widely applicable for investigating the mechanism of action of RPA in processes such as DNA replication, repair and telomere maintenance. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Liquid Marble Coalescence and Triggered Microreaction Driven by Acoustic Levitation.

    PubMed

    Chen, Zhen; Zang, Duyang; Zhao, Liang; Qu, Mengfei; Li, Xu; Li, Xiaoguang; Li, Lixin; Geng, Xingguo

    2017-06-27

    Liquid marbles show promising potential for application in the microreactor field. Control of the coalescence between two or among multiple liquid marbles is critical; however, the successful merging of two isolated marbles is difficult because of their mechanically robust particle shells. In this work, the coalescence of multiple liquid marbles was achieved via acoustic levitation. The dynamic behaviors of the liquid marbles were monitored by a high-speed camera. Driven by the sound field, the liquid marbles moved toward each other, collided, and eventually coalesced into a larger single marble. The underlying mechanisms of this process were probed via sound field simulation and acoustic radiation pressure calculation. The results indicated that the pressure gradient on the liquid marble surface favors the formation of a liquid bridge between the liquid marbles, resulting in their coalescence. A preliminary indicator reaction was induced by the coalescence of dual liquid marbles, which suggests that expected chemical reactions can be successfully triggered with multiple reagents contained in isolated liquid marbles via acoustic levitation.

  7. Metabolite profiling of the fermentation process of "yamahai-ginjo-shikomi" Japanese sake.

    PubMed

    Tatsukami, Yohei; Morisaka, Hironobu; Aburaya, Shunsuke; Aoki, Wataru; Kohsaka, Chihiro; Tani, Masafumi; Hirooka, Kiyoo; Yamamoto, Yoshihiro; Kitaoka, Atsushi; Fujiwara, Hisashi; Wakai, Yoshinori; Ueda, Mitsuyoshi

    2018-01-01

    Sake is a traditional Japanese alcoholic beverage prepared by multiple parallel fermentation of rice. The fermentation process of "yamahai-ginjo-shikomi" sake is mainly performed by three microbes, Aspergillus oryzae, Saccharomyces cerevisiae, and Lactobacilli, and the levels of various metabolites fluctuate during fermentation. To evaluate the fermentation process, we monitored the concentrations of moderate-sized molecules (m/z 200-1000) that changed dynamically during the fermentation of "yamahai-ginjo-shikomi" Japanese sake. This analysis revealed that six compounds were the main factors showing characteristic differences during the fermentation process. Among the six compounds, four were leucine- or isoleucine-containing peptides and the remaining two were predicted to be small molecules. Quantification of these compounds revealed that their quantities changed over the month-long fermentation process. Our metabolomic approach revealed the dynamic changes in moderate-sized molecules during sake fermentation, and the factors found in this analysis are candidate molecules for indicating the progress of "yamahai-ginjo-shikomi" sake fermentation.

  8. Connecting World Heritage Nominations and Monitoring with the Support of the Silk Roads Cultural Heritage Resource Information System

    NASA Astrophysics Data System (ADS)

    Vileikis, O.; Dumont, B.; Serruys, E.; Van Balen, K.; Tigny, V.; De Maeyer, P.

    2013-07-01

    Serial transnational World Heritage nominations are challenging the way cultural heritage has been managed and evaluated in the past. They are unique in that they consist of multiple sites listed as one property, distributed across different countries and involving a large diversity of stakeholders in the process. As a result, there is a need for precise baseline information for monitoring, reporting and decision making. This type of nomination requires different methodologies and tools to improve the monitoring cycle from the beginning of the nomination through to periodic reporting. The case study of the Silk Roads Cultural Heritage Resource Information System (CHRIS) illustrates the use of a Geographical Content Management System (Geo-CMS) supporting the serial transnational World Heritage nomination and the monitoring of the Silk Roads in the five Central Asian countries. The Silk Roads CHRIS is an initiative supported by the UNESCO World Heritage Centre (WHC) and the Belgian Federal Science Policy Office (BELSPO), and developed by a consortium headed by the Raymond Lemaire International Centre for Conservation (RLICC) at the KULeuven. The Silk Roads CHRIS has been successfully assisting in the preparation of the nomination dossiers of the Republics of Kazakhstan, Tajikistan and Uzbekistan and will be used as a monitoring tool in the Central Asian countries.

  9. Intraoperative laser speckle contrast imaging for monitoring cerebral blood flow: results from a 10-patient pilot study

    NASA Astrophysics Data System (ADS)

    Richards, Lisa M.; Weber, Erica L.; Parthasarathy, Ashwin B.; Kappeler, Kaelyn L.; Fox, Douglas J.; Dunn, Andrew K.

    2012-02-01

    Monitoring cerebral blood flow (CBF) during neurosurgery can provide important physiological information for a variety of surgical procedures. Although multiple intraoperative vascular monitoring technologies are currently available, a quantitative method that allows for continuous monitoring is still needed. Laser speckle contrast imaging (LSCI) is an optical imaging method with high spatial and temporal resolution that has been widely used to image CBF in animal models in vivo. In this pilot clinical study, we adapted a Zeiss OPMI Pentero neurosurgical microscope to obtain LSCI images by attaching a camera and a laser diode. This LSCI adapted instrument has been used to acquire full field flow images from 10 patients during tumor resection procedures. The patient's ECG was recorded during acquisition and image registration was performed in post-processing to account for pulsatile motion artifacts. Digital photographs confirmed alignment of vasculature and flow images in four cases, and a relative change in blood flow was observed in two patients after bipolar cautery. The LSCI adapted instrument has the capability to produce real-time, full field CBF image maps with excellent spatial resolution and minimal intervention to the surgical procedure. Results from this study demonstrate the feasibility of using LSCI to monitor blood flow during neurosurgery.
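
    LSCI flow maps are derived from the spatial speckle contrast K = sigma/mean computed in a small sliding window of the raw image, with lower contrast corresponding to faster flow; the sketch below shows that computation on a synthetic image and is not the instrument's processing pipeline.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, window=7):
    """Spatial laser speckle contrast K = sigma / mean computed in a sliding
    window over a raw speckle image; lower K corresponds to faster flow."""
    raw = raw.astype(float)
    mean = uniform_filter(raw, window)
    mean_sq = uniform_filter(raw ** 2, window)
    variance = np.clip(mean_sq - mean ** 2, 0, None)   # guard against round-off
    return np.sqrt(variance) / np.maximum(mean, 1e-12)

# Example on a synthetic speckle-like image (for illustration only).
rng = np.random.default_rng(3)
frame = rng.gamma(shape=1.0, scale=100.0, size=(256, 256))
K = speckle_contrast(frame)
print("median speckle contrast:", round(float(np.median(K)), 3))
```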

  10. Subsidence monitoring and prediction of high-speed railway in Beijing with multitemporal TerraSAR-X data

    NASA Astrophysics Data System (ADS)

    Fan, Zelin; Zhang, Yonghong; Wu, Hong'an; Kang, Yonghui; Jiang, Decai

    2018-02-01

    The uneven settlement of high-speed railways (HSR) poses a great threat to the safe operation of trains; subsidence monitoring and prediction of HSR is therefore of great importance. In this paper, an improved multitemporal InSAR method combining PS-InSAR and SBAS-InSAR, Multiple-master Coherent Target Small-Baseline InSAR (MCTSB-InSAR), is used to monitor the subsidence of sections of the Beijing-Tianjin HSR (BTHSR) and the Beijing-Shanghai HSR (BSHSR) in the Beijing area. Thirty-one TerraSAR-X images from June 2011 to December 2016 are processed with MCTSB-InSAR, and subsidence information for a 56 km x 32 km region of Beijing is extracted. The monitoring results are validated against leveling measurements in this area, with an accuracy of 4.4 mm/year. On the basis of this work, we extract the subsidence information for the sections of the BTHSR and BSHSR in the study area. Finally, we adopt a time-series analysis approach and employ a back-propagation (BP) neural network to model the relationship between past and current settlement. Training and test data sets are constructed from the monitoring results. The experimental results show that the prediction model has good accuracy and applicability.
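
    The prediction step, training a back-propagation network on lagged settlement values to estimate the next epoch, can be sketched as below; the settlement series is synthetic and the network size and lag count are arbitrary choices, not those of the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

# Synthetic cumulative-settlement series (mm) standing in for the InSAR-derived
# time series of one high-speed-railway monitoring point.
t = np.arange(120)
settlement = -0.4 * t - 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)

LAGS = 6  # use the previous six epochs to predict the next one
X = np.array([settlement[i:i + LAGS] for i in range(t.size - LAGS)])
y = settlement[LAGS:]

split = 100 - LAGS                      # train on the first 100 epochs
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"test RMSE: {rmse:.2f} mm")
```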

  11. Integrating modeling, monitoring, and management to reduce critical uncertainties in water resource decision making.

    PubMed

    Peterson, James T; Freeman, Mary C

    2016-12-01

    Stream ecosystems provide multiple, valued services to society, including water supply, waste assimilation, recreation, and habitat for diverse and productive biological communities. Managers striving to sustain these services in the face of changing climate, land uses, and water demands need tools to assess the potential effectiveness of alternative management actions, and often, the resulting tradeoffs between competing objectives. Integrating predictive modeling with monitoring data in an adaptive management framework provides a process by which managers can reduce model uncertainties and thus improve the scientific bases for subsequent decisions. We demonstrate an integration of monitoring data with a dynamic, metapopulation model developed to assess effects of streamflow alteration on fish occupancy in a southeastern US stream system. Although not extensive (collected over three years at nine sites), the monitoring data allowed us to assess and update support for alternative population dynamic models using model probabilities and Bayes rule. We then use the updated model weights to estimate the effects of water withdrawal on stream fish communities and demonstrate how feedback in the form of monitoring data can be used to improve water resource decision making. We conclude that investment in more strategic monitoring, guided by a priori model predictions under alternative hypotheses and an adaptive sampling design, could substantially improve the information available to guide decision-making and management for ecosystem services from lotic systems. Published by Elsevier Ltd.
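
    Updating the weights of alternative models with monitoring data via Bayes' rule reduces, in its simplest form, to multiplying each model's prior weight by the likelihood of the new data under that model and renormalizing, as in the sketch below (the likelihood values are hypothetical placeholders).

```python
import numpy as np

def update_model_weights(prior_weights, likelihoods):
    """Bayes-rule update of model probabilities given the likelihood of the new
    monitoring data under each alternative population-dynamics model."""
    prior_weights = np.asarray(prior_weights, dtype=float)
    likelihoods = np.asarray(likelihoods, dtype=float)
    posterior = prior_weights * likelihoods
    return posterior / posterior.sum()

# Hypothetical example: three alternative models start with equal weight; the
# likelihood values stand in for P(new occupancy data | model).
weights = np.array([1 / 3, 1 / 3, 1 / 3])
for yearly_likelihoods in [(0.02, 0.05, 0.01), (0.03, 0.08, 0.02)]:
    weights = update_model_weights(weights, yearly_likelihoods)
print("updated model weights:", np.round(weights, 3))
```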

  12. Large-scale progenitor cell expansion for multiple donors in a monitored hollow fibre bioreactor.

    PubMed

    Lambrechts, Toon; Papantoniou, Ioannis; Rice, Brent; Schrooten, Jan; Luyten, Frank P; Aerts, Jean-Marie

    2016-09-01

    With the increasing scale in stem cell production, a robust and controlled cell expansion process becomes essential for the clinical application of cell-based therapies. The objective of this work was the assessment of a hollow fiber bioreactor (Quantum Cell Expansion System from Terumo BCT) as a cell production unit for the clinical-scale production of human periosteum derived stem cells (hPDCs). We aimed to demonstrate comparability of bioreactor production to standard culture flask production based on a product characterization in line with the International Society of Cell Therapy in vitro benchmarks and supplemented with a compelling quantitative in vivo bone-forming potency assay. Multiple process read-outs were implemented to track process performance and deal with donor-to-donor-related variation in nutrient needs and harvest timing. The data show that the hollow fiber bioreactor is capable of robustly expanding autologous hPDCs on a clinical scale (yield between 316 million and 444 million cells starting from 20 million after ± 8 days of culture) while maintaining their in vitro quality attributes compared with the standard flask-based culture. The in vivo bone-forming assay on average resulted in 10.3 ± 3.7% and 11.0 ± 3.8% newly formed bone for the bioreactor and standard culture flask respectively. The analysis showed that the Quantum system provides a reproducible cell expansion process in terms of yields and culture conditions for multiple donors. Copyright © 2016 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  13. Final Technical Report. Project Boeing SGS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bell, Thomas E.

    Boeing and its partner, PJM Interconnection, teamed to bring advanced “defense-grade” technologies for cyber security to the US regional power grid through demonstration in PJM’s energy management environment. Under this cooperative project with the Department of Energy, Boeing and PJM have developed and demonstrated a host of technologies specifically tailored to the needs of PJM and the electric sector as a whole. The team has demonstrated to the energy industry a combination of processes, techniques and technologies that have been successfully implemented in the commercial, defense, and intelligence communities to identify, mitigate and continuously monitor the cyber security of critical systems. Guided by the results of a Cyber Security Risk-Based Assessment completed in Phase I, the Boeing-PJM team has completed multiple iterations through the Phase II Development and Phase III Deployment phases. Multiple cyber security solutions have been completed across a variety of controls including: Application Security, Enhanced Malware Detection, Security Incident and Event Management (SIEM) Optimization, Continuous Vulnerability Monitoring, SCADA Monitoring/Intrusion Detection, Operational Resiliency, Cyber Range simulations and hands-on cyber security personnel training. All of the developed and demonstrated solutions are suitable for replication across the electric sector and/or the energy sector as a whole. Benefits identified include: improved malware and intrusion detection capability on critical SCADA networks, including behavioral-based alerts, resulting in improved zero-day threat protection; improved Security Incident and Event Management system resulting in better threat visibility, thus increasing the likelihood of detecting a serious event; improved malware detection and zero-day threat response capability; improved ability to systematically evaluate and secure in-house and vendor-sourced software applications; improved ability to continuously monitor and maintain secure configuration of network devices, resulting in reduced vulnerabilities for potential exploitation; improved overall cyber security situational awareness through the integration of multiple discrete security technologies into a single cyber security reporting console; improved ability to maintain the resiliency of critical systems in the face of a targeted cyber attack or other significant event; and improved ability to model complex networks for penetration testing and advanced training of cyber security personnel.

  14. Healthcare Energy End-Use Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheppy, M.; Pless, S.; Kung, F.

    NREL partnered with two hospitals (MGH and SUNY UMU) to collect data on the energy used for multiple thermal and electrical end-use categories, including preheat, heating, and reheat; humidification; service water heating; cooling; fans; pumps; lighting; and select plug and process loads. Additional data from medical office buildings were provided for an analysis focused on plug loads. Facility managers, energy managers, and engineers in the healthcare sector will be able to use these results to more effectively prioritize and refine the scope of investments in new metering and energy audits.

  15. COMPARISON OF MULTIPLE POINT AND COMPOSITE SAMPLING FOR THE PURPOSE OF MONITORING BATHING WATER QUALITY

    EPA Science Inventory

    The USEPA Beaches Environmental Assessment and Coastal Health Act (BEACH Act) requires states to develop monitoring and notification programs for recreational waters using approved bacterial indicators. Implementation of an appropriate monitoring program can, under some circumsta...

  16. Effects of speech intelligibility level on concurrent visual task performance.

    PubMed

    Payne, D G; Peters, L J; Birkmire, D P; Bonto, M A; Anastasi, J S; Wenger, M J

    1994-09-01

    Four experiments were performed to determine if changes in the level of speech intelligibility in an auditory task have an impact on performance in concurrent visual tasks. The auditory task used in each experiment was a memory search task in which subjects memorized a set of words and then decided whether auditorily presented probe items were members of the memorized set. The visual tasks used were an unstable tracking task, a spatial decision-making task, a mathematical reasoning task, and a probability monitoring task. Results showed that performance on the unstable tracking and probability monitoring tasks was unaffected by the level of speech intelligibility on the auditory task, whereas accuracy in the spatial decision-making and mathematical processing tasks was significantly worse at low speech intelligibility levels. The findings are interpreted within the framework of multiple resource theory.

  17. FAST TRACK COMMUNICATION Generation of stable multi-jets by flow-limited field-injection electrostatic spraying and their control via I-V characteristics

    NASA Astrophysics Data System (ADS)

    Gu, W.; Heil, P. E.; Choi, H.; Kim, K.

    2010-12-01

    The I-V characteristics of flow-limited field-injection electrostatic spraying (FFESS) were investigated, exposing a new way to predict and control the specific spraying modes from single-jet to multi-jet. Monitoring the I-V characteristics revealed characteristic drops in the current upon formation of an additional jet in the multi-jet spraying mode. For fixed jet numbers, space-charge-limited current behaviour was measured which was attributed to space charge in the dielectric liquids between the needle electrode and the nozzle opening. The present work establishes that FFESS can, in particular, generate stable multiple jets and that their control is possible through monitoring the I-V characteristics. This can allow for automatic control of the FFESS process and expedite its future scientific and industrial applications.

  18. Health-Enabled Smart Sensor Fusion Technology

    NASA Technical Reports Server (NTRS)

    Wang, Ray

    2012-01-01

    A process was designed to fuse data from multiple sensors in order to make a more accurate estimation of the environment and overall health in an intelligent rocket test facility (IRTF), to provide reliable, high-confidence measurements for a variety of propulsion test articles. The object of the technology is to provide sensor fusion based on a distributed architecture. Specifically, the fusion technology is intended to succeed in providing health condition monitoring capability at the intelligent transceiver, such as RF signal strength, battery reading, computing resource monitoring, and sensor data reading. The technology also provides analytic and diagnostic intelligence at the intelligent transceiver, enhancing the IEEE 1451.x-based standard for sensor data management and distributions, as well as providing appropriate communications protocols to enable complex interactions to support timely and high-quality flow of information among the system elements.

  19. Evaluating Dense 3d Reconstruction Software Packages for Oblique Monitoring of Crop Canopy Surface

    NASA Astrophysics Data System (ADS)

    Brocks, S.; Bareth, G.

    2016-06-01

    Crop Surface Models (CSMs) are 2.5D raster surfaces representing absolute plant canopy height. Using multiple CSMs generated from data acquired at multiple time steps, crop surface monitoring is enabled. This makes it possible to monitor crop growth over time and can be used for monitoring in-field crop growth variability, which is useful in the context of high-throughput phenotyping. This study aims to evaluate several software packages for dense 3D reconstruction from multiple overlapping RGB images on field and plot scale. A summer barley field experiment located at the Campus Klein-Altendorf of the University of Bonn was observed by acquiring stereo images from an oblique angle using consumer-grade smart cameras. Two such cameras were mounted at an elevation of 10 m and acquired images for a period of two months during the growing period of 2014. The field experiment consisted of nine barley cultivars that were cultivated in multiple repetitions and nitrogen treatments. Manual plant height measurements were carried out at four dates during the observation period. The software packages Agisoft PhotoScan, VisualSfM with CMVS/PMVS2 and SURE are investigated. The point clouds are georeferenced through a set of ground control points. Where adequate results are reached, a statistical analysis is performed.

  20. A novel instrument for studying the flow behaviour of erythrocytes through microchannels simulating human blood capillaries.

    PubMed

    Sutton, N; Tracey, M C; Johnston, I D; Greenaway, R S; Rampling, M W

    1997-05-01

    A novel instrument has been developed to study the microrheology of erythrocytes as they flow through channels of dimensions similar to human blood capillaries. The channels are produced in silicon substrates using microengineering technology. Accurately defined, physiological driving pressures and temperatures are employed whilst precise, real-time image processing allows individual cells to be monitored continuously during their transit. The instrument characterises each cell in a sample of ca. 1000 in terms of its volume and flow velocity profile during its transit through a channel. The unique representation of the data in volume/velocity space provides new insight into the microrheological behaviour of blood. The image processing and subsequent data analysis enable the system to reject anomalous events such as multiple cell transits, thereby ensuring integrity of the resulting data. By employing an array of microfluidic flow channels we can integrate a number of different but precise and highly reproducible channel sizes and geometries within one array, thereby allowing multiple, concurrent isobaric measurements on one sample. As an illustration of the performance of the system, volume/velocity data sets recorded in a microfluidic device incorporating multiple channels of 100 microns length and individual widths ranging between 3.0 and 4.0 microns are presented.

  1. High-efficiency integrated piezoelectric energy harvesting systems

    NASA Astrophysics Data System (ADS)

    Hande, Abhiman; Shah, Pradeep

    2010-04-01

    This paper describes the hierarchically architectured development of an energy harvesting (EH) system that consists of micro- and/or macro-scale harvesters matched to multiple components of remote wireless sensor and communication nodes. The micro-scale harvesters consist of thin-film MEMS piezoelectric cantilever arrays and power generation modules in IC-like form to allow efficient EH from vibrations. The design uses new high-conversion-efficiency thin-film processes combined with novel cantilever structures tuned to multiple resonant frequencies as broadband arrays. The macro-scale harvesters are used to power the collector nodes that have higher power specifications. These bulk harvesters can be integrated with efficient adaptive power management circuits that match transducer impedance and maximize the power harvested from multiple scavenging sources with very low intrinsic power consumption. Texas MicroPower, Inc. is developing a process based on a composition that has the highest reported energy density compared with other commercially available bulk PZT-based sensor/actuator ceramic materials, and is extending it to thin-film materials and miniature conversion transducer structures. The multi-form-factor harvesters can be deployed for several military and commercial applications such as underground unattended sensors, sensors in oil rigs, structural health monitoring, supply chain management, and battlefield applications such as sensors on soldier apparel, equipment, and wearable electronics.

  2. Integrated multi-sensor package (IMSP) for unmanned vehicle operations

    NASA Astrophysics Data System (ADS)

    Crow, Eddie C.; Reichard, Karl; Rogan, Chris; Callen, Jeff; Seifert, Elwood

    2007-10-01

    This paper describes recent efforts to develop integrated multi-sensor payloads for small robotic platforms for improved operator situational awareness and ultimately for greater robot autonomy. The focus is on enhancements to perception through integration of electro-optic, acoustic, and other sensors for navigation and inspection. The goals are to provide easier control and operation of the robot through fusion of multiple sensor outputs, to improve interoperability of the sensor payload package across multiple platforms through the use of open standards and architectures, and to reduce integration costs by embedded sensor data processing and fusion within the sensor payload package. The solutions investigated in this project to be discussed include: improved capture, processing and display of sensor data from multiple, non-commensurate sensors; an extensible architecture to support plug and play of integrated sensor packages; built-in health, power and system status monitoring using embedded diagnostics/prognostics; sensor payload integration into standard product forms for optimized size, weight and power; and the use of the open Joint Architecture for Unmanned Systems (JAUS)/ Society of Automotive Engineers (SAE) AS-4 interoperability standard. This project is in its first of three years. This paper will discuss the applicability of each of the solutions in terms of its projected impact to reducing operational time for the robot and teleoperator.

  3. Research on a Banknote Printing Wastewater Monitoring System based on Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Li, B. B.; Yuan, Z. F.

    2006-10-01

    In this paper, a banknote printing wastewater monitoring system based on a WSN is presented in line with the system demands and the actual worksite conditions of a banknote printing factory. In the physical layer, the network node is an nRF9e5-centric embedded instrument that realizes functions such as data collection, status monitoring, and wireless data transmission. Limited by computing capability, memory, communication energy, and other factors, a node cannot obtain detailed information about the entire network, so the WSN communication protocol cannot be very complicated. A contention-based MACA (Multiple Access with Collision Avoidance) protocol is introduced at the MAC layer; it decides the communication process and working mode of the nodes and avoids data transmission collisions as well as the hidden- and exposed-station problems. At the network layer, the routing protocol is in charge of the data transmission path, and the network topology is arranged based on address assignment. With some redundant nodes, the network remains stable and expandable. The wastewater monitoring system is a tentative engineering application of WSN theory; the system has passed testing and proved effective.

  4. Recent Research applications at the Athens Neutron Monitor Station

    NASA Astrophysics Data System (ADS)

    Mavromichalaki, H.; Gerontidou, M.; Paschalis, P.; Papaioannou, A.; Paouris, E.; Papailiou, M.; Souvatzoglou, G.

    2015-08-01

    Ground-based neutron monitor measurements play a key role in the fields of space physics, solar-terrestrial relations, and space weather applications. The Athens cosmic ray group has developed several research applications, such as an optimized automated Ground Level Enhancement Alert (GLE Alert Plus) and a web interface providing data from multiple neutron monitor stations (Multi-Station tool). These services are available via the Space Weather Portal operated by the European Space Agency (http://swe.ssa.esa.int). In addition, two simulation tools based on Geant4 have also been implemented: the first simulates cosmic ray showers in the atmosphere (DYASTIMA) and the second simulates the 6NM-64 neutron monitor. The contribution of the simulation tools to calculating the radiation dose received by air crews and passengers within the Earth's atmosphere and to neutron monitor studies is presented as well. Furthermore, the accurate calculation of the barometric coefficient and the primary data processing by filtering algorithms, such as the well-known Median Editor and the ANN Algorithm and Edge Editor developed by the Athens group, which contribute to the provision of high-quality neutron monitor data, are also discussed. Finally, a Space Weather Forecasting Center which provides a three-day geomagnetic activity report on a daily basis has been set up and has been operating for the last two years at the Athens Neutron Monitor Station.
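
    As an illustration of the barometric correction mentioned above, neutron monitor count rates are conventionally corrected to a reference pressure with an exponential factor; the sketch below uses a typical textbook coefficient rather than the station-specific value derived by the group.

```python
import math

def pressure_corrected_rate(count_rate, pressure_mb, beta_per_mb=0.0072,
                            reference_pressure_mb=1013.25):
    """Standard barometric correction of a neutron monitor count rate:
    N_corr = N * exp(beta * (P - P_ref)). The coefficient here is a typical
    illustrative value, not the station-specific one derived in the paper."""
    return count_rate * math.exp(beta_per_mb *
                                 (pressure_mb - reference_pressure_mb))

# Example: a raw rate of 950 counts/s recorded at 1005 mb station pressure.
print(round(pressure_corrected_rate(950.0, 1005.0), 1))
```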

  5. Calibration and validation of wearable monitors.

    PubMed

    Bassett, David R; Rowlands, Alex; Trost, Stewart G

    2012-01-01

    Wearable monitors are increasingly being used to objectively monitor physical activity in research studies within the field of exercise science. Calibration and validation of these devices are vital to obtaining accurate data. This article is aimed primarily at the physical activity measurement specialist, although the end user who is conducting studies with these devices also may benefit from knowing about this topic. Initially, wearable physical activity monitors should undergo unit calibration to ensure interinstrument reliability. The next step is to simultaneously collect both raw signal data (e.g., acceleration) from the wearable monitors and rates of energy expenditure, so that algorithms can be developed to convert the direct signals into energy expenditure. This process should use multiple wearable monitors and a large and diverse subject group and should include a wide range of physical activities commonly performed in daily life (from sedentary to vigorous). New methods of calibration now use "pattern recognition" approaches to train the algorithms on various activities, and they provide estimates of energy expenditure that are much better than those previously available with the single-regression approach. Once a method of predicting energy expenditure has been established, the next step is to examine its predictive accuracy by cross-validating it in other populations. In this article, we attempt to summarize the best practices for calibration and validation of wearable physical activity monitors. Finally, we conclude with some ideas for future research that will move the field of physical activity measurement forward.

  6. A distributed pipeline for DIDSON data processing

    USGS Publications Warehouse

    Li, Liling; Danner, Tyler; Eickholt, Jesse; McCann, Erin L.; Pangle, Kevin; Johnson, Nicholas

    2018-01-01

    Technological advances in the field of ecology allow data on ecological systems to be collected at high resolution, both temporally and spatially. Devices such as Dual-frequency Identification Sonar (DIDSON) can be deployed in aquatic environments for extended periods and easily generate several terabytes of underwater surveillance data which may need to be processed multiple times. Due to the large amount of data generated and need for flexibility in processing, a distributed pipeline was constructed for DIDSON data making use of the Hadoop ecosystem. The pipeline is capable of ingesting raw DIDSON data, transforming the acoustic data to images, filtering the images, detecting and extracting motion, and generating feature data for machine learning and classification. All of the tasks in the pipeline can be run in parallel and the framework allows for custom processing. Applications of the pipeline include monitoring migration times, determining the presence of a particular species, estimating population size and other fishery management tasks.
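
    A minimal, single-machine sketch of how such a stage-wise pipeline can be composed and run over several files in parallel is shown below; the stage functions and file names are hypothetical stand-ins, and the actual system distributes this work across the Hadoop ecosystem rather than a local process pool.

```python
from multiprocessing import Pool

# Minimal stand-ins for the pipeline stages described above; real
# implementations would decode DIDSON frames, filter the images, and
# detect motion inside distributed (Hadoop/Spark) jobs.
def frames_from_didson(path):
    return [[0.0] * 4 for _ in range(3)]    # placeholder "frames"

def denoise(frame):
    return [max(v, 0.0) for v in frame]     # placeholder filter

def extract_motion(frames):
    # placeholder: frame-to-frame differences as a crude motion signal
    return [sum(abs(a - b) for a, b in zip(f1, f2))
            for f1, f2 in zip(frames, frames[1:])]

def features(motion):
    return {"mean_motion": sum(motion) / len(motion)}

def process_file(path):
    """Run one DIDSON file through the full chain of stages."""
    frames = [denoise(f) for f in frames_from_didson(path)]
    return features(extract_motion(frames))

if __name__ == "__main__":
    paths = ["survey_001.ddf", "survey_002.ddf"]   # hypothetical file names
    with Pool(processes=2) as pool:                # files processed in parallel
        print(pool.map(process_file, paths))
```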

  7. Finding Order in Randomness: Single-Molecule Studies Reveal Stochastic RNA Processing | Center for Cancer Research

    Cancer.gov

    Producing a functional eukaryotic messenger RNA (mRNA) requires the coordinated activity of several large protein complexes to initiate transcription, elongate nascent transcripts, splice together exons, and cleave and polyadenylate the 3’ end. Kinetic competition between these various processes has been proposed to regulate mRNA maturation, but this model could lead to multiple, randomly determined, or stochastic, pathways or outcomes. Regulatory checkpoints have been suggested as a means of ensuring quality control. However, current methods have been unable to tease apart the contributions of these processes at a single gene or on a time scale that could provide mechanistic insight. To begin to investigate the kinetic relationship between transcription and splicing, Daniel Larson, Ph.D., of CCR’s Laboratory of Receptor Biology and Gene Expression, and his colleagues employed a single-molecule RNA imaging approach to monitor production and processing of a human β-globin reporter gene in living cells.

  8. In-line moisture monitoring in fluidized bed granulation using a novel multi-resonance microwave sensor.

    PubMed

    Peters, Johanna; Bartscher, Kathrin; Döscher, Claas; Taute, Wolfgang; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg

    2017-08-01

    Microwave resonance technology (MRT) is known as a process analytical technology (PAT) tool for moisture measurements in fluid-bed granulation. It offers great potential for wet granulation processes even where the suitability of near-infrared (NIR) spectroscopy is limited, e.g. colored granules or large variations in bulk density. However, previous sensor systems operating around a single resonance frequency showed limitations above approx. 7.5% granule moisture. This paper describes the application of a novel sensor working with four resonance frequencies. In-line data from all four resonance frequencies were collected and further processed. Based on the calculated density-independent microwave moisture values, multiple linear regression (MLR) models were built using Karl-Fischer titration (KF) as well as loss on drying (LOD) as reference methods. Rapid, reliable in-process moisture control (RMSEP ≤ 0.5%) even at higher moisture contents was achieved.
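
    A minimal sketch of the MLR calibration step is given below, assuming hypothetical density-independent moisture values derived from the four resonance frequencies and matching Karl-Fischer reference values; the numbers are illustrative only.

```python
import numpy as np

# Hypothetical in-line data: one density-independent microwave moisture value
# per resonance frequency (4 columns) and the Karl Fischer reference moisture
# (% w/w) for the same samples.
mw_moisture = np.array([[0.21, 0.19, 0.22, 0.20],
                        [0.35, 0.33, 0.36, 0.34],
                        [0.52, 0.50, 0.55, 0.51],
                        [0.68, 0.66, 0.70, 0.67],
                        [0.84, 0.80, 0.86, 0.82]])
y_kf = np.array([2.1, 4.0, 6.2, 8.1, 10.3])

# Multiple linear regression with an intercept term.
A = np.column_stack([np.ones(len(mw_moisture)), mw_moisture])
coef, *_ = np.linalg.lstsq(A, y_kf, rcond=None)

pred = A @ coef
rmse = np.sqrt(np.mean((pred - y_kf) ** 2))   # in-sample error; a real RMSEP
print(coef, rmse)                             # would use held-out batches
```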

  9. Optical coherence tomography angiography monitors human cutaneous wound healing over time.

    PubMed

    Deegan, Anthony J; Wang, Wendy; Men, Shaojie; Li, Yuandong; Song, Shaozhen; Xu, Jingjiang; Wang, Ruikang K

    2018-03-01

    In vivo imaging of the complex cascade of events known to be pivotal elements in the healing of cutaneous wounds is a difficult but essential task. Current techniques are highly invasive, or lack the level of vascular and structural detail required for accurate evaluation, monitoring and treatment. We aimed to use an advanced optical coherence tomography (OCT)-based angiography (OCTA) technique for the non-invasive, high resolution imaging of cutaneous wound healing. We used a clinical prototype OCTA to image, identify and track key vascular and structural adaptations known to occur throughout the healing process. Specific vascular parameters, such as diameter and density, were measured to aid our interpretations under a spatiotemporal framework. We identified multiple distinct, yet overlapping stages, hemostasis, inflammation, proliferation, and remodeling, and demonstrated the detailed vascularization and anatomical attributes underlying the multifactorial processes of dermatologic wound healing. OCTA provides an opportunity to both qualitatively and quantitatively assess the vascular response to acute cutaneous damage and in the future, may help to ascertain wound severity and possible healing outcomes; thus, enabling more effective treatment options.

  10. Optical coherence tomography angiography monitors human cutaneous wound healing over time

    PubMed Central

    Deegan, Anthony J.; Wang, Wendy; Men, Shaojie; Li, Yuandong; Song, Shaozhen; Xu, Jingjiang

    2018-01-01

    Background In vivo imaging of the complex cascade of events known to be pivotal elements in the healing of cutaneous wounds is a difficult but essential task. Current techniques are highly invasive, or lack the level of vascular and structural detail required for accurate evaluation, monitoring and treatment. We aimed to use an advanced optical coherence tomography (OCT)-based angiography (OCTA) technique for the non-invasive, high resolution imaging of cutaneous wound healing. Methods We used a clinical prototype OCTA to image, identify and track key vascular and structural adaptations known to occur throughout the healing process. Specific vascular parameters, such as diameter and density, were measured to aid our interpretations under a spatiotemporal framework. Results We identified multiple distinct, yet overlapping stages, hemostasis, inflammation, proliferation, and remodeling, and demonstrated the detailed vascularization and anatomical attributes underlying the multifactorial processes of dermatologic wound healing. Conclusions OCTA provides an opportunity to both qualitatively and quantitatively assess the vascular response to acute cutaneous damage and in the future, may help to ascertain wound severity and possible healing outcomes; thus, enabling more effective treatment options. PMID:29675355

  11. Method & apparatus for monitoring plasma processing operations

    DOEpatents

    Smith, Jr., Michael Lane; Ward, Pamela Denise; Stevenson, Joel O'Don

    2004-10-19

    The invention generally relates to various aspects of a plasma process and, more specifically, to the monitoring of such plasma processes. One aspect relates to a plasma monitoring module that may be adjusted in at least some manner so as to re-evaluate a previously monitored plasma process. For instance, optical emissions data on a plasma process that was previously monitored by the plasma monitoring module may be replayed through the plasma monitoring module after making at least one adjustment in relation to the plasma monitoring module.

  12. Monitoring snowmelt and solute transport at Oslo airport by combining time-lapse electrical resistivity, soil water sampling and tensiometer measurements

    NASA Astrophysics Data System (ADS)

    Bloem, E.; French, H. K.

    2013-12-01

    Monitoring contaminant transport at contaminated sites requires optimizing the configuration of a limited number of sampling points in the presence of heterogeneous flow and preferential flowpaths. Monitoring processes in the unsaturated zone is especially challenging because of the limited volume monitored by, for example, suction cups and their risk of clogging in a highly active degradation zone. To make progress on soil contamination assessment and site characterization, there is a strong need to integrate field-scale, extensively instrumented tools with non-invasive (geophysical) methods that provide spatially integrated measurements, also in the unsaturated zone. Examples of sites that might require monitoring activities in the unsaturated zone are airports with winter frost where large quantities of de-icing chemicals are used each winter; salt and contaminant infiltration along roads; and constructed infiltration systems for treatment of sewerage or landfill seepage. Electrical resistivity methods have proved useful as an indirect measurement of subsurface properties and processes at the field scale. The non-uniqueness of the interpretation techniques can be reduced by constraining the inversion through the addition of independent geophysical measurements along the same profile, or interpretation and understanding of geophysical images can be improved by combining them with classical measurements of soil physical properties, soil suction, contaminant concentration, and temperature. In our experiment, at the research field station at Gardermoen, Oslo airport, we applied a degradable de-icing chemical and an inactive tracer to the snow cover prior to snowmelt. To study the solute transport processes in the unsaturated zone, time-lapse cross-borehole electrical resistivity tomography (ERT) measurements were conducted at the same time as soil water samples were extracted at multiple depths with suction cups. Measurements of soil temperature and soil tension were also carried out during the monitoring period. We present a selection of results from the snowmelt experiments and show how the combination of measurement techniques can help interpret and understand the relative importance of the various contributions to the bulk electrical conductivity during snowmelt and solute transport.

  13. Towards a Near Real-Time Satellite-Based Flux Monitoring System for the MENA Region

    NASA Astrophysics Data System (ADS)

    Ershadi, A.; Houborg, R.; McCabe, M. F.; Anderson, M. C.; Hain, C.

    2013-12-01

    Satellite remote sensing has the potential to offer spatially and temporally distributed information on land surface characteristics, which may be used as inputs and constraints for estimating land surface fluxes of carbon, water and energy. Enhanced satellite-based monitoring systems for aiding local water resource assessments and agricultural management activities are particularly needed for the Middle East and North Africa (MENA) region. The MENA region is an area characterized by limited fresh water resources, an often inefficient use of these, and relatively poor in-situ monitoring as a result of sparse meteorological observations. To address these issues, an integrated modeling approach for near real-time monitoring of land surface states and fluxes at fine spatio-temporal scales over the MENA region is presented. This approach is based on synergistic application of multiple sensors and wavebands in the visible to shortwave infrared and thermal infrared (TIR) domain. The multi-scale flux mapping and monitoring system uses the Atmosphere-Land Exchange Inverse (ALEXI) model and associated flux disaggregation scheme (DisALEXI), and the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) in conjunction with model reanalysis data and multi-sensor remotely sensed data from polar orbiting (e.g. Landsat and MODerate resolution Imaging Spectroradiometer (MODIS)) and geostationary (MSG; Meteosat Second Generation) satellite platforms to facilitate time-continuous (i.e. daily) estimates of field-scale water, energy and carbon fluxes. Within this modeling system, TIR satellite data provide information about the sub-surface moisture status and plant stress, obviating the need for precipitation input and a detailed soil surface characterization (i.e. for prognostic modeling of soil transport processes). The STARFM fusion methodology blends aspects of high frequency (spatially coarse) and spatially fine resolution sensors and is applied directly to flux output fields to facilitate daily mapping of fluxes at sub-field scales. A complete processing infrastructure to automatically ingest and pre-process all required input data and to execute the integrated modeling system for near real-time agricultural monitoring purposes over targeted MENA sites is being developed, and initial results from this concerted effort will be discussed.

  14. Evaluation of a multiple-species approach to monitoring species at the ecoregional scale

    Treesearch

    Patricia N. Manley; William J. Zielinski; Matthew D. Schlesinger; Sylvia R. Mori

    2004-01-01

    Monitoring is required of land managers and conservation practitioners to assess the success of management actions. "Shortcuts" are sought to reduce monitoring costs, most often consisting of the selection of a small number of species that are closely monitored to represent the status of many associated species and environmental correlates. Assumptions...

  15. Curriculum-Based Measurement of Oral Reading: Quality of Progress Monitoring Outcomes

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Zopluoglu, Cengiz; Long, Jeffery D.; Monaghen, Barbara D.

    2012-01-01

    Curriculum-based measurement of oral reading (CBM-R) is frequently used to set student goals and monitor student progress. This study examined the quality of growth estimates derived from CBM-R progress monitoring data. The authors used a linear mixed effects regression (LMER) model to simulate progress monitoring data for multiple levels of…

  16. 40 CFR 75.72 - Determination of NOX mass emissions for common stack and multiple stack configurations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the hourly stack flow rate (in scfh). Only one methodology for determining NOX mass emissions shall be...-diluent continuous emissions monitoring system and a flow monitoring system in the common stack, record... maintain a flow monitoring system and diluent monitor in the duct to the common stack from each unit; or...

  17. The Joint Experiment for Crop Assessment and Monitoring (JECAM): Update on Multisite Inter-comparison Experiments

    NASA Astrophysics Data System (ADS)

    Jarvis, I.; Gilliams, S. J. B.; Defourny, P.

    2016-12-01

    Globally there is significant convergence on agricultural monitoring research questions. The focus of interest usually revolves around crop type, crop area estimation, and near real time crop condition and yield forecasting. Notwithstanding this convergence, agricultural systems differ significantly throughout the world, reflecting the diversity of ecosystems they are located in. Consequently, a global system of systems for operational monitoring must be based on multiple approaches. Research is required to compare and assess these approaches to identify which are most appropriate for any given location. To this end the Joint Experiments for Crop Assessment and Monitoring (JECAM) was established in 2009 as a research platform to allow the global agricultural monitoring community to work towards a set of best practices and recommendations for using earth observation data to map, monitor and report on agricultural productivity globally. The JECAM initiative brings together researchers from a large number of globally distributed, well-monitored agricultural test sites that cover a range of crop types, cropping systems and climate regimes. The results of JECAM optical inter-comparison research taking place in the Stimulating Innovation for Global Monitoring of Agriculture (SIGMA) project and the Sentinel-2 for Agriculture project will be discussed. The presentation will also highlight upcoming work on a Synthetic Aperture Radar (SAR) inter-comparison study. The outcomes of these projects will be a set of best practices that cover the range of remote sensing monitoring and reporting needs, including satellite data acquisition, pre-processing techniques, information retrieval and ground data validation. These outcomes provide the R&D foundation for GEOGLAM and will help to inform the development of the GEOGLAM system of systems for global agricultural monitoring.

  18. Monitoring Cloud-prone Complex Landscapes At Multiple Spatial Scales Using Medium And High Resolution Optical Data: A Case Study In Central Africa

    NASA Astrophysics Data System (ADS)

    Basnet, Bikash

    Tracking land surface dynamics over cloud-prone areas with complex mountainous terrain and a landscape that is heterogeneous at a scale of approximately 10 m is an important challenge in the remote sensing of tropical regions in developing nations, due to the small plot sizes. Persistent monitoring of natural resources in these regions at multiple spatial scales requires development of tools to identify emerging land cover transformation due to anthropogenic causes, such as agricultural expansion and climate change. Along with the cloud cover and obstructions by topographic distortions due to steep terrain, there are limitations to the accuracy of monitoring change using available historical satellite imagery, largely due to sparse data access and the lack of high quality ground truth for classifier training. One such complex region is the Lake Kivu region in Central Africa. This work addressed these problems to create an effective process for monitoring the Lake Kivu region. The region is a biodiversity hotspot with a complex and heterogeneous landscape and intensive agricultural development, where individual plot sizes are often at the scale of 10 m. Procedures were developed that use optical data from satellite and aerial observations at multiple scales to tackle the monitoring challenges. First, a novel processing chain was developed to systematically monitor the spatio-temporal land cover dynamics of this region over the years 1988, 2001, and 2011 using Landsat data, complemented by ancillary data. Topographic compensation was performed on Landsat reflectances to avoid the strong illumination angle impacts, and image compositing was used to compensate for frequent cloud cover and thus incomplete annual data availability in the archive. A systematic supervised classification, using the state-of-the-art machine learning classifier Random Forest, was applied to the composite Landsat imagery to obtain land cover thematic maps with overall accuracies of 90% and higher. Subsequent change analysis between these years found extensive conversions of the natural environment as a result of human-related activities. The gross forest cover loss for the 1988-2001 and 2001-2011 periods was 216.4 and 130.5 thousand hectares, respectively, indicating substantial deforestation in the period of civil war and a relatively stable, lower deforestation rate later, possibly due to conservation and reforestation efforts in the region. The other dominant land cover changes in the region were aggressive subsistence farming and urban expansion displacing natural vegetation and arable lands. Despite limited data availability, this study fills the gap of much-needed detailed and updated land cover change information for this biologically important region of Central Africa. While useful on a regional scale, Landsat data can be inadequate for more detailed studies of land cover change. Based on an increasing availability of high resolution imagery and light detection and ranging (LiDAR) data from manned and unmanned aerial platforms (<1 m resolution), a study was performed leading to a novel generic framework for land cover monitoring at fine spatial scales. The approach fuses high spatial resolution aerial imagery and LiDAR data to produce land cover maps with high spatial detail using object-based image analysis techniques.
The classification framework was tested for a scene with both natural and cultural features and was found to be more than 90 percent accurate, sufficient for detailed land cover change studies.
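
    The supervised classification step can be sketched with a random forest applied to per-pixel band values, as below; the synthetic pixels, labels, and class scheme are placeholders, so the printed accuracy is meaningful only as a demonstration of the workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical training data: pixel spectra from a topographically corrected
# Landsat composite (6 reflective bands) with land cover labels digitized
# from reference imagery.
rng = np.random.default_rng(42)
pixels = rng.random((1000, 6))
labels = rng.integers(0, 4, 1000)   # e.g. forest, cropland, urban, water

X_train, X_test, y_train, y_test = train_test_split(
    pixels, labels, test_size=0.3, random_state=42)

clf = RandomForestClassifier(n_estimators=500, random_state=42)
clf.fit(X_train, y_train)

print("Overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
# Applying clf.predict to every pixel of a full composite would produce the
# land cover thematic map described above.
```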

  19. Ground-based Space Weather Monitoring with LOFAR

    NASA Astrophysics Data System (ADS)

    Wise, Michael; van Haarlem, Michiel; Lawrence, Gareth; Reid, Simon; Bos, Andre; Rawlings, Steve; Salvini, Stef; Mitchell, Cathryn; Soleimani, Manuch; Amado, Sergio; Teresa, Vital

    As one of the first of a new generation of radio instruments, the International LOFAR Telescope (ILT) will provide a number of unique and novel capabilities for the astronomical community. These include remote configuration and operation, dynamic real-time processing and system response, and the ability to provide multiple simultaneous streams of data to a community whose scientific interests run the gamut from lightning in the atmospheres of distant planets to the origins of the universe itself. The LOFAR (LOw Frequency ARray) system is optimized for a frequency range from 30-240 MHz and consists of multiple antenna fields spread across Europe. In the Netherlands, a total of 36 LOFAR stations are nearing completion with an initial 8 international stations currently being deployed in Germany, France, Sweden, and the UK. Digital beam-forming techniques make the LOFAR system agile and allow for rapid repointing of the telescope as well as the potential for multiple simultaneous observations. With its dense core array and long interferometric baselines, LOFAR has the potential to achieve unparalleled sensitivity and spatial resolution in the low frequency radio regime. LOFAR will also be one of the first radio observatories to feature automated processing pipelines to deliver fully calibrated science products to its user community. As we discuss in this presentation, the same capabilities that make LOFAR a powerful tool for radio astronomy also provide an excellent platform upon which to build a ground-based monitoring system for space weather events. For example, the ability to monitor Solar activity in near real-time is one of the key scientific capabilities being developed for LOFAR. With only a fraction of its total observing capacity, LOFAR will be able to provide continuous monitoring of the Solar spectrum over the entire 10-240 MHz band down to microsecond timescales. Autonomous routines will scan these incoming spectral data for evidence of Solar flares and be capable of generating various responses, including alerting external observatories or reallocating internal observing capacity to create short cadence (1-10 sec) images of the Sun. More uniquely, the core processing capability already developed by LOFAR to produce astronomical images of the sky makes an excellent framework on which to build a near real-time ionospheric monitor and thereby study the effects of space weather events on our atmosphere. One of the key technical challenges to producing high quality scientific images in the low frequency radio regime is the effect of the active ionosphere over the detector array on signal propagation through the earth's atmosphere. To correct for these effects, the current LOFAR system includes an adaptive calibration employing both single and multi-layer phase screen models for the ionosphere. The output of this calibration automatically produces continuous ionospheric measurements with a data cadence in seconds. Although limited to the sky over the array, the resulting TEC maps can have vertical and horizontal resolutions down to 2 m and relative accuracies of 0.001 TECU. The intent is to publish both Solar and ionospheric data-streams to the space weather community, providing an excellent complement to existing space-based monitoring assets. In this presentation, we will describe the current and planned capabilities of the LOFAR system as well as show some first examples of the potential data products taken during the ongoing commissioning phase.
We will also discuss plans to build upon the current LOFAR infrastructure and provide a source of near real-time monitoring data to the space weather community.

  20. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    PubMed

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. Method parameters, including sample extraction, concentration, and digestion, were systematically evaluated and optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  1. Application of stable isotope ratio analysis for biodegradation monitoring in groundwater

    USGS Publications Warehouse

    Hatzinger, Paul B.; Böhlke, John Karl; Sturchio, Neil C.

    2013-01-01

    Stable isotope ratio analysis is increasingly being applied as a tool to detect, understand, and quantify biodegradation of organic and inorganic contaminants in groundwater. An important feature of this approach is that it allows degradative losses of contaminants to be distinguished from those caused by non-destructive processes such as dilution, dispersion, and sorption. Recent advances in analytical techniques, and new approaches for interpreting stable isotope data, have expanded the utility of this method while also exposing complications and ambiguities that must be considered in data interpretations. Isotopic analyses of multiple elements in a compound, and multiple compounds in the environment, are being used to distinguish biodegradative pathways by their characteristic isotope effects. Numerical models of contaminant transport, degradation pathways, and isotopic composition are improving quantitative estimates of in situ contaminant degradation rates under realistic environmental conditions.
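
    One widely used interpretive approach in this field, not necessarily the specific models discussed by the authors, is the Rayleigh fractionation model, which converts an observed isotopic shift into an estimate of the fraction of contaminant degraded. The enrichment factor and isotope values below are hypothetical.

```python
import numpy as np

def fraction_remaining(delta_t, delta_0, epsilon_permil):
    """Rayleigh-model estimate of the fraction of contaminant remaining.

    delta_t, delta_0: measured and initial isotope ratios (per mil).
    epsilon_permil: isotopic enrichment factor for the degradation pathway
    (per mil, typically negative for normal isotope effects).
    """
    return np.exp((delta_t - delta_0) / epsilon_permil)

# Hypothetical example: delta13C shifts from -27 to -20 per mil with an
# assumed enrichment factor of -10 per mil for the biodegradation pathway.
f = fraction_remaining(-20.0, -27.0, -10.0)
print("Fraction remaining:", f, "-> fraction biodegraded:", 1 - f)
```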

  2. In vivo bioluminescence and reflectance imaging of multiple organs in bioluminescence reporter mice by bundled-fiber-coupled microscopy

    PubMed Central

    Ando, Yoriko; Sakurai, Takashi; Koida, Kowa; Tei, Hajime; Hida, Akiko; Nakao, Kazuki; Natsume, Mistuo; Numano, Rika

    2016-01-01

    Bioluminescence imaging (BLI) is used in biomedical research to monitor biological processes within living organisms. Recently, fiber bundles with high transmittance and density have been developed to detect low light with high resolution. Therefore, we have developed a bundled-fiber-coupled microscope with a highly sensitive cooled-CCD camera that enables the BLI of organs within the mouse body. This is the first report of in vivo BLI of the brain and multiple organs in luciferase-reporter mice using bundled-fiber optics. With reflectance imaging, the structures of blood vessels and organs can be seen clearly under light illumination, which allowed identification of the structural details of the bioluminescence images. This technique can also be applied to clinical diagnostics in a minimally invasive manner. PMID:27231601

  3. Damage Detection Sensor System for Aerospace and Multiple Applications

    NASA Technical Reports Server (NTRS)

    Williams, Martha; Lewis, Mark; Gibson, Tracy L.; Lane, John; Medelius, Pedro

    2017-01-01

    NASA has identified structural health monitoring and damage detection and verification as critical needs in multiple technology roadmaps. The sensor systems can be customized for detecting location, damage size, and depth, with velocity options, and can be designed for particular environments for monitoring impact or physical damage to a structure. The damage detection system has been successfully demonstrated in a harsh environment and remote-integration tested across sites over 1000 miles apart. Multiple applications include: Spacecraft and Aircraft; Inflatable, Deployable and Expandable Structures; Space Debris Monitoring; Space Habitats; Military Shelters; Solar Arrays; Smart Garments and Wearables; Extravehicular Activity (EVA) Suits; Critical Hardware Enclosures; Embedded Composite Structures; and Flexible Hybrid Printed Electronics and Systems. To support better implementation and infusion into more flexible architectures, improvements to the embedded software and GUI interface, along with increased flexibility, modularity, and configurability of the system, are currently being carried out.

  4. 40 CFR 60.153 - Monitoring of operations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) The owner or operator of any multiple hearth, fluidized bed, or electric sludge incinerator subject to...) Install, calibrate, maintain and operate temperature measuring devices at every hearth in multiple hearth... zones of electric incinerators. For multiple hearth furnaces, a minimum of one temperature measuring...

  5. Remote sensing of the energetic status of plants and ecosystems: optical and odorous signals

    NASA Astrophysics Data System (ADS)

    Penuelas, J.; Bartrons, M.; Llusia, J.; Filella, I.

    2016-12-01

    The optical and odorous signals emitted by plants and ecosystems present consistent relationships. They offer promising prospects for continuous local and global monitoring of the energetic status of plants and ecosystems, and therefore of their processing of energy and matter. We will discuss how the energetic status of plants (and ecosystems) resulting from the balance between the supply and demand of reducing power can be assessed biochemically, by the cellular NADPH/NADP ratio, optically, by using the photochemical reflectance index and sun-induced fluorescence as indicators of the dissipation of excess energy and associated physiological processes, and "odorously", by the emission of volatile organic compounds such as isoprenoids, as indicators of an excess of reducing equivalents and also of enhancement of protective converging physiological processes. These signals thus provide information on the energetic status, associated health status, and the functioning of plants and ecosystems. We will present the links among the three signals and will especially discuss the possibility of remotely sensing the optical signals linked to carbon uptake and VOC exchange by plants and ecosystems. These signals and their integration may have multiple applications for environmental and agricultural monitoring, for example, by extending the spatial coverage of carbon-flux and VOC emission observations to most places and times, and/or for improving the process-based modeling of carbon fixation and isoprenoid emissions from terrestrial vegetation on plant, ecosystemic and global scales. Considerable challenges remain for a wide-scale and routine implementation of these biochemical, optical, and odorous signals for ecosystemic and/or agronomic monitoring and modeling, but their potential for making further steps forward in global ecology, agricultural applications, the global carbon cycle, atmospheric science, and earth science warrants further research efforts along this line.

  6. MIR-ATR sensor for process monitoring

    NASA Astrophysics Data System (ADS)

    Geörg, Daniel; Schalk, Robert; Methner, Frank-Jürgen; Beuermann, Thomas

    2015-06-01

    A mid-infrared attenuated total reflectance (MIR-ATR) sensor has been developed for chemical reaction monitoring. The optical setup of the compact and low-priced sensor consists of an IR emitter as light source, a zinc selenide (ZnSe) ATR prism as boundary to the process, and four thermopile detectors, each equipped with an optical bandpass filter. The practical applicability was tested during esterification of ethanol and formic acid to ethyl formate and water as a model reaction with subsequent distillation. For reference analysis, a Fourier transform mid-infrared (FT-MIR) spectrometer with diamond ATR module was applied. On-line measurements using the MIR-ATR sensor and the FT-MIR spectrometer were performed in a bypass loop. The sensor was calibrated by multiple linear regression in order to link the measured absorbance in the four optical channels to the analyte concentrations. The analytical potential of the MIR-ATR sensor was demonstrated by simultaneous real-time monitoring of all four chemical substances involved in the esterification and distillation process. The temporal courses of the sensor signals are in accordance with the concentration values achieved by the commercial FT-MIR spectrometer. The standard errors of prediction for ethanol, formic acid, ethyl formate, and water were 0.38 mol L⁻¹, 0.48 mol L⁻¹, 0.38 mol L⁻¹, and 1.12 mol L⁻¹, respectively. A procedure based on MIR spectra is presented to simulate the response characteristics of the sensor if the transmission ranges of the filters are varied. Using this tool, analyte-specific bandpass filters for a particular chemical reaction can be identified. By exchanging the optical filters, the sensor can be adapted to a wide range of processes in the chemical, pharmaceutical, and beverage industries.
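
    The multiple-linear-regression calibration linking the four channel absorbances to the four analyte concentrations can be sketched as below; the calibration matrix and concentration values are invented for illustration and do not reproduce the reported model.

```python
import numpy as np

# Hypothetical calibration set: absorbance in the sensor's four optical
# channels (rows = samples) and reference concentrations (mol L-1) of
# ethanol, formic acid, ethyl formate, and water from FT-MIR analysis.
absorbance = np.array([[0.12, 0.40, 0.05, 0.80],
                       [0.30, 0.28, 0.18, 0.72],
                       [0.45, 0.15, 0.33, 0.64],
                       [0.60, 0.08, 0.47, 0.55],
                       [0.72, 0.03, 0.58, 0.50]])
conc = np.array([[8.0, 8.2, 0.5, 30.0],
                 [6.1, 6.0, 2.4, 28.5],
                 [4.2, 3.9, 4.3, 27.0],
                 [2.3, 2.1, 6.1, 25.6],
                 [1.0, 0.8, 7.4, 24.4]])

# One MLR model per analyte, all solved at once with an intercept column.
X = np.column_stack([np.ones(len(absorbance)), absorbance])
B, *_ = np.linalg.lstsq(X, conc, rcond=None)   # coefficient matrix (5, 4)

predicted = X @ B
errors = np.sqrt(np.mean((predicted - conc) ** 2, axis=0))
print(errors)   # per-analyte error (in-sample; real validation needs new runs)
```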

  7. Assessment of multiple DWI offender restrictions

    DOT National Transportation Integrated Search

    1989-12-01

    This report discusses nine new approaches for reducing recidivism among multiple DWI offenders: dedicated detention facilities, diversion programs, electronic monitoring, ignition interlock systems, intensive probation supervision, publishing offende...

  8. Multiple Household Water Sources and Their Use in Remote Communities With Evidence From Pacific Island Countries

    NASA Astrophysics Data System (ADS)

    Elliott, Mark; MacDonald, Morgan C.; Chan, Terence; Kearton, Annika; Shields, Katherine F.; Bartram, Jamie K.; Hadwen, Wade L.

    2017-11-01

    Global water research and monitoring typically focus on the household's "main source of drinking-water." Use of multiple water sources to meet daily household needs has been noted in many developing countries but rarely quantified or reported in detail. We gathered self-reported data using a cross-sectional survey of 405 households in eight communities of the Republic of the Marshall Islands (RMI) and five Solomon Islands (SI) communities. Over 90% of households used multiple sources, with differences in sources and uses between wet and dry seasons. Most RMI households had large rainwater tanks and rationed stored rainwater for drinking throughout the dry season, whereas most SI households collected rainwater in small pots, precluding storage across seasons. Use of a source for cooking was strongly positively correlated with use for drinking, whereas use for cooking was negatively correlated or uncorrelated with nonconsumptive uses (e.g., bathing). Dry season water uses implied greater risk of water-borne disease, with fewer (frequently zero) handwashing sources reported and more unimproved sources consumed. Use of multiple sources is fundamental to household water management and feasible to monitor using electronic survey tools. We contend that recognizing multiple water sources can greatly improve understanding of household-level and community-level climate change resilience, that use of multiple sources confounds health impact studies of water interventions, and that incorporating multiple sources into water supply interventions can yield heretofore-unrealized benefits. We propose that failure to consider multiple sources undermines the design and effectiveness of global water monitoring, data interpretation, implementation, policy, and research.

  9. Environmental monitoring of Galway Bay: fusing data from remote and in-situ sources

    NASA Astrophysics Data System (ADS)

    O'Connor, Edel; Hayes, Jer; Smeaton, Alan F.; O'Connor, Noel E.; Diamond, Dermot

    2009-09-01

    Changes in sea surface temperature can be used as an indicator of water quality. In-situ sensors are being used for continuous autonomous monitoring. However these sensors have limited spatial resolution as they are in effect single point sensors. Satellite remote sensing can be used to provide better spatial coverage at good temporal scales. However in-situ sensors have a richer temporal scale for a particular point of interest. Work carried out in Galway Bay has combined data from multiple satellite sources and in-situ sensors and investigated the benefits and drawbacks of using multiple sensing modalities for monitoring a marine location.

  10. Satellite Remote Sensing of Harmful Algal Blooms (HABs) and a Potential Synthesized Framework

    PubMed Central

    Shen, Li; Xu, Huiping; Guo, Xulin

    2012-01-01

    Harmful algal blooms (HABs) are severe ecological disasters threatening aquatic systems throughout the World, which necessitate scientific efforts in detecting and monitoring them. Compared with traditional in situ point observations, satellite remote sensing is considered as a promising technique for studying HABs due to its advantages of large-scale, real-time, and long-term monitoring. The present review summarizes the suitability of current satellite data sources and different algorithms for detecting HABs. It also discusses the spatial scale issue of HABs. Based on the major problems identified from previous literature, including the unsystematic understanding of HABs, the insufficient incorporation of satellite remote sensing, and a lack of multiple oceanographic explanations of the mechanisms causing HABs, this review also attempts to provide a comprehensive understanding of the complicated mechanism of HABs impacted by multiple oceanographic factors. A potential synthesized framework can be established by combining multiple accessible satellite remote sensing approaches including visual interpretation, spectra analysis, parameters retrieval and spatial-temporal pattern analysis. This framework aims to lead to a systematic and comprehensive monitoring of HABs based on satellite remote sensing from multiple oceanographic perspectives. PMID:22969372

  11. Multi-scale ecosystem monitoring: an application of scaling data to answer multiple ecological questions

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods Standardized monitoring data collection efforts using a probabilistic sample design, such as in the Bureau of Land Management’s (BLM) Assessment, Inventory, and Monitoring (AIM) Strategy, provide a core suite of ecological indicators, maximize data collection efficiency,...

  12. An electronic circuit for sensing malfunctions in test instrumentation

    NASA Technical Reports Server (NTRS)

    Miller, W. M., Jr.

    1969-01-01

    Monitoring device differentiates between malfunctions occurring in the system undergoing test and malfunctions within the test instrumentation itself. Electronic circuits in the monitor use transistors to commutate silicon controlled rectifiers by removing the drive voltage; display circuits are then used to monitor multiple discrete lines.

  13. A program for sustained improvement in preventing ventilator associated pneumonia in an intensive care setting

    PubMed Central

    2012-01-01

    Background Ventilator-associated pneumonia (VAP) is a common infection in the intensive care unit (ICU) and associated with a high mortality. Methods A quasi-experimental study was conducted in a medical-surgical ICU. Multiple interventions to optimize VAP prevention were performed from October 2008 to December 2010. All of these processes, including the Institute for Healthcare Improvement’s (IHI) ventilator bundle plus oral decontamination with chlorhexidine and continuous aspiration of subglottic secretions (CASS), were adopted for patients undergoing mechanical ventilation. Results We evaluated a total of 21,984 patient-days, and a total of 6,052 ventilator-days (ventilator utilization rate of 0.27). We found VAP rates of 1.3 and 2.0 per 1,000 ventilator days respectively in 2009 and 2010, achieving zero incidence of VAP several times during 12 months, whenever VAP bundle compliance was over 90%. Conclusion These results suggest that it is possible to reduce VAP rates to near zero and sustain these rates, but it requires a complex process involving multiple performance measures and interventions that must be permanently monitored. PMID:23020101

  14. A program for sustained improvement in preventing ventilator associated pneumonia in an intensive care setting.

    PubMed

    Caserta, Raquel A; Marra, Alexandre R; Durão, Marcelino S; Silva, Cláudia Vallone; Pavao dos Santos, Oscar Fernando; Neves, Henrique Sutton de Sousa; Edmond, Michael B; Timenetsky, Karina Tavares

    2012-09-29

    Ventilator-associated pneumonia (VAP) is a common infection in the intensive care unit (ICU) and associated with a high mortality. A quasi-experimental study was conducted in a medical-surgical ICU. Multiple interventions to optimize VAP prevention were performed from October 2008 to December 2010. All of these processes, including the Institute for Healthcare Improvement's (IHI) ventilator bundle plus oral decontamination with chlorhexidine and continuous aspiration of subglottic secretions (CASS), were adopted for patients undergoing mechanical ventilation. We evaluated a total of 21,984 patient-days, and a total of 6,052 ventilator-days (ventilator utilization rate of 0.27). We found VAP rates of 1.3 and 2.0 per 1,000 ventilator days respectively in 2009 and 2010, achieving zero incidence of VAP several times during 12 months, whenever VAP bundle compliance was over 90%. These results suggest that it is possible to reduce VAP rates to near zero and sustain these rates, but it requires a complex process involving multiple performance measures and interventions that must be permanently monitored.

  15. An automated data exploitation system for airborne sensors

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike

    2014-06-01

    Advanced wide area persistent surveillance (WAPS) sensor systems on manned or unmanned airborne vehicles are essential for wide-area urban security monitoring in order to protect our people and our warfighters from terrorist attacks. Currently, human (imagery) analysts process huge data collections from full motion video (FMV) for data exploitation and analysis (real-time and forensic), providing slow and inaccurate results. An Automated Data Exploitation System (ADES) is urgently needed. In this paper, we present a recently developed ADES for airborne vehicles under heavy urban background clutter conditions. This system includes four processes: (1) fast image registration, stabilization, and mosaicking; (2) advanced non-linear morphological moving target detection; (3) robust multiple target (vehicles, dismounts, and humans) tracking (up to 100 target tracks); and (4) moving or static target/object recognition (super-resolution). Test results with real FMV data indicate that our ADES can reliably detect, track, and recognize multiple vehicles under heavy urban background clutter. Furthermore, our example shows that ADES as a baseline platform can provide a capability for abnormal vehicle behavior detection to help imagery analysts quickly trace down potential threats and crimes.
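
    The moving-target-detection stage can be illustrated with a generic background-subtraction and morphological-filtering sketch in OpenCV; this is a simplified analogue, not the authors' advanced non-linear morphological detector, and the input file name is hypothetical.

```python
import cv2

# Hypothetical, pre-stabilized FMV clip; any aerial video file would do.
cap = cv2.VideoCapture("urban_fmv_clip.mp4")

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask from background modelling, then morphological
    # opening/closing to suppress clutter and consolidate moving blobs.
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    detections = [cv2.boundingRect(c) for c in contours
                  if cv2.contourArea(c) > 50]   # drop tiny clutter blobs
    # In a full system these detections would feed the multi-target tracker.
cap.release()
```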

  16. Successful demonstration of a comprehensive lithography defect monitoring strategy

    NASA Astrophysics Data System (ADS)

    Peterson, Ingrid B.; Breaux, Louis H.; Cross, Andrew; von den Hoff, Michael

    2003-07-01

    This paper describes the validation of the methodology, the model, and the impact of an optimized Lithography Defect Monitoring Strategy at two different semiconductor manufacturing factories. The lithography defect inspection optimization was implemented for the Gate Module at both factories running 0.13-0.15 μm technologies on 200 mm wafers, one running microprocessor devices and the other memory devices. As minimum dimensions and process windows decrease in the lithography area, new technologies and technological advances with resists and resist systems are being implemented to meet the demands. Along with these new technological advances in the lithography area come potentially unforeseen defect issues. The latest lithography processes involve new resists in extremely thin, uniform films, exposing the films under conditions of highly optimized focus and illumination, and finally removing the resist completely and cleanly. The lithography cell is defined as the cluster of process equipment that accomplishes the coating process (surface prep, resist spin, edge-bead removal and soft bake), the alignment and exposure, and the developing process (post-exposure bake, develop, rinse) of the resist. Often the resist spinning process involves multiple materials such as BARC (bottom ARC) and/or TARC (top ARC) materials in addition to the resist itself. The introduction of these new materials, with their multiple material interfaces, and the tightness of the process windows lead to an increased variety of defect mechanisms in the lithography area. Defect management in the lithography area has become critical to successful product introduction and yield ramp. The semiconductor process itself contributes the largest number and variety of defects, and a significant portion of the total defects originate within the lithography cell. From a defect management perspective, the lithography cell has some unique characteristics. First, defects in the lithography process module have the widest range of sizes, from full-wafer to suboptical, and the largest variety of characteristics. Some of these defects fall into the categories of coating problems, focus and exposure defects, developer defects, edge-bead removal problems, contamination, and scratches, usually defined as lithography macro defects, as shown in Figure 1. Others fall into the category of lithography micro defects (Figure 2): they are characterized as having low topography, such as stains, developer spots, and satellites, or as being very small, such as micro-bridging, partial micro-bridging, micro-bubbles, CD variation, and single isolated missing or deformed contacts or vias. Lithography is the only area of the fab besides CMP in which defect excursions can be corrected by reworking the wafers. The opportunity to fix defect problems without scrapping wafers is best served by a defect inspection strategy that captures the full range of all relevant defect types with a proper balance between the costs of monitoring and inspection and the potential cost of yield loss. In the previous paper [1] it was shown that a combination of macro inspection and high numerical aperture (NA) brightfield imaging inspection technology is best suited for the application in the case of the idealized fab modeled. In this paper we will report on the successful efforts in implementing and validating the lithography defect monitoring strategy at two existing 200 mm factories running 0.15 μm and 0.13 μm design rules.

  17. A scaleable integrated sensing and control system for NDE, monitoring, and control of medium to very large composite smart structures

    NASA Astrophysics Data System (ADS)

    Jones, Jerry; Rhoades, Valerie; Arner, Radford; Clem, Timothy; Cuneo, Adam

    2007-04-01

    NDE measurements, monitoring, and control of smart and adaptive composite structures require that the central knowledge system have an awareness of the entire structure. Achieving this goal necessitates the implementation of an integrated network of significant numbers of sensors. Additionally, in order to temporally coordinate the data from spatially distributed sensors, the data must be time-relevant. Early adoption precludes development of sensor technology specifically for this application; instead, it will depend on the ability to utilize legacy systems. Partially supported by the U.S. Department of Commerce, National Institute of Standards and Technology, Advanced Technology Development Program (NIST-ATP), a scalable integrated system has been developed to implement monitoring of structural integrity and the control of adaptive/intelligent structures. The project, called SHIELD (Structural Health Identification and Electronic Life Determination), was jointly undertaken by Caterpillar, N.A. Tech., Motorola, and Microstrain. SHIELD is capable of operation with composite structures, metallic structures, or hybrid structures. SHIELD consists of a real-time processing core on a Motorola MPC5200 using a C-language-based real-time operating system (RTOS). The RTOS kernel was customized to include a virtual backplane, which makes the system completely scalable. This architecture provides for multiple processes to operate simultaneously. They may be embedded as multiple threads on the core hardware or as separate independent processors connected to the core using a software driver called a NAT-Network Integrator (NATNI). NATNIs can be created for any communications application. In its current embodiment, NATNIs have been created for CAN bus, TCP/IP (Ethernet) - both wired and 802.11 b and g - and serial communications using RS485 and RS232. Since SHIELD uses standard C language, it is easy to port any monitoring or control algorithm, thus providing for legacy technology which may use other hardware processors and various communications means. For example, two demonstrations of SHIELD have been completed, in January and May 2005 respectively. One demonstration used algorithms in C running in multiple threads in the SHIELD core and utilizing two different sensor networks, one CAN bus and one wireless. The second had algorithms operating in C on the SHIELD core and other algorithms running on multiple Texas Instruments DSP processors using a NATNI that communicated via wired TCP/IP. A key feature of SHIELD is the implementation of a wireless ZIGBEE (802.15.4) network for implementing large numbers of small, low-cost, low-power sensors communicating via a mesh/star wireless network. While SHIELD was designed to integrate with a wide variety of existing communications protocols, a ZIGBEE network capability was implemented specifically for SHIELD. This will facilitate the monitoring of medium to very large structures including marine applications, utility-scale multi-megawatt wind energy systems, and aircraft/spacecraft. The SHIELD wireless network will facilitate large numbers of sensors (up to 32000), accommodate sensors embedded into the composite material, communicate with both sensors and actuators, and prevent obsolescence by providing for re-programming of the nodes via remote RF communications. The wireless network provides for ultra-low energy use, spatial location, and accurate timestamping, utilizing the beaconing feature of ZIGBEE.

  18. Using LabView for real-time monitoring and tracking of multiple biological objects

    NASA Astrophysics Data System (ADS)

    Nikolskyy, Aleksandr I.; Krasilenko, Vladimir G.; Bilynsky, Yosyp Y.; Starovier, Anzhelika

    2017-04-01

    Today, real-time study and tracking of the movement dynamics of various biological objects is important and widely researched. Object features, visualization conditions, and model parameters strongly influence the choice of optimal methods and algorithms for a specific task. Therefore, to automate the adaptation of recognition and tracking algorithms, several LabVIEW tracker projects are considered in this article. The projects allow templates to be changed quickly for training and retraining the system, and they adapt to the speed of the objects and the statistical characteristics of noise in the images. New functions for comparing images or their features and descriptors, as well as pre-processing methods, are discussed. Experiments carried out to test the trackers on real video files are presented and analyzed.
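
    A minimal Python/OpenCV analogue of template-based tracking with periodic template re-learning is sketched below (the paper itself works in LabVIEW); the video file, initial bounding box, and re-learning threshold are hypothetical.

```python
import cv2

cap = cv2.VideoCapture("cells.avi")          # hypothetical microscopy clip
ok, frame = cap.read()
if not ok:
    raise SystemExit("could not read the (hypothetical) video file")

x, y, w, h = 100, 80, 24, 24                 # initial object location
template = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)

track = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    score = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, best, _, loc = cv2.minMaxLoc(score)   # best-matching location
    track.append(loc)
    if best > 0.8:                           # re-learn the template when the
        x, y = loc                           # match is confident (crude
        template = gray[y:y+h, x:x+w]        # retraining of the tracker)
cap.release()
print(track[:5])
```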

  19. Pharmacokinetics-on-a-Chip Using Label-Free SERS Technique for Programmable Dual-Drug Analysis.

    PubMed

    Fei, Jiayuan; Wu, Lei; Zhang, Yizhi; Zong, Shenfei; Wang, Zhuyuan; Cui, Yiping

    2017-06-23

    Synergistic effects of dual or multiple drugs have attracted great attention in medical fields, especially in cancer therapies. We provide a programmable microfluidic platform for pharmacokinetic detection of multiple drugs in multiple cells. The well-designed microfluidic platform includes two 2 × 3 microarrays of cell chambers, two gradient generators, and several pneumatic valves. Through the combined use of valves and gradient generators, each chamber can be controlled to infuse different kinds of living cells and drugs with specific concentrations as needed. In our experiments, 6-mercaptopurine (6MP) and methimazole (MMI) were chosen as two drug models and their pharmacokinetic parameters in different living cells were monitored through intracellular SERS spectra, which reflected the molecular structure of these drugs. The dynamic changes of the SERS fingerprints of 6MP and MMI molecules were recorded during drug metabolism in living cells. The results indicated that both 6MP and MMI molecules diffused into the cells within 4 min and were excreted after 36 h. Moreover, the intracellular distribution of these drugs was monitored through SERS mapping. Thus, our microfluidic platform simultaneously accomplishes the functions to monitor pharmacokinetic action, distribution, and fingerprint of multiple drugs in multiple cells. Owing to its real-time, rapid-speed, high-precision, and programmable capability of multiple-drug and multicell analysis, such a microfluidic platform has great potential in drug design and development.

  20. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV, and DSA), device architectures (FinFET, nanowire, graphene), and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and they challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models, but they can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been used to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch by leveraging resist shrinkage patterns; these patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely, actionable decisions such as rework, scrap, or feeding the predicted information (or information derived from it) forward or backward to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
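
    The early prediction of electrical performance from upstream in-line measurements can be sketched as a regression problem; the features, synthetic data, and gradient-boosting model below are illustrative assumptions, not the models described in the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical training set: upstream in-line measurements (e.g. resist CD
# after shrinkage, film thickness, trench depth) paired with the final
# electrical readout (e.g. metal line resistance) measured weeks later.
rng = np.random.default_rng(1)
inline = rng.normal(size=(500, 3))
resistance = 50 + 4 * inline[:, 0] - 2 * inline[:, 1] + rng.normal(0, 1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(inline, resistance, random_state=1)
model = GradientBoostingRegressor(random_state=1).fit(X_tr, y_tr)

print("MAE:", mean_absolute_error(y_te, model.predict(X_te)))
# An out-of-spec prediction here could trigger rework or scrap decisions long
# before the physical electrical test result becomes available.
```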

  1. Resolution and Assignment of Differential Ion Mobility Spectra of Sarcosine and Isomers.

    PubMed

    Berthias, Francis; Maatoug, Belkis; Glish, Gary L; Moussa, Fathi; Maitre, Philippe

    2018-04-01

    Due to their central role in biochemical processes, fast separation and identification of amino acids (AA) is of importance in many areas of the biomedical field including the diagnosis and monitoring of inborn errors of metabolism and biomarker discovery. Due to the large number of AA together with their isomers and isobars, common methods of AA analysis are tedious and time-consuming because they include a chromatographic separation step requiring pre- or post-column derivatization. Here, we propose a rapid method of separation and identification of sarcosine, a biomarker candidate of prostate cancer, from isomers using differential ion mobility spectrometry (DIMS) interfaced with a tandem mass spectrometer (MS/MS) instrument. Baseline separation of protonated sarcosine from α- and β-alanine isomers can be easily achieved. Identification of the DIMS peak is performed using an isomer-specific activation mode where DIMS- and mass-selected ions are irradiated at selected wavenumbers allowing for specific fragmentation via an infrared multiple photon dissociation (IRMPD) process. Two orthogonal methods to MS/MS are thus added, where MS/MS(IRMPD) is effectively an isomer-specific multiple reaction monitoring (MRM) method. The identification relies on the comparison of DIMS-MS/MS(IRMPD) chromatograms recorded at different wavenumbers. Based on the comparison of IR spectra of the three isomers, it is shown that specific depletion of the protonated α- and β-alanine ions can be achieved, thus allowing for clear identification of the sarcosine peak. It is also demonstrated that DIMS-MS/MS(IRMPD) spectra in the carboxylic C=O stretching region allow for the resolution of overlapping DIMS peaks.

  2. Self-Monitoring of Attention versus Self-Monitoring of Academic Performance: Effects among Students with ADHD in the General Education Classroom

    ERIC Educational Resources Information Center

    Harris, Karen R.; Friedlander, Barbara Danoff; Saddler, Bruce; Frizzelle, Remedios; Graham, Steve

    2005-01-01

    A counterbalanced, multiple-baseline, across-subjects design was used to determine if attention and performance monitoring had differential effects on the on-task and spelling study behavior of 6 elementary students with attention-deficit/hyperactivity disorder (ADHD) in the general education classroom. Both self-monitoring of attention and…

  3. Multiple bio-monitoring system using visible light for electromagnetic-wave free indoor healthcare

    NASA Astrophysics Data System (ADS)

    An, Jinyoung; Pham, Ngoc Quan; Chung, Wan-Young

    2017-12-01

    In this paper, a multiple biomedical data transmission system based on visible light communication (VLC) is proposed for electromagnetic-wave-free indoor healthcare. VLC technology has emerged as an alternative to radio-frequency (RF) wireless systems due to its various merits, e.g., ubiquity, power efficiency, absence of RF radiation, and security. With VLC, critical biomedical signals, including electrocardiography (ECG), can be transmitted in places where RF radiation is restricted; this potential advantage of VLC could save more lives in emergency situations. A time-hopping (TH) scheme is employed to transfer multiple medical-data streams in real time with a simple system design. Multiple data streams are transmitted using identical-color LEDs and captured by an optical detector. The received data streams are demodulated and rearranged using a TH-based demodulator. The medical data are then monitored and managed to provide the necessary medical care for each patient.
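    The sketch below illustrates, under stated assumptions, how a time-hopping scheme can interleave several bio-signal streams into one channel and recover them at the receiver by slot position. The slot assignments, frame length, and data are hypothetical and do not reproduce the paper's modulation details.

```python
# Minimal sketch of time-hopping (TH) multiplexing of several bio-signal streams onto
# one optical channel and recovery at the receiver. All parameters are assumptions.
import numpy as np

n_slots = 8                                   # slots per TH frame
patterns = {"ecg": 2, "ppg": 5, "temp": 7}    # assumed slot assignment per stream
n_frames = 4

streams = {k: np.arange(n_frames) + 10 * i for i, k in enumerate(patterns)}

# Transmitter: place each stream's sample into its slot of every frame
channel = np.zeros(n_frames * n_slots)
for name, slot in patterns.items():
    for f in range(n_frames):
        channel[f * n_slots + slot] = streams[name][f]

# Receiver: demultiplex by reading back each stream's slots
recovered = {name: channel[slot::n_slots] for name, slot in patterns.items()}
assert all(np.array_equal(recovered[k], streams[k]) for k in patterns)
print(recovered)
```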

  4. Interactive Genetic Algorithm - An Adaptive and Interactive Decision Support Framework for Design of Optimal Groundwater Monitoring Plans

    NASA Astrophysics Data System (ADS)

    Babbar-Sebens, M.; Minsker, B. S.

    2006-12-01

    In the water resources management field, decision making encompasses many kinds of engineering, social, and economic constraints and objectives. Representing all of these problem-dependent criteria through models (analytical or numerical) and various formulations (e.g., objectives, constraints, etc.) within an optimization-simulation system can be a non-trivial issue. Most models and formulations utilized for discerning desirable traits in a solution can only approximate the decision maker's (DM) true preference criteria, and they often fail to consider important qualitative and incomputable phenomena related to the management problem. In our research, we have proposed novel decision support frameworks that allow DMs to actively participate in the optimization process. The DMs explicitly indicate their true preferences based on their subjective criteria and the results of various simulation models and formulations. The feedback from the DMs is then used to guide the search process towards solutions that are "all-rounders" from the perspective of the DM. The two main research questions explored in this work are: a) Does interaction between the optimization algorithm and a DM assist the system in searching for groundwater monitoring designs that are robust from the DM's perspective?, and b) How can an interactive search process be made more effective when human factors, such as human fatigue and cognitive learning processes, affect the performance of the algorithm? The application of these frameworks to a real-world groundwater long-term monitoring (LTM) case study in Michigan highlighted the following salient advantages: a) in contrast to the non-interactive optimization methodology, the proposed interactive frameworks were able to identify low-cost monitoring designs whose interpolation maps respected the expected spatial distribution of the contaminants; b) for many same-cost designs, the interactive methodologies were able to propose multiple alternatives that met the DM's preference criteria, therefore allowing the expert to select among several strong candidate designs depending on her/his LTM budget; and c) two of the methodologies - Case-Based Micro Interactive Genetic Algorithm (CBMIGA) and Interactive Genetic Algorithm with Mixed Initiative Interaction (IGAMII) - were also able to assist in controlling human fatigue and adapt to the DM's learning process.
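    A minimal skeleton of the interactive search loop described above is sketched below, assuming a binary well-selection design vector, a surrogate cost in place of the simulation model, and a random stub standing in for the decision maker's rating; none of these choices come from the original frameworks.

```python
# Skeleton of an interactive genetic algorithm (IGA) loop: candidate monitoring designs
# are scored by a simulation-based objective and by a decision maker (DM).
# The objective, the DM stub, and all parameters are illustrative assumptions.
import random

N_WELLS, POP, GENS = 20, 12, 5

def simulated_cost(design):          # assumed surrogate for the simulation model
    return sum(design)               # e.g. number of wells sampled

def dm_preference(design):           # stand-in for interactive human feedback (0..1)
    return random.random()           # in a real IGA this is the DM's rating

def fitness(design):
    # lower cost and higher DM preference are both rewarded
    return -simulated_cost(design) + 10.0 * dm_preference(design)

def crossover(a, b):
    cut = random.randrange(1, N_WELLS)
    return a[:cut] + b[cut:]

def mutate(d, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in d]

pop = [[random.randint(0, 1) for _ in range(N_WELLS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(POP - len(parents))]

print("best design:", max(pop, key=fitness))
```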

  5. Geophysical monitoring and reactive transport modeling of ureolytically-driven calcium carbonate precipitation

    PubMed Central

    2011-01-01

    Ureolytically-driven calcium carbonate precipitation is the basis for a promising in-situ remediation method for sequestration of divalent radionuclide and trace metal ions. It has also been proposed for use in geotechnical engineering for soil strengthening applications. Monitoring the occurrence, spatial distribution, and temporal evolution of calcium carbonate precipitation in the subsurface is critical for evaluating the performance of this technology and for developing the predictive models needed for engineering application. In this study, we conducted laboratory column experiments using natural sediment and groundwater to evaluate the utility of geophysical (complex resistivity and seismic) sensing methods, dynamic synchrotron x-ray computed tomography (micro-CT), and reactive transport modeling for tracking ureolytically-driven calcium carbonate precipitation processes under site relevant conditions. Reactive transport modeling with TOUGHREACT successfully simulated the changes of the major chemical components during urea hydrolysis. Even at the relatively low level of urea hydrolysis observed in the experiments, the simulations predicted an enhanced calcium carbonate precipitation rate that was 3-4 times greater than the baseline level. Reactive transport modeling results, geophysical monitoring data and micro-CT imaging correlated well with reaction processes validated by geochemical data. In particular, increases in ionic strength of the pore fluid during urea hydrolysis predicted by geochemical modeling were successfully captured by electrical conductivity measurements and confirmed by geochemical data. The low level of urea hydrolysis and calcium carbonate precipitation suggested by the model and geochemical data was corroborated by minor changes in seismic P-wave velocity measurements and micro-CT imaging; the latter provided direct evidence of sparsely distributed calcium carbonate precipitation. Ion exchange processes promoted through NH4+ production during urea hydrolysis were incorporated in the model and captured critical changes in the major metal species. The electrical phase increases were potentially due to ion exchange processes that modified charge structure at mineral/water interfaces. Our study revealed the potential of geophysical monitoring for geochemical changes during urea hydrolysis and the advantages of combining multiple approaches to understand complex biogeochemical processes in the subsurface. PMID:21943229

  6. New algorithms for processing time-series big EEG data within mobile health monitoring systems.

    PubMed

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani; Harous, Saad; Navaz, Alramzana Nujum

    2017-10-01

    Recent advances in miniature biomedical sensors, mobile smartphones, wireless communications, and distributed computing technologies provide promising techniques for developing mobile health systems. Such systems are capable of reliably monitoring epileptic seizures, a manifestation of a chronic disease. Three challenging issues arise in this context with regard to the transformation, compression, storage, and visualization of the big data that result from continuous recording of epileptic seizures using mobile devices. In this paper, we address these challenges by developing three new algorithms to process and analyze big electroencephalography data in a rigorous and efficient manner. The first algorithm is responsible for transforming the standard European Data Format (EDF) into the standard JavaScript Object Notation (JSON) and compressing the transformed JSON data to decrease the size and time of the transfer process and to increase the network transfer rate. The second algorithm focuses on collecting and storing the compressed files generated by the transformation and compression algorithm; the collection process is performed on the fly after decompressing the files. The third algorithm provides relevant real-time interaction with signal data by prospective users, featuring visualization of single or multiple signal channels on a smartphone device and querying of data segments. We tested and evaluated the effectiveness of our approach through a software architecture model implementing a mobile health system to monitor epileptic seizures. The experimental findings from 45 experiments are promising and efficiently satisfy the approach's objectives at the price of linearity. Moreover, the size of compressed JSON files and the transfer times are reduced by 10% and 20%, respectively, while the average total time is remarkably reduced by 67% across all performed experiments. Our approach yields efficient algorithms in terms of processing time, memory usage, and energy consumption while maintaining high scalability of the proposed solution, and it efficiently supports data partitioning and parallelism relying on the MapReduce platform, which can help in the monitoring and automatic detection of epileptic seizures. Copyright © 2017 Elsevier B.V. All rights reserved.
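    The following minimal sketch illustrates the transform-and-compress step using only the Python standard library and NumPy: a synthetic channel is serialized to JSON and compressed before transfer. The channel name, sampling rate, and rounding are assumptions, and reading real EDF files (which would need an EDF parser) is omitted.

```python
# Minimal sketch: serialize EEG samples to JSON and compress before transfer.
import json, zlib
import numpy as np

fs = 256                                                     # assumed sampling rate (Hz)
signal = np.sin(2 * np.pi * 10 * np.arange(10 * fs) / fs)    # 10 s synthetic channel

record = {
    "channel": "EEG Fp1",
    "fs": fs,
    "samples": np.round(signal, 4).tolist(),
}
raw = json.dumps(record).encode("utf-8")
compressed = zlib.compress(raw, level=6)

print(f"JSON size: {len(raw)/1024:.1f} kB, compressed: {len(compressed)/1024:.1f} kB, "
      f"ratio: {len(compressed)/len(raw):.2%}")

# Receiver side: decompress and parse back into an array
restored = json.loads(zlib.decompress(compressed))
assert restored["fs"] == fs and len(restored["samples"]) == len(signal)
```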

  7. Electric terminal performance and characterization of solid oxide fuel cells and systems

    NASA Astrophysics Data System (ADS)

    Lindahl, Peter Allan

    Solid Oxide Fuel Cells (SOFCs) are electrochemical devices which can effect efficient, clean, and quiet conversion of chemical to electrical energy. In contrast to conventional electricity generation systems which feature multiple discrete energy conversion processes, SOFCs are direct energy conversion devices. That is, they feature a fully integrated chemical to electrical energy conversion process where the electric load demanded of the cell intrinsically drives the electrochemical reactions and associated processes internal to the cell. As a result, the cell's electric terminals provide a path for interaction between load side electric demand and the conversion side processes. The implication of this is twofold. First, the magnitude and dynamic characteristics of the electric load demanded of the cell can directly impact the long-term efficacy of the cell's chemical to electrical energy conversion. Second, the electric terminal response to dynamic loads can be exploited for monitoring the cell's conversion side processes and used in diagnostic analysis and degradation-mitigating control schemes. This dissertation presents a multi-tier investigation into this electric terminal based performance characterization of SOFCs through the development of novel test systems, analysis techniques and control schemes. First, a reference-based simulation system is introduced. This system scales up the electric terminal performance of a prototype SOFC system, e.g. a single fuel cell, to that of a full power-level stack. This allows realistic stack/load interaction studies while maintaining explicit ability for post-test analysis of the prototype system. Next, a time-domain least squares fitting method for electrochemical impedance spectroscopy (EIS) is developed for reduced-time monitoring of the electrochemical and physicochemical mechanics of the fuel cell through its electric terminals. The utility of the reference-based simulator and the EIS technique are demonstrated through their combined use in the performance testing of a hybrid-source power management (HSPM) system designed to allow in-situ EIS monitoring of a stack under dynamic loading conditions. The results from the latter study suggest that an HSPM controller allows an opportunity for in-situ electric terminal monitoring and control-based mitigation of SOFC degradation. As such, an exploration of control-based SOFC degradation mitigation is presented and ideas for further work are suggested.
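    To make the least-squares fitting idea concrete, the sketch below fits a simple equivalent-circuit impedance model to synthetic EIS data with SciPy. Note that the dissertation describes a time-domain formulation; this frequency-domain fit, the circuit topology, and the parameter values are assumptions used only to illustrate the approach.

```python
# Illustrative least-squares fit of a simple equivalent-circuit impedance model
# Z(w) = R0 + R1 / (1 + j*w*R1*C1) to synthetic EIS data. All values are assumptions.
import numpy as np
from scipy.optimize import least_squares

def z_model(p, w):
    r0, r1, c1 = p
    return r0 + r1 / (1 + 1j * w * r1 * c1)

w = 2 * np.pi * np.logspace(-1, 4, 60)          # angular frequencies
true = (0.05, 0.20, 0.5)                        # assumed ohm, ohm, farad
rng = np.random.default_rng(1)
z_meas = z_model(true, w) + 1e-3 * (rng.normal(size=w.size) + 1j * rng.normal(size=w.size))

def residuals(p):
    diff = z_model(p, w) - z_meas
    return np.concatenate([diff.real, diff.imag])   # fit real and imaginary parts jointly

fit = least_squares(residuals, x0=[0.1, 0.1, 0.1], bounds=(0, np.inf))
print("estimated R0, R1, C1:", fit.x)
```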

  8. Introducing passive acoustic filter in acoustic based condition monitoring: Motor bike piston-bore fault identification

    NASA Astrophysics Data System (ADS)

    Jena, D. P.; Panigrahi, S. N.

    2016-03-01

    The requirement of designing a sophisticated digital band-pass filter in acoustic-based condition monitoring has been eliminated by introducing a passive acoustic filter in the present work. So far, no one has attempted to explore the possibility of implementing passive acoustic filters in acoustic-based condition monitoring as a pre-conditioner. In order to enhance acoustic-based condition monitoring, a passive acoustic band-pass filter has been designed and deployed. Towards achieving an efficient band-pass acoustic filter, a generalized design methodology has been proposed to design and optimize the desired acoustic filter using multiple filter components in series. An appropriate objective function has been identified for a genetic algorithm (GA) based optimization technique with multiple design constraints. In addition, the robustness of the proposed method has been demonstrated by designing a band-pass filter using an n-branch Quincke tube, a high-pass filter and multiple Helmholtz resonators. The performance of the designed acoustic band-pass filter has been shown by investigating the piston-bore defect of a motorbike using its engine noise signature. Introducing a passive acoustic filter into acoustic-based condition monitoring significantly enhances machine-learning-based fault identification. This is also the first attempt of its kind.

  9. Quantitative Mass Spectrometry by Isotope Dilution and Multiple Reaction Monitoring (MRM).

    PubMed

    Russo, Paul; Hood, Brian L; Bateman, Nicholas W; Conrads, Thomas P

    2017-01-01

    Selected reaction monitoring (SRM) is used in molecular profiling to detect and quantify specific known proteins in complex mixtures. Using isotope dilution (Barnidge et al., Anal Chem 75(3):445-451, 2003) methodologies, peptides can be quantified without the need for an antibody-based method. Selected reaction monitoring assays employ electrospray ionization mass spectrometry (ESI-MS) followed by two stages of mass selection: a first stage where the mass of the peptide ion is selected and, after fragmentation by collision-induced dissociation (CID), a second stage (tandem MS) where either a single (e.g., SRM) or multiple (multiple reaction monitoring, MRM) specific peptide fragment ions are transmitted for detection. The MRM experiment is accomplished by specifying the parent masses of the selected endogenous and isotope-labeled peptides for MS/MS fragmentation and then monitoring fragment ions of interest, using their intensities/abundances and relative ratios to quantify the parent protein of interest. In this example protocol, we will utilize isotope dilution MRM-MS to quantify in absolute terms the total levels of the protein of interest, ataxia telangiectasia mutated (ATM) serine/threonine protein kinase. Ataxia telangiectasia mutated (ATM) phosphorylates several key proteins that initiate activation of the DNA damage checkpoint leading to cell cycle arrest.
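    A minimal numerical sketch of the isotope-dilution calculation described above: the endogenous amount is estimated from the light-to-heavy peak-area ratio scaled by the known spiked amount of labeled standard. The transition names, peak areas, and spike level are invented for illustration.

```python
# Minimal sketch of isotope-dilution quantification: the endogenous ("light") peptide
# amount is inferred from the ratio of light to heavy transition peak areas, scaled by
# the known amount of spiked isotope-labeled ("heavy") standard. Values are illustrative.
heavy_spiked_fmol = 50.0                 # known amount of labeled peptide added

# assumed integrated peak areas for a few monitored transitions: (light, heavy)
transitions = {
    "y7": (1.8e5, 3.1e5),
    "y8": (9.2e4, 1.6e5),
    "y9": (4.5e4, 8.0e4),
}

ratios = [light / heavy for light, heavy in transitions.values()]
mean_ratio = sum(ratios) / len(ratios)
endogenous_fmol = mean_ratio * heavy_spiked_fmol

print(f"light/heavy ratio per transition: {[round(r, 3) for r in ratios]}")
print(f"estimated endogenous peptide: {endogenous_fmol:.1f} fmol on column")
```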

  10. Monitoring of self-healing composites: a nonlinear ultrasound approach

    NASA Astrophysics Data System (ADS)

    Malfense Fierro, Gian-Piero; Pinto, Fulvio; Dello Iacono, Stefania; Martone, Alfonso; Amendola, Eugenio; Meo, Michele

    2017-11-01

    Self-healing composites using a thermally mendable polymer based on the Diels-Alder reaction were fabricated and subjected to multiple damage loads. Unlike traditional destructive methods, this work presents a nonlinear ultrasound technique to evaluate the structural recovery of the proposed self-healing laminate structures. The results were compared to computed tomography and linear ultrasound methods. The laminates were subjected to multiple loading and healing cycles and the induced damage and recovery at each stage were evaluated. The results highlight the benefit and added advantage of using a nonlinear-based methodology to monitor the structural recovery of reversibly cross-linked epoxy with efficient recycling and multiple self-healing capability.

  11. Parallelization of a spatial random field characterization process using the Method of Anchored Distributions and the HTCondor high throughput computing system

    NASA Astrophysics Data System (ADS)

    Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

    A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop software application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect information to the target variable. MAD# uses two parallelization profiles according to the computational resources available: one computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers, submitting serial or parallel jobs using scheduling policies, resource monitoring, and a job-queuing mechanism. This poster shows how MAD# reduces the execution time of the characterization of random fields using these two parallel approaches in different case studies. A test of the approach was conducted on a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1200 hours for all 10 million). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor thus reduced the processing time for uncertainty characterization by a factor of 20 (from roughly 1200 hours to 60 hours).
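    A quick back-of-the-envelope check of the throughput figures quoted in this abstract (single-machine rate, extrapolated total time, and the resulting HTCondor speedup) is sketched below; the wall-clock HTCondor time ignores scheduling gaps.

```python
# Back-of-the-envelope check of the throughput numbers quoted above
# (all figures taken from the abstract).
single_machine_cores = 8
sims_in_12h = 100_000
total_sims = 10_000_000

rate_single = sims_in_12h / 12.0                      # simulations per hour, 8 cores
hours_single = total_sims / rate_single               # ~1200 hours on one machine
hours_htcondor = 60.0                                 # reported processing time, 132 cores

print(f"single machine estimate: {hours_single:.0f} h")
print(f"HTCondor speedup factor: {hours_single / hours_htcondor:.0f}x")
```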

  12. Toward daily monitoring of vegetation conditions at field scale through fusing data from multiple sensors

    USDA-ARS?s Scientific Manuscript database

    Vegetation monitoring requires remote sensing data at fine spatial and temporal resolution. While imagery from coarse resolution sensors such as MODIS/VIIRS can provide daily observations, they lack spatial detail to capture surface features for crop and rangeland monitoring. The Landsat satellite s...

  13. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, with processes interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running simulations becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.

  14. Neutron multiplicity counting: Confidence intervals for reconstruction parameters

    DOE PAGES

    Verbeke, Jerome M.

    2016-03-09

    From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.

  15. Development of climate data storage and processing model

    NASA Astrophysics Data System (ADS)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a «shared nothing» distributed computing architecture and assumes the use of a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.

  16. Real-time physiological monitoring with distributed networks of sensors and object-oriented programming techniques

    NASA Astrophysics Data System (ADS)

    Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.

    1998-05-01

    Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge, often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model, and relay processed data, trends, and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first is a process that accounts for variability in vital parameter values for each individual's normal physiologic response by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system is that it orders the information for each user, including the subject, local company officers, medical personnel, and the incident commanders. Information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links, along with incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.

  17. Parallel computer processing and modeling: applications for the ICU

    NASA Astrophysics Data System (ADS)

    Baxter, Grant; Pranger, L. Alex; Draghic, Nicole; Sims, Nathaniel M.; Wiesmann, William P.

    2003-07-01

    Current patient monitoring procedures in hospital intensive care units (ICUs) generate vast quantities of medical data, much of which is considered extraneous and not evaluated. Although sophisticated monitors to analyze individual types of patient data are routinely used in the hospital setting, this equipment lacks high-order signal analysis tools for detecting long-term trends and correlations between different signals within a patient data set. Without the ability to continuously analyze disjoint sets of patient data, it is difficult to detect slow-forming complications; as a result, the early onset of conditions such as pneumonia or sepsis may not be apparent until the advanced stages. We report here on the development of a distributed software architecture test bed and software medical models to analyze both asynchronous and continuous patient data in real time. Hardware and software have been developed to support a multi-node distributed computer cluster capable of amassing data from multiple patient monitors and projecting near- and long-term outcomes based upon the application of physiologic models to the incoming patient data stream. One computer acts as a central coordinating node; additional computers accommodate processing needs. A simple, non-clinical model for sepsis detection was implemented on the system for demonstration purposes. This work shows exceptional promise as a highly effective means to rapidly predict and thereby mitigate the effect of nosocomial infections.

  18. Monitoring Results in Routine Immunization: Development of Routine Immunization Dashboard in Selected African Countries in the Context of the Polio Eradication Endgame Strategic Plan.

    PubMed

    Poy, Alain; van den Ent, Maya M V X; Sosler, Stephen; Hinman, Alan R; Brown, Sidney; Sodha, Samir; Ehlman, Daniel C; Wallace, Aaron S; Mihigo, Richard

    2017-07-01

    To monitor immunization-system strengthening in the Polio Eradication Endgame Strategic Plan 2013-2018 (PEESP), the Global Polio Eradication Initiative identified 1 indicator: 10% annual improvement in third dose of diphtheria-tetanus-pertussis-containing vaccine (DTP3) coverage in polio high-risk districts of 10 polio focus countries. A multiagency team, including staff from the African Region, developed a comprehensive list of outcome and process indicators measuring various aspects of the performance of an immunization system. The development and implementation of the dashboard to assess immunization system performance allowed national program managers to monitor the key immunization indicators and stratify by high-risk and non-high-risk districts. Although only a single outcome indicator goal (at least 10% annual increase in DTP3 coverage achieved in 80% of high-risk districts) initially existed in the endgame strategy, we successfully added additional outcome indicators (eg, decreasing the number of DTP3-unvaccinated children) as well as program process indicators focusing on cold chain, stock availability, and vaccination sessions to better describe progress on the pathway to raising immunization coverage. When measuring progress toward improving immunization systems, it is helpful to use a comprehensive approach that allows for measuring multiple dimensions of the system. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  19. Harmonization of Multiple Forest Disturbance Data to Create a 1986-2011 Database for the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Soulard, C. E.; Acevedo, W.; Yang, Z.; Cohen, W. B.; Stehman, S. V.; Taylor, J. L.

    2015-12-01

    A wide range of spatial forest disturbance data exist for the conterminous United States, yet inconsistencies between map products arise because of differing programmatic objectives and methodologies. Researchers on the Land Change Research Project (LCRP) are working to assess spatial agreement, characterize uncertainties, and resolve discrepancies between these national level datasets, in regard to forest disturbance. Disturbance maps from the Global Forest Change (GFC), Landfire Vegetation Disturbance (LVD), National Land Cover Dataset (NLCD), Vegetation Change Tracker (VCT), Web-enabled Landsat Data (WELD), and Monitoring Trends in Burn Severity (MTBS) were harmonized using a pixel-based data fusion process. The harmonization process reconciled forest harvesting, forest fire, and remaining forest disturbance across four intervals (1986-1992, 1992-2001, 2001-2006, and 2006-2011) by relying on convergence of evidence across all datasets available for each interval. Pixels with high agreement across datasets were retained, while moderate-to-low agreement pixels were visually assessed and either manually edited using reference imagery or discarded from the final disturbance map(s). National results show that annual rates of forest harvest and overall fire have increased over the past 25 years. Overall, this study shows that leveraging the best elements of readily-available data improves forest loss monitoring relative to using a single dataset to monitor forest change, particularly by reducing commission errors.
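    The convergence-of-evidence step can be sketched as simple pixel voting across stacked binary disturbance maps, as below; the toy rasters, the agreement thresholds, and the review rule are assumptions rather than the project's actual fusion rules.

```python
# Minimal sketch of the convergence-of-evidence idea: stack binary disturbance maps from
# several products and keep pixels where enough products agree; middling-agreement pixels
# would go to manual review.
import numpy as np

# toy 4x4 binary disturbance maps from, e.g., GFC, NLCD, VCT, MTBS (1 = disturbed)
products = np.array([
    [[1, 0, 0, 0], [1, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 1]],
    [[1, 0, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 1]],
    [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 1]],
    [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 1]],
])

votes = products.sum(axis=0)            # number of products flagging each pixel
keep = votes >= 3                       # high agreement: retain as disturbance
review = (votes == 2)                   # moderate agreement: send to visual assessment

print("retained pixels:\n", keep.astype(int))
print("pixels needing manual review:\n", review.astype(int))
```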

  20. A motion-tolerant approach for monitoring SpO2 and heart rate using photoplethysmography signal with dual frame length processing and multi-classifier fusion.

    PubMed

    Fan, Feiyi; Yan, Yuepeng; Tang, Yongzhong; Zhang, Hao

    2017-12-01

    Monitoring pulse oxygen saturation (SpO2) and heart rate (HR) using photoplethysmography (PPG) signal contaminated by a motion artifact (MA) remains a difficult problem, especially when the oximeter is not equipped with a 3-axis accelerometer for adaptive noise cancellation. In this paper, we report a pioneering investigation on the impact of altering the frame length of Molgedey and Schuster independent component analysis (ICAMS) on performance, design a multi-classifier fusion strategy for selecting the PPG correlated signal component, and propose a novel approach to extract SpO2 and HR readings from PPG signal contaminated by strong MA interference. The algorithm comprises multiple stages, including dual frame length ICAMS, a multi-classifier-based PPG correlated component selector, line spectral analysis, tree-based HR monitoring, and post-processing. Our approach is evaluated by multi-subject tests. The root mean square error (RMSE) is calculated for each trial. Three statistical metrics are selected as performance evaluation criteria: mean RMSE, median RMSE and the standard deviation (SD) of RMSE. The experimental results demonstrate that a shorter ICAMS analysis window probably results in better performance in SpO2 estimation. Notably, the designed multi-classifier signal component selector achieved satisfactory performance. The subject tests indicate that our algorithm outperforms other baseline methods regarding accuracy under most criteria. The proposed work can contribute to improving the performance of current pulse oximetry and personal wearable monitoring devices. Copyright © 2017 Elsevier Ltd. All rights reserved.
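    The evaluation criteria named above (mean, median, and standard deviation of per-trial RMSE) can be computed as in the following sketch; the reference and estimated heart-rate series are synthetic placeholders, not data from the study.

```python
# Sketch of the evaluation criteria: per-trial RMSE of the estimated HR against a
# reference, then mean, median, and standard deviation of RMSE across trials.
import numpy as np

rng = np.random.default_rng(42)
trials = []
for _ in range(10):                               # 10 hypothetical subject trials
    hr_ref = 75 + rng.normal(0, 2, 120)           # reference HR, one value per second
    hr_est = hr_ref + rng.normal(0, 3, 120)       # estimate with motion-induced error
    trials.append(np.sqrt(np.mean((hr_est - hr_ref) ** 2)))

rmse = np.array(trials)
print(f"mean RMSE:   {rmse.mean():.2f} bpm")
print(f"median RMSE: {np.median(rmse):.2f} bpm")
print(f"SD of RMSE:  {rmse.std(ddof=1):.2f} bpm")
```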

  1. Monitoring the impacts of Ocean Acidification on coral reef bioerosion: challenges, methods, recommendations

    NASA Astrophysics Data System (ADS)

    Enochs, I.; Manzello, D.; Carlton, R.

    2013-05-01

    Coral reef habitats exist as a dynamic balance between the additive process of calcification and the destructive effects of erosion. A disruption to either the positive or negative side of the coral reef carbonate budget can push a reef system towards rapid collapse. It is well understood that Ocean Acidification (OA) may impair calcification and emerging experimental evidence suggests that it will likely increase the erosive potential of a diverse suite of bioeroding taxa. This may lead to previously unforeseen scenarios where reef framework degradation occurs at a faster pace than that predicted by more simplistic models, resulting from the multifaceted impacts of both slower coral growth and enhanced rates of habitat erosion. As such, it is of paramount importance that monitoring plans tasked with assessing reef resilience to climate change and OA incorporate methods for quantifying bioerosion. This is a complex undertaking as reef ecosystem bioerosion is the result of numerous behaviors, employed by diverse flora and fauna, operating at vastly different scales. Furthermore, these erosive processes are highly variable, dependent on seasonal fluctuations and differing between reef regions, species, individuals, and even the physical characteristics of the substrates acted upon. The strengths and weaknesses of existing bioerosion monitoring methodologies are discussed, ranging from quantification of single species erosion rates to multi-phyletic census-based approaches. Traditional techniques involving the weight change of carbonate blocks are compared alongside more modern methodologies such as micro computed tomography. Finally, recommendations are made for a comprehensive monitoring strategy, incorporating multiple methodologies in a time and cost-effective manner.

  2. Strategy of Surgical Resection for Glioma Based on Intraoperative Functional Mapping and Monitoring

    PubMed Central

    TAMURA, Manabu; MURAGAKI, Yoshihiro; SAITO, Taiichi; MARUYAMA, Takashi; NITTA, Masayuki; TSUZUKI, Shunsuke; ISEKI, Hiroshi; OKADA, Yoshikazu

    2015-01-01

    A growing number of papers have pointed out the relationship between aggressive resection of gliomas and survival prognosis. For maximum resection, the current concept of surgical decision-making is in “information-guided surgery” using multimodal intraoperative information. With this, anatomical information from intraoperative magnetic resonance imaging (MRI) and navigation, functional information from brain mapping and monitoring, and histopathological information must all be taken into account in the new perspective for innovative minimally invasive surgical treatment of glioma. Intraoperative neurofunctional information such as neurophysiological functional monitoring takes the most important part in the process to acquire objective visual data during tumor removal and to integrate these findings as digitized data for intraoperative surgical decision-making. Moreover, the analysis of qualitative data and threshold-setting for quantitative data raise difficult issues in the interpretation and processing of each data type, such as determination of motor evoked potential (MEP) decline, underestimation in tractography, and judgments of patient response for neurofunctional mapping and monitoring during awake craniotomy. Neurofunctional diagnosis of false-positives in these situations may affect the extent of resection, while false-negatives influence intra- and postoperative complication rates. Additionally, even though the various intraoperative visualized data from multiple sources contribute significantly to the reliability of surgical decisions when the information is integrated and provided, it is not uncommon for individual pieces of information to convey opposing suggestions. Such conflicting pieces of information facilitate higher-order decision-making that is dependent on the policies of the facility and the priorities of the patient, as well as the availability of the histopathological characteristics from resected tissue. PMID:26185825

  3. Health monitoring and rehabilitation of a concrete structure using intelligent materials

    NASA Astrophysics Data System (ADS)

    Song, G.; Mo, Y. L.; Otero, K.; Gu, H.

    2006-04-01

    This paper presents the concept of an intelligent reinforced concrete structure (IRCS) and its application in structural health monitoring and rehabilitation. The IRCS has multiple functions which include self-rehabilitation, self-vibration damping, and self-structural health monitoring. These functions are enabled by two types of intelligent (smart) materials: shape memory alloys (SMAs) and piezoceramics. In this research, Nitinol type SMA and PZT (lead zirconate titanate) type piezoceramics are used. The proposed concrete structure is reinforced by martensite Nitinol cables using the method of post-tensioning. The martensite SMA significantly increases the concrete's damping property and its ability to handle large impact. In the presence of cracks due to explosions or earthquakes, electrically heating the SMA cables makes them contract and close up the cracks. In this research, PZT patches are embedded in the concrete structure to detect possible cracks inside it. The wavelet packet analysis method is then applied as a signal-processing tool to analyze the sensor signals. A damage index is defined to describe the damage severity for health monitoring purposes. In addition, by monitoring the electric resistance change of the SMA cables, the crack width can be estimated. To demonstrate this concept, a concrete beam specimen reinforced with SMA cables and with embedded PZT patches is fabricated. Experiments demonstrate that the IRCS has the ability of self-sensing and self-rehabilitation. Three-point bending tests were conducted. During the loading process, a crack opens up to 0.47 inches. Upon removal of the load and heating of the SMA cables, the crack closes up. The damage index formed by wavelet packet analysis of the PZT sensor data predicts and confirms the onset and severity of the crack during the loading. Also during the loading, the electrical resistance value of the SMA cable changes by up to 27%, and this phenomenon is used to monitor the crack width.
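    A hedged sketch of an energy-based damage index from wavelet packet analysis of sensor signals is given below, using the PyWavelets package. The index definition, wavelet, decomposition level, and signals are assumptions and are not taken from the paper.

```python
# Hedged sketch of an energy-based damage index from wavelet packet analysis of PZT
# sensor signals: compare sub-band energies of a "damaged" response against a baseline.
import numpy as np
import pywt

fs = 10_000
t = np.arange(0, 0.1, 1 / fs)
baseline = np.sin(2 * np.pi * 500 * t)
damaged = 0.8 * np.sin(2 * np.pi * 500 * t) + 0.3 * np.sin(2 * np.pi * 1800 * t)

def band_energies(x, level=3):
    wp = pywt.WaveletPacket(data=x, wavelet="db4", mode="symmetric", maxlevel=level)
    return np.array([np.sum(node.data ** 2) for node in wp.get_level(level, order="natural")])

e_b, e_d = band_energies(baseline), band_energies(damaged)
damage_index = np.sqrt(np.sum((e_d - e_b) ** 2) / np.sum(e_b ** 2))
print(f"damage index: {damage_index:.3f}")   # larger values indicate more severe damage
```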

  4. Wavelet PCA for automatic identification of walking with and without an exoskeleton on a treadmill using pressure and accelerometer sensors.

    PubMed

    Naik, Ganesh R; Pendharkar, Gita; Nguyen, Hung T

    2016-08-01

    Nowadays, portable devices with a larger number of sensors are used for gait assessment and monitoring of the elderly and disabled. However, the problem with using multiple sensors is that if they are placed on the same platform or base, there could be cross-talk between them, which could change the signal amplitude or add noise to the signal. Hence, this study uses wavelet PCA as a signal processing technique to separate the original sensor signal from the signal obtained through the integrated sensor unit, in order to compare the two types of walking (with and without an exoskeleton). This comparison using wavelet PCA will enable researchers to obtain accurate sensor data and to compare and analyze the data in order to further improve the design of compact portable devices used to monitor and assess gait in stroke or paralyzed subjects. The advantage of designing such systems is that they can also be used to assess and monitor the gait of stroke subjects at home, which will save them the time and effort of visiting the laboratory or clinic.

  5. Development of a portable Linux-based ECG measurement and monitoring system.

    PubMed

    Tan, Tan-Hsu; Chang, Ching-Su; Huang, Yung-Fa; Chen, Yung-Fu; Lee, Cheng

    2011-08-01

    This work presents a portable Linux-based electrocardiogram (ECG) signal measurement and monitoring system. The proposed system consists of an ECG front end and an embedded Linux platform (ELP). The ECG front end digitizes 12-lead ECG signals acquired from electrodes and then delivers them to the ELP via a universal serial bus (USB) interface for storage, signal processing, and graphic display. The proposed system can be installed anywhere (e.g., offices, homes, healthcare centers and ambulances) to allow people to self-monitor their health conditions at any time. The proposed system also enables remote diagnosis via the Internet. Additionally, the system has a 7-in. interactive TFT-LCD touch screen that enables users to execute various functions, such as scaling single-lead or multiple-lead ECG waveforms. The effectiveness of the proposed system was verified using a commercial 12-lead ECG signal simulator and in vivo experiments. In addition to its portability, the proposed system is license-free, as Linux, an open-source operating system, was utilized during software development. The cost-effectiveness of the system significantly enhances its practical application for personal healthcare.

  6. Monitoring the Urban Tree Cover for Urban Ecosystem Services - The Case of Leipzig, Germany

    NASA Astrophysics Data System (ADS)

    Banzhaf, E.; Kollai, H.

    2015-04-01

    Urban dynamics such as (extreme) growth and shrinkage bring about fundamental challenges for urban land use and related changes. In order to achieve sustainable urban development, it is crucial to monitor urban green infrastructure at the microscale level, as it provides various urban ecosystem services in neighbourhoods, supporting quality of life and environmental health. We monitor urban trees by means of multiple data sets to obtain detailed knowledge of their distribution and change over a decade for the entire city, using digital orthophotos, a digital elevation model and a digital surface model. The refined knowledge of the absolute height above ground helps to differentiate tree tops. Based on an object-based image analysis scheme, a detailed mapping of trees in the urbanized environment is produced. Results show high accuracy of tree detection and avoidance of misclassification due to shadows. The study area is the City of Leipzig, Germany. One of the leading German cities, it is home to contiguous community allotments that characterize the configuration of the city. Leipzig also has one of the most well-preserved floodplain forests in Europe.

  7. A Flexible and Wearable Human Stress Monitoring Patch

    PubMed Central

    Yoon, Sunghyun; Sim, Jai Kyoung; Cho, Young-Ho

    2016-01-01

    A human stress monitoring patch integrates three sensors for skin temperature, skin conductance, and pulse wave in the size of a stamp (25 mm × 15 mm × 72 μm) in order to enhance wearing comfort with a small skin contact area and high flexibility. The skin contact area is minimized through the invention of an integrated multi-layer structure and the associated microfabrication process, being reduced to 1/125 of that of conventional single-layer multiple sensors. The patch flexibility is increased mainly by the development of a flexible pulse-wave sensor, made of a flexible piezoelectric membrane supported by a perforated polyimide membrane. In the human physiological range, the fabricated stress patch measures skin temperature with a sensitivity of 0.31 Ω/°C, skin conductance with a sensitivity of 0.28 μV/0.02 μS, and pulse wave with a response time of 70 ms. The skin-attachable stress patch, capable of detecting multimodal bio-signals, shows potential for application to wearable emotion monitoring. PMID:27004608

  8. Wearable Technology for Chronic Wound Monitoring: Current Dressings, Advancements, and Future Prospects.

    PubMed

    Brown, Matthew S; Ashley, Brandon; Koh, Ahyeon

    2018-01-01

    Chronic non-healing wounds challenge tissue regeneration and impair infection regulation for patients afflicted with this condition. Next generation wound care technology capable of in situ physiological surveillance which can diagnose wound parameters, treat various chronic wound symptoms, and reduce infection at the wound noninvasively with the use of a closed loop therapeutic system would provide patients with an improved standard of care and an accelerated wound repair mechanism. The indicating biomarkers specific to chronic wounds include blood pressure, temperature, oxygen, pH, lactate, glucose, interleukin-6 (IL-6), and infection status. A wound monitoring device would help decrease prolonged hospitalization, multiple doctors' visits, and the expensive lab testing associated with the diagnosis and treatment of chronic wounds. A device capable of monitoring the wound status and stimulating the healing process is highly desirable. In this review, we discuss the impaired physiological states of chronic wounds and explain the current treatment methods. Specifically, we focus on improvements in materials, platforms, fabrication methods for wearable devices, and quantitative analysis of various biomarkers vital to wound healing progress.

  9. Wearable Technology for Chronic Wound Monitoring: Current Dressings, Advancements, and Future Prospects

    PubMed Central

    Brown, Matthew S.; Ashley, Brandon; Koh, Ahyeon

    2018-01-01

    Chronic non-healing wounds challenge tissue regeneration and impair infection regulation for patients afflicted with this condition. Next generation wound care technology capable of in situ physiological surveillance which can diagnose wound parameters, treat various chronic wound symptoms, and reduce infection at the wound noninvasively with the use of a closed loop therapeutic system would provide patients with an improved standard of care and an accelerated wound repair mechanism. The indicating biomarkers specific to chronic wounds include blood pressure, temperature, oxygen, pH, lactate, glucose, interleukin-6 (IL-6), and infection status. A wound monitoring device would help decrease prolonged hospitalization, multiple doctors' visits, and the expensive lab testing associated with the diagnosis and treatment of chronic wounds. A device capable of monitoring the wound status and stimulating the healing process is highly desirable. In this review, we discuss the impaired physiological states of chronic wounds and explain the current treatment methods. Specifically, we focus on improvements in materials, platforms, fabrication methods for wearable devices, and quantitative analysis of various biomarkers vital to wound healing progress. PMID:29755977

  10. A comparative evaluation of adaptive noise cancellation algorithms for minimizing motion artifacts in a forehead-mounted wearable pulse oximeter.

    PubMed

    Comtois, Gary; Mendelson, Yitzhak; Ramuka, Piyush

    2007-01-01

    Wearable physiological monitoring using a pulse oximeter would enable field medics to monitor multiple casualties simultaneously, thereby prioritizing medical intervention when resources are limited. However, a primary factor limiting the accuracy of pulse oximetry is poor signal-to-noise ratio, since the photoplethysmographic (PPG) signals from which arterial oxygen saturation (SpO2) and heart rate (HR) measurements are derived are compromised by movement artifacts. This study was undertaken to quantify SpO2 and HR errors induced by certain motion artifacts utilizing accelerometry-based adaptive noise cancellation (ANC). Since the fingers are generally more vulnerable to motion artifacts, measurements were performed using a custom forehead-mounted wearable pulse oximeter developed for real-time remote physiological monitoring and triage applications. This study revealed that processing motion-corrupted PPG signals with least mean squares (LMS) and recursive least squares (RLS) algorithms can be effective in reducing SpO2 and HR errors during jogging, but the degree of improvement depends on filter order. Although both algorithms produced similar improvements, implementing the adaptive LMS algorithm is advantageous since it requires significantly fewer operations.
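    The core of accelerometry-based adaptive noise cancellation can be sketched as a plain LMS filter that uses the accelerometer as the noise reference, as below. The filter order, step size, and synthetic signals are illustrative assumptions; the study's actual RLS variant and evaluation protocol are not reproduced.

```python
# Minimal LMS adaptive noise canceller: an accelerometer signal serves as the noise
# reference to clean a motion-corrupted PPG. All signals here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
fs = 100
t = np.arange(0, 20, 1 / fs)
ppg_clean = np.sin(2 * np.pi * 1.5 * t)                               # ~90 bpm pulse
accel = np.sin(2 * np.pi * 2.5 * t) + 0.2 * rng.normal(size=t.size)   # motion reference
ppg_corrupted = ppg_clean + 0.8 * np.convolve(accel, [0.5, 0.3, 0.2], mode="same")

def lms(reference, primary, order=8, mu=0.01):
    w = np.zeros(order)
    out = np.zeros_like(primary)
    for n in range(order, len(primary)):
        x = reference[n - order:n][::-1]        # most recent reference samples
        noise_est = w @ x
        out[n] = primary[n] - noise_est         # error = cleaned PPG estimate
        w += 2 * mu * out[n] * x                # LMS weight update
    return out

ppg_cleaned = lms(accel, ppg_corrupted)
print("residual error vs clean PPG:",
      np.sqrt(np.mean((ppg_cleaned[800:] - ppg_clean[800:]) ** 2)))
```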

  11. URBAN-NET: A Network-based Infrastructure Monitoring and Analysis System for Emergency Management and Public Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Sangkeun; Chen, Liangzhe; Duan, Sisi

    Critical Infrastructures (CIs) such as energy, water, and transportation are complex networks that are crucial for sustaining day-to-day commodity flows vital to national security, economic stability, and public safety. The nature of these CIs is such that failures caused by an extreme weather event or a man-made incident can trigger widespread cascading failures, sending ripple effects at regional or even national scales. To minimize such effects, it is critical for emergency responders to identify existing or potential vulnerabilities within CIs during such stressor events in a systematic and quantifiable manner and take appropriate mitigating actions. We present here a novel critical infrastructure monitoring and analysis system named URBAN-NET. The system includes a software stack and tools for monitoring CIs, pre-processing data, interconnecting multiple CI datasets as a heterogeneous network, identifying vulnerabilities through graph-based topological analysis, and predicting consequences based on what-if simulations along with visualization. As a proof-of-concept, we present several case studies to show the capabilities of our system. We also discuss remaining challenges and future work.

  12. A flight expert system (FLES) for on-board fault monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Ali, Moonis; Scharnhorst, D. A.; Ai, C. S.; Feber, H. J.

    1987-01-01

    The increasing complexity of modern aircraft creates a need for a larger number of caution and warning devices. But more alerts require more memorization and higher workloads for the pilot and tend to induce a higher probability of errors. Therefore, an architecture for a flight expert system (FLES) has been developed to assist pilots in monitoring, diagnosing and recovering from in-flight faults. A prototype of FLES has been implemented. A sensor simulation model was developed and employed to provide FLES with airplane status information during the diagnostic process. The simulator is based on the Lockheed Advanced Concept System (ACS), a future-generation airplane, and on the Boeing 737. A distinction between two types of faults, maladjustments and malfunctions, has led to two approaches to fault diagnosis. These approaches are evident in two FLES subsystems: the flight phase monitor and the sensor interrupt handler. The specific problem addressed in these subsystems has been that of integrating information received from multiple sensors with domain knowledge in order to assess abnormal situations during airplane flight. Malfunctions and maladjustments are handled separately and diagnosed using domain knowledge.

  13. General Use of UAS in EW Environment-EW Concepts and Tactics for Single or Multiple UAS Over the Net-Centric Battlefield

    DTIC Science & Technology

    2009-09-01

    Erdemli, Mustafa Gokhan. Naval Postgraduate School, Monterey, CA 93943-5000. (Only report documentation metadata is available for this record; no abstract was recovered.)

  14. Immobilized Metal Affinity Chromatography Coupled to Multiple Reaction Monitoring Enables Reproducible Quantification of Phospho-signaling*

    PubMed Central

    Kennedy, Jacob J.; Yan, Ping; Zhao, Lei; Ivey, Richard G.; Voytovich, Uliana J.; Moore, Heather D.; Lin, Chenwei; Pogosova-Agadjanyan, Era L.; Stirewalt, Derek L.; Reding, Kerryn W.; Whiteaker, Jeffrey R.; Paulovich, Amanda G.

    2016-01-01

    A major goal in cell signaling research is the quantification of phosphorylation pharmacodynamics following perturbations. Traditional methods of studying cellular phospho-signaling measure one analyte at a time with poor standardization, rendering them inadequate for interrogating network biology and contributing to the irreproducibility of preclinical research. In this study, we test the feasibility of circumventing these issues by coupling immobilized metal affinity chromatography (IMAC)-based enrichment of phosphopeptides with targeted, multiple reaction monitoring (MRM) mass spectrometry to achieve precise, specific, standardized, multiplex quantification of phospho-signaling responses. A multiplex immobilized metal affinity chromatography-multiple reaction monitoring assay targeting phospho-analytes responsive to DNA damage was configured, analytically characterized, and deployed to generate phospho-pharmacodynamic curves from primary and immortalized human cells experiencing genotoxic stress. The multiplexed assays demonstrated linear ranges of ≥3 orders of magnitude, median lower limit of quantification of 0.64 fmol on column, median intra-assay variability of 9.3%, median inter-assay variability of 12.7%, and median total CV of 16.0%. The multiplex immobilized metal affinity chromatography-multiple reaction monitoring assay enabled robust quantification of 107 DNA damage-responsive phosphosites from human cells following DNA damage. The assays have been made publicly available as a resource to the community. The approach is generally applicable, enabling wide interrogation of signaling networks. PMID:26621847

  15. Efficient and automatic image reduction framework for space debris detection based on GPU technology

    NASA Astrophysics Data System (ADS)

    Diprima, Francesco; Santoni, Fabio; Piergentili, Fabrizio; Fortunato, Vito; Abbattista, Cristoforo; Amoruso, Leonardo

    2018-04-01

    In recent years, the increasing number of space debris objects has triggered the need for a distributed monitoring system for the prevention of possible space collisions. Space surveillance based on ground telescopes allows monitoring of the traffic of Resident Space Objects (RSOs) in Earth orbit. This space debris surveillance has several applications, such as orbit prediction and conjunction assessment. This paper proposes an optimized, performance-oriented pipeline for source extraction intended for the automatic detection of space debris in optical data. The detection method is based on morphological operations and the Hough transform for lines. Near real-time detection is obtained using General Purpose computing on Graphics Processing Units (GPGPU). The high degree of processing parallelism provided by GPGPU allows data analysis to be split over thousands of threads in order to process big datasets within a limited computational time. The implementation has been tested on a large and heterogeneous image data set containing satellites from different orbit ranges imaged in multiple observation modes (i.e., sidereal and object tracking). These images were taken during an observation campaign performed from the EQUO (EQUatorial Observatory) observatory located at the Broglio Space Center (BSC) in Kenya, which is part of the ASI-Sapienza Agreement.
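    A hedged sketch of the detection idea (morphological cleanup followed by a probabilistic Hough transform to pick up linear streaks) is shown below using OpenCV on a synthetic frame; all thresholds, kernel sizes, and the image itself are assumptions, and the paper's GPU implementation is not reproduced.

```python
# Sketch: threshold, morphological opening to remove isolated noise, then a probabilistic
# Hough transform; point-like stars are rejected by the minimum line length.
import numpy as np
import cv2

# synthetic 8-bit frame: background noise, a few "stars", and one faint streak
rng = np.random.default_rng(3)
img = rng.normal(20, 5, (256, 256)).clip(0, 255).astype(np.uint8)
for y, x in [(40, 60), (200, 180), (120, 30)]:
    cv2.circle(img, (x, y), 2, 255, -1)                      # point sources (stars)
cv2.line(img, (30, 220), (220, 60), 120, 3)                  # debris streak

_, binary = cv2.threshold(img, 60, 255, cv2.THRESH_BINARY)
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)   # suppress isolated noise

lines = cv2.HoughLinesP(cleaned, 1, np.pi / 180, 30,
                        minLineLength=40, maxLineGap=10)
print("candidate streaks:", 0 if lines is None else len(lines))
```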

  16. Detailed Characterization of Human Induced Pluripotent Stem Cells Manufactured for Therapeutic Applications.

    PubMed

    Baghbaderani, Behnam Ahmadian; Syama, Adhikarla; Sivapatham, Renuka; Pei, Ying; Mukherjee, Odity; Fellner, Thomas; Zeng, Xianmin; Rao, Mahendra S

    2016-08-01

    We have recently described manufacturing of human induced pluripotent stem cells (iPSC) master cell banks (MCB) generated by a clinically compliant process using cord blood as a starting material (Baghbaderani et al. in Stem Cell Reports, 5(4), 647-659, 2015). In this manuscript, we describe the detailed characterization of the two iPSC clones generated using this process, including whole genome sequencing (WGS), microarray, and array comparative genomic hybridization (aCGH)/single nucleotide polymorphism (SNP) analysis. We compare their profiles with a proposed calibration material and with a reporter subclone and lines made by a similar process from different donors. We believe that iPSCs are likely to be used to make multiple clinical products. We further believe that the lines used as input material will be used at different sites and, given their immortal status, will be used for many years or even decades. Therefore, it will be important to develop assays to monitor the state of the cells and their drift in culture. We suggest that a detailed characterization of the initial status of the cells, a comparison with some calibration material, and the development of reporter subclones will help determine which set of tests will be most useful in monitoring the cells and establishing criteria for discarding a line.

  17. Simultaneous excitation system for efficient guided wave structural health monitoring

    NASA Astrophysics Data System (ADS)

    Hua, Jiadong; Michaels, Jennifer E.; Chen, Xin; Lin, Jing

    2017-10-01

    Many structural health monitoring systems utilize guided wave transducer arrays for defect detection and localization. Signals are usually acquired using the "pitch-catch" method whereby each transducer is excited in turn and the response is received by the remaining transducers. When extensive signal averaging is performed, the data acquisition process can be quite time-consuming, especially for metallic components that require a low repetition rate to allow signals to die out. Such a long data acquisition time is particularly problematic if environmental and operational conditions are changing while data are being acquired. To reduce the total data acquisition time, proposed here is a methodology whereby multiple transmitters are simultaneously triggered, and each transmitter is driven with a unique excitation. The simultaneously transmitted waves are captured by one or more receivers, and their responses are processed by dispersion-compensated filtering to extract the response from each individual transmitter. The excitation sequences are constructed by concatenating a series of chirps whose start and stop frequencies are randomly selected from a specified range. The process is optimized using a Monte-Carlo approach to select sequences with impulse-like autocorrelations and relatively flat cross-correlations. The efficacy of the proposed methodology is evaluated by several metrics and is experimentally demonstrated with sparse array imaging of simulated damage.
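
    The sketch below illustrates the general idea of the Monte-Carlo sequence selection described above: random concatenated chirps are scored by their autocorrelation peak-to-sidelobe ratio and by their peak cross-correlation, and the best-scoring pair is kept. The sampling rate, chirp lengths, frequency range, and scoring heuristic are assumptions chosen for illustration, not the authors' parameters.

    ```python
    import numpy as np
    from scipy.signal import chirp

    rng = np.random.default_rng(0)
    FS = 1e6                      # sampling rate, Hz (illustrative)
    CHIRP_LEN = 200               # samples per chirp segment
    N_CHIRPS = 8                  # chirps concatenated per excitation sequence
    F_MIN, F_MAX = 50e3, 350e3    # range for random start/stop frequencies

    def random_sequence():
        """Concatenate chirps with randomly drawn start/stop frequencies."""
        t = np.arange(CHIRP_LEN) / FS
        segments = []
        for _ in range(N_CHIRPS):
            f0, f1 = rng.uniform(F_MIN, F_MAX, size=2)
            segments.append(chirp(t, f0=f0, t1=t[-1], f1=f1))
        return np.concatenate(segments)

    def peak_sidelobe_ratio(x):
        """Zero-lag autocorrelation peak divided by the largest sidelobe."""
        ac = np.abs(np.correlate(x, x, mode="full"))
        center = len(ac) // 2
        return ac[center] / np.delete(ac, center).max()

    def normalized_max_xcorr(x, y):
        """Peak cross-correlation, normalized by the signal energies."""
        xc = np.abs(np.correlate(x, y, mode="full")).max()
        return xc / (np.linalg.norm(x) * np.linalg.norm(y))

    # Monte-Carlo search: reward impulse-like autocorrelations, penalize
    # strong cross-correlation between the two excitation sequences.
    best_score, best_pair = -np.inf, None
    for _ in range(200):
        a, b = random_sequence(), random_sequence()
        score = min(peak_sidelobe_ratio(a), peak_sidelobe_ratio(b)) \
                - normalized_max_xcorr(a, b)
        if score > best_score:
            best_score, best_pair = score, (a, b)
    ```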

  18. Integrating hydrologic and geophysical data to constrain coastal surficial aquifer processes at multiple spatial and temporal scales

    USGS Publications Warehouse

    Schultz, Gregory M.; Ruppel, Carolyn; Fulton, Patrick; Hyndman, David W.; Day-Lewis, Frederick D.; Singha, Kamini

    2007-01-01

    Since 1997, repeated, coincident geophysical surveys and extensive hydrologic studies in shallow monitoring wells have been used to study static and dynamic processes associated with surface water-groundwater interaction at a range of spatial scales at the estuarine and ocean boundaries of an undeveloped, permeable barrier island in the Georgia part of the U.S. South Atlantic Bight. Because geophysical and hydrologic data measure different parameters, at different resolution and precision, and over vastly different spatial scales, reconciling or even combining the coincident data sets is challenging. Here, complementary geophysical inversion, hydrogeochemical analyses, well-based groundwater monitoring, and, in some cases, limited vegetation mapping are integrated to demonstrate the utility of an integrative, multidisciplinary approach for elucidating groundwater processes at spatial scales (tens to thousands of meters) that are often difficult to capture with traditional hydrologic approaches. The case studies highlight regional aquifer characteristics, varying degrees of lateral saltwater intrusion at estuarine boundaries, complex subsurface salinity gradients at the ocean boundary, and imaging of submarsh groundwater discharge and possible free convection in the pore waters of a clastic marsh. This study also documents the use of geophysical techniques for detecting temporal changes in groundwater salinity regimes under natural (not forced) gradients at intratidal to interannual (1998-2002 Southeastern U.S.A. drought) time scales.

  19. Landslide Life-Cycle Monitoring and Failure Prediction using Satellite Remote Sensing

    NASA Astrophysics Data System (ADS)

    Bouali, E. H. Y.; Oommen, T.; Escobar-Wolf, R. P.

    2017-12-01

    The consequences of slope instability are severe across the world: the US Geological Survey estimates that, each year, the United States spends $3.5B to repair damage caused by landslides, 25-50 deaths occur, real estate values in affected areas are reduced, productivity decreases, and natural environments are destroyed. A 2012 study by D.N. Petley found that loss of life is typically underestimated and that, between 2004 and 2010, 2,620 fatal landslides caused 32,322 deaths around the world. These statistics have motivated research into landslide monitoring and forecasting. More specifically, this presentation focuses on assessing the potential of satellite-based optical and radar imagery for landslide life-cycle monitoring and failure prediction. Radar images from multiple satellites (ERS-1, ERS-2, ENVISAT, and COSMO-SkyMed) are processed using the Persistent Scatterer Interferometry (PSI) technique. Optical images from the Worldview-2 satellite are orthorectified and processed using the Co-registration of Optically Sensed Images and Correlation (COSI-Corr) algorithm. Both approaches process stacks of the respective images and yield ground displacement rates. The ground displacement information is used to generate 'inverse-velocity versus time' plots, a proxy used to estimate the time of landslide occurrence (slope failure); it derives from the relationship between a material's time of failure and the strain rate applied to it, quantified by T. Fukuzono in 1985 and B. Voight in 1988. Successful laboratory tests have demonstrated the usefulness of 'inverse-velocity versus time' plots. This presentation investigates the applicability of this approach, using remote sensing, to natural landslides in the western United States.
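
    A minimal sketch of the inverse-velocity extrapolation described above follows: assuming the commonly used case in which 1/v decreases approximately linearly with time, a line fitted to the most recent part of a hypothetical velocity record is extrapolated to 1/v = 0 to estimate the failure time. The velocity values and the fitting window are illustrative assumptions, not data from the presentation.

    ```python
    import numpy as np

    # Hypothetical ground-displacement velocities (mm/day) from a PSI or
    # COSI-Corr time series as a slope accelerates toward failure.
    t = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)   # days
    v = np.array([0.5, 0.6, 0.8, 1.1, 1.7, 3.0, 7.5])         # mm/day

    # Fukuzono/Voight: in the commonly assumed case, inverse velocity 1/v
    # decreases roughly linearly in time and reaches zero at failure.
    inv_v = 1.0 / v

    # Fit a line to the most recent (accelerating) portion of the record and
    # extrapolate it to 1/v = 0 to estimate the time of failure.
    slope, intercept = np.polyfit(t[-4:], inv_v[-4:], 1)
    t_failure = -intercept / slope

    print(f"predicted failure time: day {t_failure:.1f}")
    ```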

  20. Advances in Materials for Recent Low-Profile Implantable Bioelectronics

    PubMed Central

    Kim, Yun-Soung; Tillman, Bryan W.; Chun, Youngjae

    2018-01-01

    The rapid development of micro/nanofabrication technologies for engineering a variety of materials has enabled new types of bioelectronics for health monitoring and disease diagnostics. In this review, we summarize electronic materials widely used in recent low-profile implantable systems, including traditional metals and semiconductors, soft polymers, biodegradable metals, and organic materials. Silicon-based compounds have represented the traditional materials in medical devices, owing to their fully established fabrication processes. Examples include miniaturized sensors for monitoring intraocular pressure and blood pressure, which are built around an ultra-thin diaphragm that responds to the applied pressure. These sensors are integrated into rigid circuits and multiple modules, which creates challenges due to the fundamental mismatch between the materials' properties and those of the targeted human tissues, which are intrinsically soft. Therefore, many polymeric materials have been investigated for hybrid integration with well-characterized functional materials such as silicon membranes and metal interconnects, enabling soft implantable bioelectronics. The most recent trend in implantable systems is the use of transient materials that naturally dissolve in body fluid after a programmed lifetime; such biodegradable metallic materials are advantageous for electronics design because of their proven electrical properties. Collectively, this review traces the development history of materials in implantable devices and introduces new bioelectronics based on bioresorbable materials with multiple functionalities. PMID:29596359
