Hybrid monitoring scheme for end-to-end performance enhancement of multicast-based real-time media
NASA Astrophysics Data System (ADS)
Park, Ju-Won; Kim, JongWon
2004-10-01
As real-time media applications based on IP multicast networks spread widely, end-to-end QoS (quality of service) provisioning for these applications has become very important. To guarantee the end-to-end QoS of multi-party media applications, it is essential to monitor the time-varying status of both network metrics (i.e., delay, jitter, and loss) and system metrics (i.e., CPU and memory utilization). In this paper, targeting the multicast-enabled AG (Access Grid), a next-generation group collaboration tool based on multi-party media services, the applicability of a hybrid monitoring scheme that combines active and passive monitoring is investigated. The active monitoring measures network-layer metrics (i.e., network condition) with probe packets, while the passive monitoring checks both application-layer metrics (i.e., user traffic condition, by analyzing RTCP packets) and system metrics. By comparing these hybrid results, we attempt to pinpoint the causes of performance degradation and explore corresponding reactions to improve the end-to-end performance. The experimental results show that the proposed hybrid monitoring can provide useful information to coordinate the performance improvement of multi-party real-time media applications.
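As an illustration of the comparison step described above, the following is a minimal sketch (not the authors' implementation) of how active-probe network metrics and passively observed RTCP/system metrics might be combined to guess where a quality drop originates; the metric names and thresholds are illustrative assumptions.

```python
# Minimal sketch: combine active-probe network metrics with passive
# RTCP/system metrics to localize the cause of degradation.
# Thresholds below are illustrative assumptions, not from the paper.

def diagnose(active, passive):
    """active: dict with probe-measured 'loss' (%) and 'jitter' (ms);
    passive: dict with RTCP-reported 'loss' (%) and receiver 'cpu' (%)."""
    if active["loss"] > 1.0 or active["jitter"] > 30.0:
        return "network path degradation (also seen by probes)"
    if passive["loss"] > 1.0:
        return "application-level loss despite a healthy path: check end systems"
    if passive["cpu"] > 90.0:
        return "receiver overload: reduce decoding load or media rate"
    return "no action needed"

print(diagnose({"loss": 0.2, "jitter": 5.0}, {"loss": 3.5, "cpu": 95.0}))
```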
Do prospective workday appraisals influence end-of-workday affect and self-monitored performance?
Grawitch, Matthew J; Granda, Stephanie E; Barber, Larissa K
2008-10-01
The current study uses self-regulation as the basis for a model that examines the influence of three types of workday appraisals (resource, task, and response). At the beginning of their workday, a total of 170 faculty, graduate students, and staff of a university completed appraisal ratings of their anticipated workday tasks, resources, and responses. At the end of the workday, they completed assessments of positive and negative affect and self-monitored performance. Results suggested that resource appraisals of control and skills were predictive of task appraisals of difficulty, threat, and ambiguity. Task appraisals were then predictive of both response appraisals, in terms of anticipated support and effort, and self-monitored performance at the end of the day. Anticipated effort and self-monitored performance were both positively related to positive affect at the end of the day. Anticipated support and self-monitored performance were both negatively related to negative affect at the end of the day, while threat task appraisals were positively related to negative affect. Implications of the results for workplace interventions are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fishbaugher, M. J.
1985-05-01
The decreasing cost of microcomputers, along with improvements in power metering circuitry, has changed the way in which electrical energy use is monitored. Although utilities still rely on kilowatt-hour (kWh) meters for billing purposes, a microcomputer-based monitoring system is used when greater temporal and end-use resolution is desired. Because these types of monitoring systems will be used increasingly in large-scale conservation and end-use studies, it is important that their performance be analyzed to determine their accuracy. A co-instrumentation test was devised in which two such microcomputer-based monitoring systems made simultaneous measurements of electrical end-uses in two commercial buildings. The analysis of the co-instrumentation data aids in the evaluation of microcomputer-based monitoring systems used for end-use measurements. Separate and independent data loggers were used to measure the same loads simultaneously. In addition to these two systems, a utility billing meter measured the total energy use in each building during the co-instrumentation test. The utility's meters provided a relatively accurate standard by which the performance of both loggers could be judged. The comparison between the SCL and PNL microcomputer-based loggers has shown that power measurement techniques directly affect system performance. The co-instrumentation test has shown that there are certain standards that a monitoring system must meet if it is to perform well. First, it is essential to calibrate a microcomputer-based logger against a known standard load before the system is installed. Second, a microcomputer-based system must have some way of accounting for power factors. Recent advances in power metering circuitry have made it relatively easy to apply these power factors automatically in real time.
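A brief numeric reminder of why accounting for power factor matters when a logger samples voltage and current; the values below are purely illustrative.

```python
import math

# Real power drawn by a load is P = V_rms * I_rms * cos(phi); a logger that
# records only V_rms * I_rms (apparent power) overstates energy use for
# loads with a poor power factor. Values are illustrative only.
v_rms, i_rms, phase_deg = 120.0, 8.0, 35.0
apparent_va = v_rms * i_rms
real_w = apparent_va * math.cos(math.radians(phase_deg))
print(f"apparent: {apparent_va:.0f} VA, real: {real_w:.0f} W")
```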
Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques
NASA Technical Reports Server (NTRS)
Calanche, Bruno J.
1994-01-01
The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL Flight Projects. In particular, SPAT Telecommunication's role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the ground-generated CRC flags on received spacecraft telemetry minor frames and the NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 × 10^-8. This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
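A sketch of the kind of bookkeeping the abstract describes: counting CRC flags and NASCOM block error flags per link segment and converting the counts to an approximate bit error rate. The frame and block sizes are assumptions, not TOPEX/Poseidon values.

```python
# Sketch of per-segment BER estimation from error flags.
FRAME_BITS = 8192      # bits per telemetry minor frame (assumed)
BLOCK_BITS = 4800 * 8  # bits per NASCOM block (assumed)

def ber(flagged, total, bits_per_unit):
    """Approximate BER: assume each flagged unit holds at least one bad bit."""
    return flagged / (total * bits_per_unit) if total else float("nan")

space_to_ground = ber(flagged=3, total=500_000, bits_per_unit=FRAME_BITS)
ground_segment = ber(flagged=1, total=200_000, bits_per_unit=BLOCK_BITS)
print(f"space-to-ground BER ~ {space_to_ground:.2e}")
print(f"ground segment BER  ~ {ground_segment:.2e}")
```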
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, W.
2000-02-22
Modern High Energy Nuclear and Particle Physics (HENP) experiments at laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15 bytes) or exabytes (10^18 bytes) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at universities and institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the PingER project has been collecting data on ping packet loss and round trip times. As of January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermilab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently, several HENP sites have become involved with NLANR's active measurement program (AMP). In addition, SLAC and CERN are part of the RIPE test-traffic project and SLAC is home to a NIMI machine. This large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long-term trends and closely examine short-term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results from each project will be compared and disagreements will be analyzed. The goal is to address issues for improving the gathering and analysis of accurate monitoring data, but the outlook for the computing goals of HENP will also be examined.
User-level framework for performance monitoring of HPC applications
NASA Astrophysics Data System (ADS)
Hristova, R.; Goranov, G.
2013-10-01
HP-SEE is an infrastructure that links the existing HPC facilities in South East Europe into a common infrastructure. Analysis of the performance monitoring of High-Performance Computing (HPC) applications in the infrastructure can be useful to the end user as a diagnostic for the overall performance of his applications. The existing monitoring tools for HP-SEE provide the end user with only aggregated information for all applications. Usually, the user does not have permission to select only the information relevant to him and to his applications. In this article we present a framework for performance monitoring of the HPC applications in the HP-SEE infrastructure. The framework provides standardized performance metrics, which every user can use in order to monitor his applications. Furthermore, as part of the framework, a program interface is developed. The interface allows the user to publish metrics data from his application and to read and analyze the gathered information. Publishing and reading through the framework is possible only with a grid certificate valid for the infrastructure. Therefore, the user is authorized to access only the data for his own applications.
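The abstract does not give the program interface itself; the sketch below only illustrates the described pattern (certificate-authenticated publish and read of metric data) using the standard requests library. The endpoint URL, payload fields, and certificate paths are hypothetical.

```python
import requests

# Hypothetical endpoint and payload layout; only the pattern described in
# the abstract (grid-certificate-authenticated publish/read) is shown.
BASE_URL = "https://metrics.example.org/api"                   # assumed
CERT = ("/home/user/usercert.pem", "/home/user/userkey.pem")   # user grid cert/key

def publish_metric(app_id, name, value):
    payload = {"application": app_id, "metric": name, "value": value}
    r = requests.post(f"{BASE_URL}/metrics", json=payload, cert=CERT, timeout=10)
    r.raise_for_status()

def read_metrics(app_id):
    r = requests.get(f"{BASE_URL}/metrics", params={"application": app_id},
                     cert=CERT, timeout=10)
    r.raise_for_status()
    return r.json()

publish_metric("my-hpc-app", "wallclock_seconds", 1234.5)
print(read_metrics("my-hpc-app"))
```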
NASA Technical Reports Server (NTRS)
Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary
1996-01-01
We have developed and deployed a distributed-parallel storage system (DPSS) in several high-speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high-speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large-scale, high-speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high-speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
Ali, Saqib; Wang, Guojun; Cottrell, Roger Leslie; ...
2018-05-28
Internet performance is highly correlated with key economic development metrics of a region. According to the World Bank, the economic growth of a country increases 1.3% with a 10% increase in the speed of the Internet. Therefore, it is necessary to monitor and understand the performance of the Internet links in the region. It helps to identify infrastructural inefficiencies, poor resource allocation, and routing issues in the region. Moreover, it provides useful suggestions for future upgrades. Therefore, the objective of this paper is to understand the Internet performance and routing infrastructure of South Asian countries in comparison to the developed world and neighboring countries using end-to-end Internet performance measurements. The South Asian countries comprise nearly 32% of the Internet users in Asia and nearly 16% of the world. The Internet performance metrics in the region are collected through the PingER framework. The framework was developed by the SLAC National Accelerator Laboratory, USA and has been running for the last 20 years. PingER has 16 monitoring nodes in the region, and over the last year PingER has monitored about 40 sites in South Asia using the ubiquitous ping facility. The collected data is used to estimate the key Internet performance metrics of South Asian countries. The performance metrics are compared with the neighboring countries and the developed world. In particular, the TCP throughput of the countries is also correlated with different development indices. Further, worldwide Internet connectivity and routing patterns of the countries are investigated to identify inconsistencies in the region. Furthermore, the performance analysis revealed that the South Asia region is 7-10 years behind the developed regions of North America (USA and Canada), Europe, and East Asia.
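PingER-style analyses commonly estimate TCP throughput from measured RTT and packet loss via the Mathis et al. approximation; the abstract does not spell out its formula, so the snippet below is only an illustrative sketch with made-up per-country numbers and a simple correlation against a development index.

```python
import math
from statistics import correlation  # Python 3.10+

# Mathis et al. approximation: throughput <= (MSS / RTT) * (1 / sqrt(loss)).
def tcp_throughput_bps(mss_bytes, rtt_s, loss):
    return (mss_bytes * 8.0 / rtt_s) * (1.0 / math.sqrt(loss))

# Illustrative (made-up) per-country measurements and development index values.
rtt = [0.280, 0.190, 0.310, 0.150]     # seconds
loss = [0.020, 0.008, 0.035, 0.004]    # fraction of packets lost
dev_index = [0.55, 0.64, 0.50, 0.71]

thr = [tcp_throughput_bps(1460, r, l) for r, l in zip(rtt, loss)]
print("estimated throughputs (Mbit/s):", [round(t / 1e6, 2) for t in thr])
print("correlation with development index:", round(correlation(thr, dev_index), 2))
```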
Correlation analysis on real-time tab-delimited network monitoring data
Pan, Aditya; Majumdar, Jahin; Bansal, Abhay; ...
2016-01-01
End-to-end performance monitoring of the Internet, also called PingER, is part of SLAC National Accelerator Laboratory's research program. It was created to answer the growing need to monitor the network, both to analyze current performance and to allocate resources to optimize communication between research centers and the universities and institutes co-operating on present and future operations. The monitoring support reflects the broad geographical area of the collaborations and requires numerous research and funding channels. The data retrieval architecture and the interpretation methodology have evolved over many years. Analyzing this data is the main challenge due to its high volume. Finally, by using correlation analysis, we can draw crucial conclusions about how the network data affects the performance of the hosts and how it varies from country to country.
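A minimal example of the kind of analysis described: loading tab-delimited monitoring measurements and computing correlation matrices with pandas. The file name and column names are assumptions; real PingER exports differ.

```python
import pandas as pd

# Column names are assumed for illustration; real PingER exports differ.
df = pd.read_csv("pinger_data.tsv", sep="\t")

# Correlate the numeric network metrics (e.g. RTT, loss, jitter, throughput).
metrics = df.select_dtypes("number")
print(metrics.corr(method="pearson"))

# Per-country view: how average RTT and loss relate across monitored hosts.
by_country = df.groupby("country")[["avg_rtt_ms", "packet_loss_pct"]].mean()
print(by_country.corr())
```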
Performance monitoring for brain-computer-interface actions.
Schurger, Aaron; Gale, Steven; Gozel, Olivia; Blanke, Olaf
2017-02-01
When presented with a difficult perceptual decision, human observers are able to make metacognitive judgements of subjective certainty. Such judgements can be made independently of and prior to any overt response to a sensory stimulus, presumably via internal monitoring. Retrospective judgements about one's own task performance, on the other hand, require first that the subject perform a task and thus could potentially be made based on motor processes, proprioceptive, and other sensory feedback rather than internal monitoring. With this dichotomy in mind, we set out to study performance monitoring using a brain-computer interface (BCI), with which subjects could voluntarily perform an action - moving a cursor on a computer screen - without any movement of the body, and thus without somatosensory feedback. Real-time visual feedback was available to subjects during training, but not during the experiment where the true final position of the cursor was only revealed after the subject had estimated where s/he thought it had ended up after 6s of BCI-based cursor control. During the first half of the experiment subjects based their assessments primarily on the prior probability of the end position of the cursor on previous trials. However, during the second half of the experiment subjects' judgements moved significantly closer to the true end position of the cursor, and away from the prior. This suggests that subjects can monitor task performance when the task is performed without overt movement of the body. Copyright © 2016 Elsevier Inc. All rights reserved.
Monitoring the CMS strip tracker readout system
NASA Astrophysics Data System (ADS)
Mersi, S.; Bainbridge, R.; Baulieu, G.; Bel, S.; Cole, J.; Cripps, N.; Delaere, C.; Drouhin, F.; Fulcher, J.; Giassi, A.; Gross, L.; Hahn, K.; Mirabito, L.; Nikolic, M.; Tkaczyk, S.; Wingham, M.
2008-07-01
The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m² and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks in order to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and possibly to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and describe its software components, which are already in place, the various monitoring streams available, and our experience of operating and monitoring a large-scale system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, Saqib; Wang, Guojun; Cottrell, Roger Leslie
PingER (Ping End-to-End Reporting) is a worldwide end-to-end Internet performance measurement framework. It was developed by the SLAC National Accelerator Laboratory, Stanford, USA, and has been running for the last 20 years. It has more than 700 monitoring agents and remote sites which monitor the performance of Internet links in around 170 countries of the world. At present, the size of the compressed PingER data set is about 60 GB, comprising 100,000 flat files. The data is publicly available for valuable Internet performance analyses. However, the data sets suffer from missing values and anomalies due to congestion, bottleneck links, queuing overflow, network software misconfiguration, hardware failure, cable cuts, and social upheavals. Therefore, the objective of this paper is to detect such performance drops or spikes, labeled as anomalies or outliers, in the PingER data set. In the proposed approach, the raw text files of the data set are transformed into a PingER dimensional model. The missing values are imputed using the k-NN algorithm. The data is partitioned into similar instances using the k-means clustering algorithm. Afterward, clustering is integrated with the Local Outlier Factor (LOF) using the Cluster-Based Local Outlier Factor (CBLOF) algorithm to detect the anomalies or outliers in the PingER data. Lastly, anomalies are further analyzed to identify the time frame and location of the hosts generating the major percentage of the anomalies in the PingER data set ranging from 1998 to 2016.
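A compact sketch of the described pipeline on synthetic data: k-NN imputation of missing values, k-means partitioning, and per-cluster outlier scoring. scikit-learn's LocalOutlierFactor is used here as a stand-in for the CBLOF step, so this approximates the paper's method rather than reproducing it.

```python
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.neighbors import LocalOutlierFactor

# X: rows are (host, time) observations; columns are metrics such as RTT,
# loss and jitter, with NaNs for missing values (synthetic data here).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
X[rng.random(X.shape) < 0.05] = np.nan           # inject missing values

X = KNNImputer(n_neighbors=5).fit_transform(X)   # step 1: k-NN imputation
X = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)  # step 2

# Step 3 (approximation of CBLOF): score outliers within each cluster with LOF.
outliers = np.zeros(len(X), dtype=bool)
for c in np.unique(labels):
    idx = np.where(labels == c)[0]
    if len(idx) < 10:                            # too small to score reliably
        continue
    lof = LocalOutlierFactor(n_neighbors=min(20, len(idx) - 1))
    outliers[idx] = lof.fit_predict(X[idx]) == -1
print("flagged anomalies:", outliers.sum())
```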
49 CFR 385.333 - What happens at the end of the 18-month safety monitoring period?
Code of Federal Regulations, 2010 CFR
2010-10-01
... SAFETY REGULATIONS SAFETY FITNESS PROCEDURES New Entrant Safety Assurance Program § 385.333 What happens at the end of the 18-month safety monitoring period? (a) If a safety audit has been performed within... the same basis as any other carrier. (d) If a safety audit or compliance review has not been performed...
AN INTERNET RACK MONITOR-CONTROLLER FOR APS LINAC RF ELECTRONICS UPGRADE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Hengjie; Smith, Terry; Nassiri, Alireza
To support research and development in the APS LINAC area, the existing LINAC rf control performance needs to be much improved, and thus an upgrade of the legacy LINAC rf electronics becomes necessary. The proposed upgrade plan centers on the concept of using a modern, network-attached, rack-mount digital electronics platform, the Internet Rack Monitor-Controller (IRMC), to achieve the goal of modernizing the rf electronics at a lower cost. The system model of the envisioned IRMC is basically a 3-tier stack with a high-performance DSP in the mid-layer to perform the core tasks of real-time rf data processing and controls. The Digital Front-End (DFE) attachment layer at the bottom bridges the application-specific rf front-ends to the DSP. A network communication gateway, together with an embedded event receiver (EVR) in the top layer, merges the Internet Rack Monitor-Controller node into the networks of the accelerator controls infrastructure. Although the concept is very much in trend with today's Internet-of-Things (IoT), this implementation has actually been used in accelerators for over two decades.
Performance Test of the Next Generation X-Ray Beam Position Monitor System for The APS Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, B.; Lee, S.; Westferro, F.
The Advanced Photon Source is developing its next major upgrade (APS-U) based on the multi-bend achromat lattice. Improved beam stability is critical for the upgrade and will require keeping short-time beam angle change below 0.25 µrad and long-term angle drift below 0.6 µrad. A reliable white x-ray beam diagnostic system in the front end will be a key part of the planned beam stabilization system. This system includes an x-ray beam position monitor (XBPM) based on x-ray fluorescence (XRF) from two specially designed GlidCop A-15 absorbers, a second XBPM using XRF photons from the Exit Mask, and two white beam intensity monitors using XRF from the photon shutter and Compton-scattered photons from the front end beryllium window or a retractable diamond film in windowless front ends. We present orbit stability data for the first XBPM used in the feedback control during user operations, as well as test data from the second XBPM and the intensity monitors. They demonstrate that the XBPM system meets APS-U beam stability requirements.
Web-Accessible Scientific Workflow System for Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roelof Versteeg; Trevor Rowe
2006-03-01
We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.
NASA Astrophysics Data System (ADS)
Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.
2005-12-01
Long-term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years INL staff has designed and implemented a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadose Zone Research Park and an alternative cover landfill. Implementations for other vadose zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing and WSDL-compliant web services for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending functionality (e.g. adding novel models) is relatively straightforward. As system access requires only a standard web browser and uses intuitive functionality, stakeholders with diverse degrees of technical insight can use this system with little or no training.
Decker, Andrew S; Cipriano, Gabriela C; Tsouri, Gill; Lavigne, Jill E
2016-04-25
Objective. To assess and improve student adherence to hand hygiene indications using radio frequency identification (RFID) enabled hand hygiene stations and performance report cards. Design. Students volunteered to wear RFID-enabled hospital employee nametags to monitor their adherence to hand-hygiene indications. After training in World Health Organization (WHO) hand hygiene methods and indications, students were instructed to treat the classroom as a patient care area. Report cards illustrating individual performance were distributed via e-mail to students at the middle and end of each 5-day observation period. Students were eligible for individual and team prizes consisting of Starbucks gift cards in $5 increments. Assessment. A hand hygiene station with an RFID reader and dispensing sensor recorded the nametag nearest to the station at the time of use. Mean frequency of use per student was 5.41 (range: 2-10). Distance between the student's seat and the dispenser was the only variable significantly associated with adherence. Student satisfaction with the system was assessed by a self-administered survey at the end of the study. Most students reported that the system increased their motivation to perform hand hygiene as indicated. Conclusion. The RFID-enabled hand hygiene system and benchmarking reports with performance incentives was feasible, reliable, and affordable. Future studies should record video to monitor adherence to the WHO 8-step technique.
Self-Monitoring of Attained Subgoals in Private Study.
ERIC Educational Resources Information Center
Morgan, Mark
1985-01-01
Three conditions of self-monitoring of private study were compared for their effects on academic performance and intrinsic motivation. In end-of-year examinations, a group who self-monitored subgoals outperformed groups who self-monitored either time of study or distal goals on the target course of the investigation. (Author/LMO)
NASA Astrophysics Data System (ADS)
Caratelli, A.; Bonacini, S.; Kloukinas, K.; Marchioro, A.; Moreira, P.; De Oliveira, R.; Paillard, C.
2015-03-01
The future upgrades of the LHC experiments will increase the beam luminosity, leading to a corresponding growth in the amount of data to be treated by the data acquisition systems. To address these needs, the GBT (Giga-Bit Transceiver optical link [1,2]) architecture was developed to provide the simultaneous transfer of readout data, timing and trigger signals as well as slow control and monitoring data. The GBT-SCA ASIC, part of the GBT chip-set, has the purpose of distributing control and monitoring signals to the on-detector front-end electronics and performing monitoring operations on detector environmental parameters. In order to meet the requirements of the different front-end ASICs used in the experiments, it provides various user-configurable interfaces capable of performing simultaneous operations. It is designed employing radiation-tolerant design techniques to ensure robustness against SEUs and TID radiation effects and is implemented in a commercial 130 nm CMOS technology. This work presents the GBT-SCA architecture, the ASIC interfaces, the data transfer protocol, and its integration with the GBT optical link.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakamura, Katsumasa; Shioyama, Yoshiyuki; Nomoto, Satoru
2007-05-01
Purpose: The voluntary breath-hold (BH) technique is a simple method to control the respiration-related motion of a tumor during irradiation. However, the abdominal and chest wall position may not be accurately reproduced using the BH technique. The purpose of this study was to examine whether visual feedback can reduce the fluctuation in wall motion during BH using a new respiratory monitoring device. Methods and Materials: We developed a laser-based BH monitoring and visual feedback system. For this study, five healthy volunteers were enrolled. The volunteers, practicing abdominal breathing, performed shallow end-expiration BH (SEBH), shallow end-inspiration BH (SIBH), and deep end-inspiration BH (DIBH) with or without visual feedback. The abdominal and chest wall positions were measured at 80-ms intervals during BHs. Results: The fluctuation in the chest wall position was smaller than that of the abdominal wall position. The reproducibility of the wall position was improved by visual feedback. With a monitoring device, visual feedback reduced the mean deviation of the abdominal wall from 2.1 ± 1.3 mm to 1.5 ± 0.5 mm, 2.5 ± 1.9 mm to 1.1 ± 0.4 mm, and 6.6 ± 2.4 mm to 2.6 ± 1.4 mm in SEBH, SIBH, and DIBH, respectively. Conclusions: Volunteers can perform the BH maneuver in a highly reproducible fashion when informed about the position of the wall, although in the case of DIBH, the deviation in the wall position remained substantial.
Methods for accurate cold-chain temperature monitoring using digital data-logger thermometers
NASA Astrophysics Data System (ADS)
Chojnacky, M. J.; Miller, W. M.; Strouse, G. F.
2013-09-01
Complete and accurate records of vaccine temperature history are vital to preserving drug potency and patient safety. However, previously published vaccine storage and handling guidelines have failed to indicate a need for continuous temperature monitoring in vaccine storage refrigerators. We evaluated the performance of seven digital data logger models as candidates for continuous temperature monitoring of refrigerated vaccines, based on the following criteria: out-of-box performance and compliance with manufacturer accuracy specifications over the range of use; measurement stability over extended, continuous use; proper setup in a vaccine storage refrigerator so that measurements reflect liquid vaccine temperatures; and practical methods for end-user validation and establishing metrological traceability. Data loggers were tested using ice melting point checks and by comparison to calibrated thermocouples to characterize performance over 0 °C to 10 °C. We also monitored logger performance in a study designed to replicate the range of vaccine storage and environmental conditions encountered at provider offices. Based on the results of this study, the Centers for Disease Control released new guidelines on proper methods for storage, handling, and temperature monitoring of vaccines for participants in its federally-funded Vaccines for Children Program. Improved temperature monitoring practices will ultimately decrease waste from damaged vaccines, improve consumer confidence, and increase effective inoculation rates.
End-to-End Concurrent Multipath Transfer Using Transport Layer Multihoming
2006-07-01
[Report documentation page and acknowledgments; only fragments are recoverable: the performing organization is a department in Newark, DE 19716, and the work concerns an SCTP implementation written for the BSD family of operating systems, funded by Cisco Systems, Inc.]
QoS-aware health monitoring system using cloud-based WBANs.
Almashaqbeh, Ghada; Hayajneh, Thaier; Vasilakos, Athanasios V; Mohd, Bassam J
2014-10-01
Wireless Body Area Networks (WBANs) are amongst the best options for remote health monitoring. However, as standalone systems WBANs have many limitations due to the large amount of processed data, the mobility of monitored users, and the network coverage area. Integrating WBANs with cloud computing provides effective solutions to these problems and improves the performance of WBAN-based systems. Accordingly, in this paper we propose a cloud-based real-time remote health monitoring system for tracking the health status of non-hospitalized patients while they practice their daily activities. Compared with existing cloud-based WBAN frameworks, we divide the cloud into a local one, which includes the monitored users and local medical staff, and a global one that includes the outside world. The performance of the proposed framework is optimized by reducing congestion, interference, and data delivery delay while supporting users' mobility. Several novel techniques and algorithms are proposed to accomplish our objective. First, the concept of data classification and aggregation is utilized to avoid clogging the network with unnecessary data traffic. Second, a dynamic channel assignment policy is developed to distribute the WBANs associated with the users over the available frequency channels to manage interference. Third, a delay-aware routing metric is proposed to be used by the local cloud in its multi-hop communication to speed up the reporting process of the health-related data. Fourth, the delay-aware metric is further utilized by the association protocols used by the WBANs to connect with the local cloud. Finally, the system with all the proposed techniques and algorithms is evaluated using extensive ns-2 simulations. The simulation results show superior performance of the proposed architecture in optimizing the end-to-end delay, handling increased interference levels, maximizing the network capacity, and tracking users' mobility.
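The paper's exact routing metric is not given in the abstract; the sketch below only illustrates the general idea of a delay-aware next-hop choice in a multi-hop local cloud. The per-hop delay model and the link parameters are invented.

```python
# Illustrative delay-aware next-hop selection for a multi-hop local cloud.
# Each candidate is scored by an assumed expected per-hop delay: queuing +
# transmission with an ETX-style retransmission penalty. Numbers are invented.

def expected_delay(queue_len, service_ms, tx_ms, delivery_ratio):
    etx = 1.0 / max(delivery_ratio, 1e-6)    # expected number of transmissions
    return queue_len * service_ms + etx * tx_ms

candidates = {
    "relay_A": expected_delay(queue_len=4, service_ms=2.0, tx_ms=5.0, delivery_ratio=0.95),
    "relay_B": expected_delay(queue_len=1, service_ms=2.0, tx_ms=8.0, delivery_ratio=0.80),
}
next_hop = min(candidates, key=candidates.get)
print(next_hop, candidates)
```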
Feng, Chuan; Rozenblit, Jerzy W; Hamilton, Allan J
2010-11-01
Surgeons performing laparoscopic surgery have strong biases regarding the quality and nature of the laparoscopic video monitor display. In a comparative study, we used a unique computerized sensing and analysis system to evaluate the various types of monitors employed in laparoscopic surgery. We compared the impact of different types of monitor displays on an individual's performance of a laparoscopic training task which required the subject to move the instrument to a set of targets. Participants (varying from no laparoscopic experience to board-certified surgeons) were asked to perform the assigned task while using all three display systems, which were randomly assigned: a conventional laparoscopic monitor system (2D), a high-definition monitor system (HD), and a stereoscopic display (3D). The effects of monitor system on various performance parameters (total time consumed to finish the task, average speed, and movement economy) were analyzed by computer. Each of the subjects filled out a subjective questionnaire at the end of their training session. A total of 27 participants completed our study. Performance with the HD monitor was significantly slower than with either the 3D or 2D monitor (p < 0.0001). Movement economy with the HD monitor was significantly reduced compared with the 3D (p < 0.0004) or 2D (p < 0.0001) monitor. In terms of average time required to complete the task, performance with the 3D monitor was significantly faster than with the HD (p < 0.0001) or 2D (p < 0.0086) monitor. However, the HD system was the overwhelming favorite according to subjective evaluation. Computerized sensing and analysis is capable of quantitatively assessing the seemingly minor effect of monitor display on surgical training performance. The study demonstrates that, while users expressed a decided preference for HD systems, actual quantitative analysis indicates that HD monitors offer no statistically significant advantage and may even worsen performance compared with standard 2D or 3D laparoscopic monitors.
de Paula Simola, Rauno Álvaro; Raeder, Christian; Wiewelhove, Thimo; Kellmann, Michael; Meyer, Tim; Pfeiffer, Mark; Ferrauti, Alexander
2016-10-01
The study investigates whether tensiomyography (TMG) is sensitive enough to differentiate between strength and endurance athletes, and to monitor fatigue after either one week of intensive strength (ST) or endurance (END) training. Fourteen strength (24.1 ± 2.0 years) and eleven endurance athletes (25.5 ± 4.8 years) performed an intensive training period of 6 days of ST or END, respectively. ST and END groups completed specific performance tests as well as TMG measurements of maximal radial deformation of the muscle belly (Dm), deformation time between 10% and 90% Dm (Tc), and rate of deformation development until 10% Dm (V10) and 90% Dm (V90) before (baseline), after the training period (post1), and after 72 h of recovery (post2). Specific performance of both groups decreased from baseline to post1 (P<0.05) and returned to baseline values at post2 (P<0.05). The ST group showed higher countermovement jump (P<0.05) and shorter Tc (P<0.05) at baseline. After training, Dm, V10, and V90 were reduced in the ST group (P<0.05), while TMG changes were less pronounced in the END group. TMG could be a useful tool to differentiate between strength and endurance athletes, and to monitor fatigue and recovery, especially in strength training. Copyright © 2016 Elsevier Ltd. All rights reserved.
Efficient heart beat detection using embedded system electronics
NASA Astrophysics Data System (ADS)
Ramasamy, Mouli; Oh, Sechang; Varadan, Vijay K.
2014-04-01
The present-day bio-technical field concentrates on developing various types of innovative ambulatory and wearable devices to monitor several bio-physical, physio-pathological, bio-electrical and bio-potential factors to assess a human body's health condition without intruding on quotidian activities. One of the most important aspects of this evolving technology is monitoring heart beat rate and the electrocardiogram (ECG), from which many other subsidiary results can be derived. Conventionally, such devices and systems consume a lot of power since the acquired signals are always processed on the receiver end. Because of this back-end processing, the unprocessed raw data is transmitted, resulting in greater use of power, memory and processing time. This paper proposes a technique in which the acquired signals are processed by a microcontroller in the front end of the module and only the processed signal is then transmitted wirelessly to the display unit. Therefore, power consumption is considerably reduced and clearer data analysis is performed within the module. This also avoids the need for the user to be educated about usage of the device and signal/system analysis, since only the number of heart beats will be displayed at the user end. Additionally, the proposed concept also addresses other disadvantages such as obtrusiveness, high power consumption and size. To demonstrate these factors, a commercial controller board was used to extend the monitoring method by using saved ECG data from a computer.
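A minimal threshold-based beat counter of the kind that could run on the front-end microcontroller so that only the rate, not the raw ECG, is transmitted; the sampling rate, threshold, and refractory period are assumptions, and the test signal is a synthetic 1.2 Hz tone rather than real ECG.

```python
import math

# Minimal front-end beat counting so only the rate (not raw ECG) is sent.
# Sampling rate, threshold and refractory period are illustrative assumptions.
FS = 250                      # samples per second (assumed)
THRESHOLD = 0.6               # normalized amplitude threshold (assumed)
REFRACTORY = int(0.25 * FS)   # ignore re-triggers for 250 ms after a beat

def count_beats(samples):
    beats, last_beat = 0, -REFRACTORY
    for i, x in enumerate(samples):
        if x > THRESHOLD and (i - last_beat) >= REFRACTORY:
            beats += 1
            last_beat = i
    return beats

def heart_rate_bpm(samples):
    return count_beats(samples) * 60.0 * FS / len(samples)

# Synthetic test tone at 1.2 Hz, i.e. roughly 72 "beats" per minute.
ecg = [math.sin(2 * math.pi * 1.2 * n / FS) for n in range(10 * FS)]
print(round(heart_rate_bpm(ecg)))
```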
End-tidal carbon dioxide monitoring stabilized hemodynamic changes during ECT.
Saito, Shigeru; Kadoi, Yuji; Nihishara, Fumio; Aso, Chizu; Goto, Fumio
2003-03-01
Accumulation of carbon dioxide (CO2) can disturb systemic and cerebral hemodynamics in patients receiving electroconvulsive therapy (ECT). The purpose of this study was to identify the effects of end-tidal CO2 monitoring on hemodynamic changes in patients who received ECT under propofol anesthesia. ECT was prescribed to 40 patients under propofol anesthesia. Ventilation was assisted using a face mask and 100% oxygen, with or without end-tidal CO2 monitoring. Heart rate was significantly increased in patients without end-tidal CO2 monitoring at 1 to 5 minutes after electrical stimulation (p < 0.01). Mean arterial blood pressure and middle cerebral artery blood flow velocity in the group without end-tidal CO2 monitoring were significantly larger than the values in the group with the monitor at 1 to 5 minutes after electrical stimulation. Arterial CO2 tension in the group without end-tidal CO2 monitoring was larger than the value in the group with the monitoring at 1 minute (45 ± 5 mm Hg with the monitor and 56 ± 8 without the monitor) and 5 minutes (37 ± 4 mm Hg with the monitor and 51 ± 8 without the monitor) after electrical stimulation (p < 0.01). Application of end-tidal CO2 monitoring is considered beneficial for safe and effective anesthesia management of patients undergoing ECT, especially patients with an intracranial disorder or ischemic heart disease.
High-speed railway real-time localization auxiliary method based on deep neural network
NASA Astrophysics Data System (ADS)
Chen, Dongjie; Zhang, Wensheng; Yang, Yang
2017-11-01
High-speed railway intelligent monitoring and management systems combine schedule integration, geographic information, location services, and data mining technology to integrate time and space data. Assistant localization is a significant submodule of the intelligent monitoring system. In practical applications, the general approach is to capture image sequences of the components using a high-definition camera and to apply digital image processing, target detection, tracking, and even behavior analysis methods. In this paper, we present an end-to-end character recognition method, based on a deep CNN called YOLO-toc, for high-speed railway pillar plate numbers. Different from other deep CNNs, YOLO-toc is an end-to-end multi-target detection framework; furthermore, it exhibits state-of-the-art performance in real-time detection, with nearly 50 fps achieved on a GPU (GTX 960). Finally, we realize a real-time, high-accuracy pillar plate number recognition system and integrate natural scene OCR into a dedicated classification YOLO-toc model.
Evaluation of the ET2000 guardrail end treatment.
DOT National Transportation Integrated Search
2004-01-01
The objectives of this study were to monitor and report the performance of the ET2000 guardrail end treatment design in traffic crashes. The involved vehicle was inspected when available. Data for a total of 135 collisions involving the ET2000 were...
End-user perspective of low-cost sensors for outdoor air pollution monitoring.
Rai, Aakash C; Kumar, Prashant; Pilla, Francesco; Skouloudis, Andreas N; Di Sabatino, Silvana; Ratti, Carlo; Yasar, Ansar; Rickerby, David
2017-12-31
Low-cost sensor technology can potentially revolutionise the area of air pollution monitoring by providing high-density spatiotemporal pollution data. Such data can be utilised for supplementing traditional pollution monitoring, improving exposure estimates, and raising community awareness about air pollution. However, data quality remains a major concern that hinders the widespread adoption of low-cost sensor technology. Unreliable data may mislead unsuspecting users and potentially lead to alarming consequences such as reporting acceptable air pollutant levels when they are above the limits deemed safe for human health. This article provides scientific guidance to the end-users for effectively deploying low-cost sensors for monitoring air pollution and people's exposure, while ensuring reasonable data quality. We review the performance characteristics of several low-cost particle and gas monitoring sensors and provide recommendations to end-users for making proper sensor selection by summarizing the capabilities and limitations of such sensors. The challenges, best practices, and future outlook for effectively deploying low-cost sensors, and maintaining data quality are also discussed. For data quality assurance, a two-stage sensor calibration process is recommended, which includes laboratory calibration under controlled conditions by the manufacturer supplemented with routine calibration checks performed by the end-user under final deployment conditions. For large sensor networks where routine calibration checks are impractical, statistical techniques for data quality assurance should be utilised. Further advancements and adoption of sophisticated mathematical and statistical techniques for sensor calibration, fault detection, and data quality assurance can indeed help to realise the promised benefits of a low-cost air pollution sensor network. Copyright © 2017 Elsevier B.V. All rights reserved.
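A sketch of the recommended two-stage calibration: a laboratory linear fit against a reference instrument, followed by a routine co-location check under deployment conditions. numpy.polyfit is standard; the pollutant, readings, and thresholds are invented for illustration.

```python
import numpy as np

# Stage 1 (laboratory): fit sensor output against a reference instrument.
reference = np.array([10.0, 25.0, 50.0, 75.0, 100.0])   # e.g. PM2.5 in ug/m3 (invented)
sensor    = np.array([14.2, 31.0, 58.9, 83.7, 109.5])   # raw sensor readings (invented)
slope, intercept = np.polyfit(sensor, reference, deg=1)

def calibrate(raw):
    return slope * raw + intercept

# Stage 2 (deployment): routine check against a trusted monitor; if the
# residual bias drifts, flag or adjust the calibration (values are invented).
field_raw, field_ref = 42.0, 36.5
bias = calibrate(field_raw) - field_ref
print(f"lab fit: y = {slope:.3f}x + {intercept:.2f}; field bias = {bias:.2f}")
```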
40 CFR 63.497 - Back-end process provisions-monitoring provisions for control and recovery devices.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 9 2010-07-01 2010-07-01 false Back-end process provisions-monitoring... Polymers and Resins § 63.497 Back-end process provisions—monitoring provisions for control and recovery devices. (a) An owner or operator complying with the residual organic HAP limitations in § 63.494(a) using...
Tevatron beam position monitor upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolbers, Stephen; Banerjee, B.; Barker, B.
2005-05-01
The Tevatron Beam Position Monitor (BPM) readout electronics and software have been upgraded to improve measurement precision, functionality and reliability. The original system, designed and built in the early 1980s, became inadequate for current and future operations of the Tevatron. The upgraded system consists of 960 channels of new electronics to process analog signals from 240 BPMs, new front-end software, new online and controls software, and modified applications to take advantage of the improved measurements and support the new functionality. The new system reads signals from both ends of the existing directional stripline pickups to provide simultaneous proton and antiproton position measurements. Measurements using the new system are presented that demonstrate its improved resolution and overall performance.
Performance measurement: A tool for program control
NASA Technical Reports Server (NTRS)
Abell, Nancy
1994-01-01
Performance measurement is a management tool for planning, monitoring, and controlling the key aspects of program and project management--cost, schedule, and technical requirements. It is a means (concept and approach) to a desired end (effective program planning and control). To reach the desired end, however, performance measurement must be applied and used appropriately, with full knowledge and recognition of its power and of its limitations--what it can and cannot do for the project manager. What is the potential of this management tool? What does performance measurement do that a traditional plan vs. actual technique cannot do? Performance measurement provides an improvement over the customary comparison of how much money was spent (actual cost) vs. how much was planned to be spent based on a schedule of activities (work planned). This commonly used plan vs. actual comparison does not allow one to know from the numerical data whether the actual cost incurred was for the work that was intended to be done.
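The abstract does not name earned value explicitly, but earned value management is the standard way to quantify what is described: crediting the budgeted cost of work actually performed and deriving cost and schedule indices from it. The figures below are invented for illustration.

```python
# Invented figures: planned value (budgeted cost of work scheduled), earned
# value (budgeted cost of work performed), and actual cost to date.
planned_value = 1_000_000.0
earned_value = 800_000.0
actual_cost = 950_000.0

cpi = earned_value / actual_cost     # cost performance index (<1: over cost)
spi = earned_value / planned_value   # schedule performance index (<1: behind)
cost_variance = earned_value - actual_cost
schedule_variance = earned_value - planned_value
print(f"CPI={cpi:.2f}, SPI={spi:.2f}, CV={cost_variance:,.0f}, SV={schedule_variance:,.0f}")
```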
Coordinated Fault Tolerance for High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dongarra, Jack; Bosilca, George; et al.
2013-04-08
Our work to meet our goal of end-to-end fault tolerance has focused on two areas: (1) improving fault tolerance in various software currently available and widely used throughout the HEC domain and (2) using fault information exchange and coordination to achieve holistic, system-wide fault tolerance and understanding how to design and implement interfaces for integrating fault tolerance features across multiple layers of the software stack—from the application, math libraries, and programming language runtime to other common system software such as job schedulers, resource managers, and monitoring tools.
Performance Monitoring of Residential Hot Water Distribution Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Anna; Lanzisera, Steven; Lutz, Jim
Current water distribution systems are designed such that users need to run the water for some time to achieve the desired temperature, wasting energy and water in the process. We developed a wireless sensor network for large-scale, long time-series monitoring of residential water end use. Our system consists of flow meters connected to wireless motes transmitting data to a central manager mote, which in turn posts data to our server via the internet. This project also demonstrates a reliable and flexible data collection system that could be configured for various other forms of end-use metering in buildings. The purpose of this study was to determine water and energy use and waste in hot water distribution systems in California residences. We installed meters at every end-use point and the water heater in 20 homes and collected 1 s flow and temperature data over an 8 month period. For typical shower and dishwasher events, approximately half the energy is wasted. This relatively low efficiency highlights the importance of further examining the energy and water waste in hot water distribution systems.
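A sketch of how waste might be estimated from 1 s flow and temperature samples at a fixture: water drawn while the delivered temperature is still below a usefulness threshold counts as waste. The threshold, cold-water temperature, and constants are assumptions, not the study's method.

```python
# Estimate wasted water and energy from 1 s samples at one fixture.
# A draw sample counts as waste while delivered temperature is below an
# assumed usefulness threshold (40 C). Water: ~1 kg/L, ~4186 J/(kg*K).
RHO, CP, DT, T_USEFUL, T_COLD = 1.0, 4186.0, 1.0, 40.0, 15.0

def waste(flow_lps, temp_c):
    wasted_liters = wasted_joules = 0.0
    for q, t in zip(flow_lps, temp_c):
        if q > 0 and t < T_USEFUL:
            wasted_liters += q * DT
            wasted_joules += q * DT * RHO * CP * max(t - T_COLD, 0.0)
    return wasted_liters, wasted_joules

# Synthetic draw: 30 s of warming water (20..49 C), then 60 s at 50 C.
liters, joules = waste([0.1] * 90, [20 + i for i in range(30)] + [50] * 60)
print(f"{liters:.1f} L and {joules / 1e3:.0f} kJ wasted before hot water arrived")
```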
Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calyam, Prasad
2014-09-15
The next generation of high-performance networks being developed in DOE communities is critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide "network awareness" to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.
[The eye of technology and the well being of women and men in Icelandic work places].
Rafnsdóttir, Guobjörg Linda; Tómasson, Kristinn; Guomundsdóttir, Margrét Lilja
2005-11-01
The study assessed the association between working under surveillance and electronic performance monitoring and well-being among women and men in six Icelandic workplaces. In the period from February to April 2003, a questionnaire based on the General Nordic Questionnaire for Psychological and Social Factors at Work was delivered to 1369 employees in six companies where different methods of electronic performance monitoring (EPM) are used. The data were analyzed using odds ratios and logistic regression. The response rate was 72%, with close to equal participation of men and women. The employees who were working under EPM were more likely to have a poor psychosocial work environment, to have experienced significant stress recently, to be mentally exhausted at the end of the workday, to have significant sleep difficulties, and to be dissatisfied with their job. The development of information and communication technology that allows employers and managers to monitor and collect different electronic data about the work process and productivity of workers makes it important to follow the health condition of those who work under electronic performance monitoring.
2015-10-01
Arterial oxygen saturation was monitored using a finger pulse oximeter and end-tidal CO2 (ETCO2) was collected from a nasal cannula (Cardiocap/5). Keywords: trauma, coagulation, central venous pressure, stroke volume, pulse pressure. Reference: Johnson BD, Curry TB, Convertino VA, & Joyner MJ. The association between pulse pressure and stroke volume during lower body negative pressure and ... J Appl Physiol 2014, PMID 24876357.
Electrochemistry-based Battery Modeling for Prognostics
NASA Technical Reports Server (NTRS)
Daigle, Matthew J.; Kulkarni, Chetan Shrikant
2013-01-01
Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is crucial to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.
Monitoring Method of Cutting Force by Using Additional Spindle Sensors
NASA Astrophysics Data System (ADS)
Sarhan, Ahmed Aly Diaa; Matsubara, Atsushi; Sugihara, Motoyuki; Saraie, Hidenori; Ibaraki, Soichi; Kakino, Yoshiaki
This paper describes a monitoring method of cutting forces for the end milling process by using displacement sensors. Four eddy-current displacement sensors are installed on the spindle housing of a machining center so that they can detect the radial motion of the rotating spindle. Thermocouples are also attached to the spindle structure in order to examine the thermal effect in the displacement sensing. The change in the spindle stiffness due to the spindle temperature and the speed is investigated as well. Finally, the estimation performance of cutting forces using the spindle displacement sensors is experimentally investigated by machining tests on carbon steel in end milling operations under different cutting conditions. It is found that the monitoring errors are attributable to the thermal displacement of the spindle, the time lag of the sensing system, and the modeling error of the spindle stiffness. It is also shown that the root mean square errors between estimated and measured amplitudes of cutting forces are reduced to less than 20 N with proper selection of the linear stiffness.
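The estimation idea reads as a linear stiffness model: radial cutting force is approximated as spindle stiffness times the measured spindle displacement, and the fit is judged by the root mean square error against a reference force signal. A minimal sketch with invented stiffness values and signals is shown below; it is not the paper's calibration.

```python
# Minimal sketch: cutting force estimated as stiffness * measured displacement,
# compared to a reference (dynamometer-like) trace via RMS error. All numbers
# are made-up illustrations.
import numpy as np

def estimate_force(displacement_um, stiffness_n_per_um):
    return stiffness_n_per_um * displacement_um

def rms_error(estimated_n, measured_n):
    return float(np.sqrt(np.mean((estimated_n - measured_n) ** 2)))

t = np.linspace(0, 0.1, 1000)                          # 0.1 s of samples
measured = 150 * np.abs(np.sin(2 * np.pi * 100 * t))   # hypothetical reference force (N)
displacement = measured / 55.0                         # "true" stiffness of 55 N/um
estimated = estimate_force(displacement, stiffness_n_per_um=50.0)  # slightly wrong stiffness
print(f"RMS error: {rms_error(estimated, measured):.1f} N")
```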
Cirillo, V; Zito Marinosci, G; De Robertis, E; Iacono, C; Romano, G M; Desantis, O; Piazza, O; Servillo, G; Tufano, R
2015-11-01
The recently introduced Navigator® (GE Healthcare, Helsinki, Finland) and SmartPilot® View (Dräger Medical, Lübeck, Germany) show the concentrations and predicted effects of combined anesthetic drugs, and should facilitate their more precise titration. Our aim was to evaluate whether Navigator® or SmartPilot® View guided anesthesia was associated with good quality of analgesia and depth of hypnosis, and whether it may reduce anesthetic requirements. We performed a prospective non-randomized study. Sixty ASA I-II patients undergoing balanced general anesthesia for abdominal and plastic surgery were enrolled. Patients were divided into 4 groups. Group 1 (N. 15) and group 3 (N. 15) were cases in whom anesthesia was performed with standard monitoring plus the aid of Navigator® (Nav) or SmartPilot® View (SPV) display. Group 2 (N. 15) and group 4 (N. 15) were controls in whom anesthesia was performed with standard monitoring (heart rate, NIBP, SpO2, end-tidal CO2, end-expired sevoflurane concentration, train of four, Bispectral Index [Aspect Medical Systems, Natick, MA, USA] or Entropy [GE Healthcare]). Patients' vital parameters and end-expired sevoflurane concentration were recorded during anesthesia. All patients recovered uneventfully and showed hemodynamic stability. End-tidal sevoflurane concentration values [median (min-max)], during maintenance of anesthesia, were significantly (P<0.05) lower in the SPV [1.1% (0.8-1.5)] and Nav [1% (0.8-1.8)] groups compared to the SPV-control group [1.5% (1-2.5)] and Nav-control group [1.5% (0.8-2)]. BIS and entropy values were respectively higher in the SPV group [53 (46-57)] compared to the control group [43 (37-51)] (P<0.05) and Nav group [53 (43-60)] compared to the control group [41 (35-51)] (P<0.05). No significant differences in remifentanil dosing were observed in the four groups. Navigator® and SmartPilot® View may be of clinical use in monitoring adequacy of anesthesia. Both displays can optimize the administration and monitoring of anesthetic drugs during general anesthesia and may reduce the consumption of volatile anesthetic agents.
Mainstream end-tidal carbon dioxide monitoring in the neonatal intensive care unit.
Rozycki, H J; Sysyn, G D; Marshall, M K; Malloy, R; Wiswell, T E
1998-04-01
Continuous noninvasive monitoring of arterial carbon dioxide (CO2) in neonatal intensive care unit (NICU) patients would help clinicians avoid complications of hypocarbia and hypercarbia. End-tidal CO2 monitoring has not been used in this population to date, but recent technical advances and the introduction of surfactant therapy, which improves ventilation-perfusion matching, might improve the clinical utility of end-tidal monitoring. To determine the accuracy and precision of end-tidal CO2 monitoring in NICU patients. Nonrandomized recording of simultaneous end-tidal and arterial CO2 pairs. Two university NICUs. Forty-five newborn infants receiving mechanical ventilation who had indwelling arterial access, and a predefined subsample of infants who were <1000 g birth weight, <8 days of age, and who received surfactant therapy (extremely low birth weight -ELBW- <8). The correlation coefficient, degree of bias, and 95% confidence interval were determined for both the overall population and the ELBW <8 subgroup. Those factors which significantly influenced the bias were identified. The ability of the end-tidal monitor to alert the clinician to instances of hypocarbia or hypercarbia was determined. There were 411 end-tidal/arterial pairs analyzed from 45 patients. The correlation coefficient was 0.833 and the bias was -6.9 mm Hg (95% confidence interval, +/-11.5 mm Hg). The results did not differ markedly in the ELBW <8 infants. Measures of the degree of lung disease, the ventilation index and the oxygenation index, had small influences on the degree of bias. This type of capnometry identified 91% of the instances when the arterial CO2 pressure was between 34 and 54 mm Hg using an end-tidal range of 29 to 45 mm Hg. End-tidal values outside this range had a 63% accuracy in predicting hypocarbia or hypercarbia. End-tidal CO2 monitoring in NICU patients is as accurate as capillary or transcutaneous monitoring but less precise than the latter. It may be useful for trending or for screening patients for abnormal arterial CO2 values.
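The bias and interval reported above follow the usual Bland-Altman style of summary: the bias is the mean of the end-tidal minus arterial differences, and the agreement band is roughly the bias plus or minus 1.96 standard deviations of those differences. The sketch below shows that computation on invented paired values; it does not reproduce the study data.

```python
# Minimal sketch of a Bland-Altman summary for paired end-tidal vs arterial
# CO2 readings. The paired values are invented for illustration only.
import numpy as np

def bland_altman(etco2_mmhg, paco2_mmhg):
    diffs = np.asarray(etco2_mmhg, float) - np.asarray(paco2_mmhg, float)
    bias = diffs.mean()
    half_width = 1.96 * diffs.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

etco2 = [35, 41, 30, 44, 38, 29, 47]
paco2 = [42, 47, 36, 50, 46, 35, 55]
bias, limits = bland_altman(etco2, paco2)
print(f"bias = {bias:.1f} mm Hg, 95% limits of agreement = {limits[0]:.1f} to {limits[1]:.1f}")
```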
Corrosion probe. Innovative technology summary report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Over 253 million liters of high-level waste (HLW) generated from plutonium production is stored in mild steel tanks at the Department of Energy (DOE) Hanford Site. Corrosion monitoring of double-shell storage tanks (DSTs) is currently performed at Hanford using a combination of process knowledge and tank waste sampling and analysis. Available technologies for corrosion monitoring have progressed to a point where it is feasible to monitor and control corrosion by on-line monitoring of the corrosion process and direct addition of corrosion inhibitors. The electrochemical noise (EN) technique deploys EN-based corrosion monitoring probes into storage tanks. This system is specifically designed to measure corrosion rates and detect changes in waste chemistry that trigger the onset of pitting and cracking. These on-line probes can determine whether additional corrosion inhibitor is required and, if so, provide information on an effective end point to the corrosion inhibitor addition procedure. This report describes the technology, its performance, its application, costs, regulatory and policy issues, and lessons learned.
Extending BPM Environments of Your Choice with Performance Related Decision Support
NASA Astrophysics Data System (ADS)
Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter
What-if Simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques which enable automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
Kumar Thakur, Rupak; Anoop, C S
2015-08-01
Cardio-vascular health monitoring has gained considerable attention in recent years. The principle of non-contact capacitive electrocardiography (ECG) and its applicability as a valuable, low-cost, easy-to-use scheme for cardio-vascular health monitoring have been demonstrated in some recent research papers. In this paper, we develop a complete non-contact ECG system using a suitable front-end electronic circuit and a heart-rate (HR) measurement unit using an enhanced Fourier interpolation technique. The front-end electronic circuit is realized using low-cost, readily available components and the proposed HR measurement unit is designed to achieve fairly accurate results. The entire system has been extensively tested to verify its efficacy and test results show that the developed system can estimate HR with an accuracy of ±2 beats. Detailed tests have been conducted to validate the performance of the system for different cloth thicknesses of the subject. Some basic tests illustrating the application of the proposed system for heart-rate variability estimation have been conducted and the results reported. The developed system can be used as a portable, reliable, long-term cardiac health monitoring device and can be extended to human drowsiness detection.
Implementation experience of a patient monitoring solution based on end-to-end standards.
Martinez, I; Fernandez, J; Galarraga, M; Serrano, L; de Toledo, P; Escayola, J; Jimenez-Fernandez, S; Led, S; Martinez-Espronceda, M; Garcia, J
2007-01-01
This paper presents a proof-of-concept design of a patient monitoring solution for Intensive Care Unit (ICU). It is end-to-end standards-based, using ISO/IEEE 11073 (X73) in the bedside environment and EN13606 to communicate the information to an Electronic Healthcare Record (EHR) server. At the bedside end a plug-and-play sensor network is implemented, which communicates with a gateway that collects the medical information and sends it to a monitoring server. At this point the server transforms the data frame into an EN13606 extract, to be stored on the EHR server. The presented system has been tested in a laboratory environment to demonstrate the feasibility of this end-to-end standards-based solution.
Description of the SSF PMAD DC testbed control system data acquisition function
NASA Technical Reports Server (NTRS)
Baez, Anastacio N.; Mackin, Michael; Wright, Theodore
1992-01-01
The NASA LeRC in Cleveland, Ohio, has completed the development and integration of a Power Management and Distribution (PMAD) DC Testbed. This testbed is a reduced scale representation of the end to end, sources to loads, Space Station Freedom Electrical Power System (SSF EPS). This unique facility is being used to demonstrate DC power generation and distribution, power management and control, and system operation techniques considered to be prime candidates for the Space Station Freedom. A key capability of the testbed is its ability to be configured to address system level issues in support of critical SSF program design milestones. Electrical power system control and operation issues like source control, source regulation, system fault protection, end-to-end system stability, health monitoring, resource allocation, and resource management are being evaluated in the testbed. The SSF EPS control functional allocation between on-board computers and ground based systems is evolving. Initially, ground based systems will perform the bulk of power system control and operation. The EPS control system is required to continuously monitor and determine the current state of the power system. The DC Testbed Control System consists of standard controllers arranged in a hierarchical and distributed architecture. These controllers provide all the monitoring and control functions for the DC Testbed Electrical Power System. Higher level controllers include the Power Management Controller, Load Management Controller, Operator Interface System, and a network of computer systems that perform some of the SSF Ground based Control Center Operation. The lower level controllers include Main Bus Switch Controllers and Photovoltaic Controllers. Power system status information is periodically provided to the higher level controllers to perform system control and operation. The data acquisition function of the control system is distributed among the various levels of the hierarchy. Data requirements are dictated by the control system algorithms being implemented at each level. A functional description of the various levels of the testbed control system architecture, the data acquisition function, and the status of its implementation is presented.
Testing the Feasibility of a Low-Cost Network Performance Measurement Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chevalier, Scott; Schopf, Jennifer M.; Miller, Kenneth
2016-07-01
Today's science collaborations depend on reliable, high-performance networks, but monitoring the end-to-end performance of a network can be costly and difficult. The most accurate approaches involve using measurement equipment in many locations, which can be both expensive and difficult to manage due to immobile or complicated assets. The perfSONAR framework facilitates network measurement, making management of the tests more reasonable. Traditional deployments have used over-provisioned servers, which can be expensive to deploy and maintain. As scientific network uses proliferate, there is a desire to instrument more facets of a network to better understand trends. This work explores low-cost alternatives to assist with network measurement. Benefits include the ability to deploy more resources quickly, and reduced capital and operating expenditures. Finally, we present candidate platforms and a testing scenario that evaluated the relative merits of four types of small form factor equipment to deliver accurate performance measurements.
Tracking the NOvA Detectors' Performance
NASA Astrophysics Data System (ADS)
Psihas, Fernanda; NOvA Collaboration
2016-03-01
The NOvA experiment measures long-baseline νμ → νe oscillations in Fermilab's NuMI beam. We employ two detectors equipped with over 10 thousand sets of data-taking electronics: avalanche photodiodes and front-end boards which collect and process the scintillation signal from particle interactions within the detectors. These sets of electronics, as well as the systems which power and cool them, must be monitored and maintained at precise working conditions to ensure maximal data-taking uptime, good data quality and a lasting life for our detectors. This poster describes the automated systems used on NOvA to simultaneously monitor our data quality, diagnose hardware issues, track our performance and coordinate maintenance for the detectors.
Validation of the Natus CO-Stat End Tidal Breath Analyzer in children and adults.
Vreman, H J; Wong, R J; Harmatz, P; Fanaroff, A A; Berman, B; Stevenson, D K
1999-12-01
The performance of a point-of-care, noninvasive end tidal breath carbon monoxide analyzer (CO-Stat End Tidal Breath Analyzer, Natus Medical Inc.) that also reports end tidal carbon dioxide (ETCO2) and respiratory rate (RR) was compared to established, marketed (predicate) devices in children (n = 39) and adults (n = 48) who were normal or at risk of elevated CO excretion. Concentrations of end tidal breath CO (ETCO), room air CO, ETCO corrected for inhaled CO (ETCOc), ETCO2, and RR were measured with the CO-Stat analyzer and the data compared to those obtained from the same subjects using the Vitalograph BreathCO monitor (Vitalograph, Inc.) for ETCOc and the Pryon CO2 monitor (SC210 and SC300, Pryon Corp) for ETCO2 and RR. Adults and children were studied at three medical centers. The data were analyzed by paired t-tests and linear regression. Bias and imprecision between the CO-Stat analyzer and the predicate devices were calculated by the method of Bland and Altman. Paired t-tests performed on the three parameters measured with the CO-Stat analyzer and predicate devices showed that only the ETCOc values in the adults and the ETCO2 values in the children were significantly different (lower, p < or = 0.0001, and higher, p < or = 0.0001, respectively). The mean bias and imprecision of the CO-Stat analyzer for adult ETCOc and children ETCO2 measurements were -0.9 +/- 1.2 ppm and 0.4 +/- 0.6%, respectively. Linear regression analysis for the ETCOc results in children and adults had a high degree of correlation (r = 0.91 and 0.98, respectively). We conclude that in a clinical environment the Natus CO-Stat End Tidal Breath Analyzer performs at least as well as predicate devices for the measurements of ETCOc, ETCO2, and RR.
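A minimal sketch of the device-comparison statistics named above (a paired t-test and a linear regression between readings from the test device and a predicate device on the same subjects) is given below, using invented ETCOc values rather than the study's data.

```python
# Minimal sketch: paired t-test and linear regression comparing a test device
# against a predicate device on the same subjects. Readings are invented.
from scipy import stats

co_stat   = [1.8, 2.1, 1.5, 3.0, 2.4, 1.9, 2.7, 2.2]   # ETCOc (ppm), test device
predicate = [2.5, 2.9, 2.4, 3.8, 3.3, 2.6, 3.5, 3.1]   # ETCOc (ppm), predicate device

t_stat, p_value = stats.ttest_rel(co_stat, predicate)
slope, intercept, r, _, _ = stats.linregress(predicate, co_stat)
print(f"paired t-test p = {p_value:.4f}")
print(f"regression: co_stat = {slope:.2f} * predicate + {intercept:.2f}, r = {r:.2f}")
```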
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, T.; Curtis, O.; Stephenson, R.
As part of the NAHB Research Center Industry Partnership, Southface partnered with TaC Studios, an Atlanta-based architecture firm specializing in residential and light commercial design, on the construction of a new test home in Atlanta, GA, in the mixed humid climate. This home serves as a residence and home office for the firm's owners, as well as a demonstration of their design approach to potential and current clients. Southface believes the home demonstrates current best practices for the mixed-humid climate, including a building envelope featuring advanced air sealing details and low density spray foam insulation, glazing that exceeds ENERGY STAR requirements, and a high performance heating and cooling system. Construction quality and execution was a high priority for TaC Studios and was ensured by a third party review process. Post-construction testing showed that the project met stated goals for envelope performance, an air infiltration rate of 2.15 ACH50. The homeowners wished to further validate whole house energy savings through the project's involvement with Building America and this long-term monitoring effort. As a Building America test home, this home was evaluated to detail whole house energy use, end use loads, and the efficiency and operation of the ground source heat pump and associated systems. Given that the home includes many non-typical end use loads including a home office, pool, landscape water feature, and other luxury features not accounted for in Building America modeling tools, these end uses were separately monitored to determine their impact on overall energy consumption.
Manufacturing of Wearable Sensors for Human Health and Performance Monitoring
NASA Astrophysics Data System (ADS)
Alizadeh, Azar
2015-03-01
Continuous monitoring of physiological and biological parameters is expected to improve performance and medical outcomes by assessing overall health status and alerting for life-saving interventions. Continuous monitoring of these parameters requires wearable devices with an appropriate form factor (lightweight, comfortable, low energy consuming and even single-use) to avoid disrupting daily activities thus ensuring operation relevance and user acceptance. Many previous efforts to implement remote and wearable sensors have suffered from high cost and poor performance, as well as low clinical and end-use acceptance. New manufacturing and system level design approaches are needed to make the performance and clinical benefits of these sensors possible while satisfying challenging economic, regulatory, clinical, and user-acceptance criteria. In this talk we will review several recent design and manufacturing efforts aimed at designing and building prototype wearable sensors. We will discuss unique opportunities and challenges provided by additive manufacturing, including 3D printing, to drive innovation through new designs, faster prototyping and manufacturing, distributed networks, and new ecosystems. We will also show alternative hybrid self-assembly based integration techniques for low cost large scale manufacturing of single use wearable devices. Coauthors: Prabhjot Singh and Jeffrey Ashe.
Final Report - Cloud-Based Management Platform for Distributed, Multi-Domain Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhury, Pulak; Mukherjee, Biswanath
2017-11-03
In this Department of Energy (DOE) Small Business Innovation Research (SBIR) Phase II project final report, Ennetix presents the development of a solution for end-to-end monitoring, analysis, and visualization of network performance for distributed networks. This solution benefits enterprises of all sizes, operators of distributed and federated networks, and service providers.
Characteristics and Performance of Existing Load Disaggregation Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayhorn, Ebony T.; Sullivan, Greg P.; Butner, Ryan S.
2015-04-10
Non-intrusive load monitoring (NILM) or non-intrusive appliance load monitoring (NIALM) is an analytic approach to disaggregate building loads based on a single metering point. This advanced load monitoring and disaggregation technique has the potential to provide an alternative solution to high-priced traditional sub-metering and enable innovative approaches for energy conservation, energy efficiency, and demand response. However, since the inception of the concept in the 1980s, evaluations of these technologies have focused on reporting performance accuracy without investigating sources of inaccuracies or fully understanding and articulating the meaning of the metrics used to quantify performance. As a result, the market for, as well as advances in, these technologies have been slowly maturing. To improve the market for these NILM technologies, there has to be confidence that the deployment will lead to benefits. In reality, not every end-user and application that this technology may enable requires the highest levels of performance accuracy to produce benefits. Also, there are other important characteristics that need to be considered, which may affect the appeal of NILM products to certain market targets (i.e. residential and commercial building consumers) and the suitability for particular applications. These characteristics include the following: 1) ease of use, the level of expertise/bandwidth required to properly use the product; 2) ease of installation, the level of expertise required to install along with hardware needs that impact product cost; and 3) ability to inform decisions and actions, whether the energy outputs received by end-users (e.g. third party applications, residential users, building operators, etc.) empower decisions and actions to be taken at time frames required for certain applications. Therefore, stakeholders, researchers, and other interested parties should be kept abreast of the evolving capabilities, uses, and characteristics of NILM that make them attractive for certain building environments and different classes of end-users. The intent of this report is to raise awareness of trending NILM approaches. Additionally, three existing technologies were acquired and evaluated using the Residential Building Stock Assessment (RBSA) owner-occupied test bed operated by the Northwest Energy Efficiency Alliance (NEEA) to understand performance accuracy of current NILM products under realistic conditions. Based on this field study experience, the characteristics exhibited by the NILM products included in the assessment are also discussed in this report in terms of ease of use, ease of installation, and ability to inform decisions and actions. Results of the analysis performed to investigate the accuracy of the participating NILM products in estimating energy use of individual appliances are also presented.
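For readers unfamiliar with the basic mechanics, the sketch below shows the simplest event-based flavour of load disaggregation: step changes in whole-house power are matched against a small signature library. Commercial NILM products use far richer features and models; the signatures, tolerance, and sample data here are purely illustrative.

```python
# Toy event-based disaggregation: attribute power steps to appliances whose
# assumed signature (in watts) matches the step size within a tolerance.
SIGNATURES = {"fridge": 120, "water_heater": 4500, "dryer": 3000}  # watts (assumed)

def disaggregate(power_w, tolerance=0.15):
    """power_w: whole-house real power samples (1 Hz). Returns detected events."""
    events = []
    for i in range(1, len(power_w)):
        step = power_w[i] - power_w[i - 1]
        for name, sig in SIGNATURES.items():
            if abs(abs(step) - sig) <= tolerance * sig:
                events.append((i, name, "on" if step > 0 else "off"))
    return events

samples = [300, 300, 420, 420, 4920, 4920, 4920, 420, 420, 300]
print(disaggregate(samples))
```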
Monitoring worksite clinic performance using a cost-benefit tool.
Tao, Xuguang; Chenoweth, David; Alfriend, Amy S; Baron, David M; Kirkland, Tracie W; Scherb, Jill; Bernacki, Edward J
2009-10-01
The purpose of this study was to explore the usefulness of continuously assessing the return on investment (ROI) of worksite medical clinics as a means of evaluating clinic performance. Visit data from January 1, 2007, to December 31, 2008, were collected from all the on-site clinics operated for the Pepsi Bottling Group. An average system-wide ROI was calculated from the time of each clinic's opening and throughout the study period. A multivariate linear regression model was used to determine the association of average ROI with penetration/utilization rate and plant size. A total of 26 on-site clinics were actively running as of December 2008. The average ROI at the time of start up was 0.4, which increased to 1.2 at approximately 4 months and 1.6 at the end of the first year of operation. Overall, it seems that the cost of operating a clinic becomes equal to the cost of similar care purchased in the community (ROI = 1) at approximately 3 months after a clinic's opening and flattens out at the end of the first year. The magnitude of the ROI was closely related to the number of visits (a function of the penetration/utilization rate) and the size of the plant population served. Serial monitoring of ROIs is a useful metric in assessing on-site clinic performance and quantifying the effect of new initiatives aimed at increasing a clinic's cost effectiveness.
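Treating the ROI as the value of care that would otherwise have been purchased in the community divided by the cost of operating the on-site clinic, a small worked example of the break-even behaviour described above might look as follows. The visit counts and unit costs are invented, and the study's exact cost model is not reproduced.

```python
# Illustrative ROI metric: avoided community-care cost / clinic operating cost.
# Figures are hypothetical.
def clinic_roi(visits, avoided_cost_per_visit, operating_cost):
    return (visits * avoided_cost_per_visit) / operating_cost

# Break-even (ROI = 1) occurs when the avoided community-care cost equals the clinic cost:
print(clinic_roi(visits=400, avoided_cost_per_visit=125.0, operating_cost=50_000.0))  # -> 1.0
print(clinic_roi(visits=640, avoided_cost_per_visit=125.0, operating_cost=50_000.0))  # -> 1.6
```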
Validating Future Force Performance Measures (Army Class): End of Training Longitudinal Validation
2009-09-01
48 CFR 1852.216-77 - Award fee for end item contracts.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Contractor's performance for the entire contract will be evaluated to determine total earned award fee. No award fee or base fee will be paid to the Contractor if the final award fee evaluation is “poor... the Contractor's interim performance every 6* months to monitor Contractor performance prior to...
The Application of Biocybernetic Techniques to Enhance Pilot Performance during Tactical missions.
1979-10-01
and observe the altitude indicator ("meatball") at the end of the runway. At touchdown, the pilot must apply thrust until the arresting hook catches. [OCR fragments of a pilot task/difficulty listing follow; legible task entries include "command thrust for bolter" and "monitor meatball".]
Ethical considerations in adherence research.
Patel, Nupur U; Moore, Blake A; Craver, Rebekah F; Feldman, Steven R
2016-01-01
Poor adherence to treatment is a common cause of medical treatment failure. Studying adherence is complicated by the potential for the study environment to impact adherence behavior. Studies performed without informing patients about adherence monitoring must balance the risks of deception against the potential benefits of the knowledge to be gained. Ethically monitoring a patient's adherence to a treatment plan without full disclosure of the monitoring plan requires protecting the patient's rights and upholding the fiduciary obligations of the investigator. Adherence monitoring can utilize different levels of deception varying from stealth monitoring, debriefing after the study while informing the subject that some information had been withheld in regard to the use of adherence monitoring (withholding), informed consent that discloses some form of adherence monitoring is being used and will be disclosed at the end of the study (authorized deception), and full disclosure. Different approaches offer different benefits and potential pitfalls. The approach used must balance the risk of nondisclosure against the potential for confounding the adherence monitoring data and the potential benefits that adherence monitoring data will have for the research subjects and/or other populations. This commentary aims to define various methods of adherence monitoring and to provide a discussion of the ethical considerations that accompany the use of each method and adherence monitoring in general as it is used in clinical research.
EMODnet MedSea Checkpoint for sustainable Blue Growth
NASA Astrophysics Data System (ADS)
Moussat, Eric; Pinardi, Nadia; Manzella, Giuseppe; Blanc, Frederique
2016-04-01
The EMODNET checkpoint is a wide monitoring system assessment activity aiming to support sustainable Blue Growth at the scale of the European Sea Basins by: 1) clarifying the observation landscape of all compartments of the marine environment, including Air, Water, Seabed, Biota and Human activities, pointing to the existing national, European and international programs; 2) evaluating fitness-for-use indicators that show the accessibility and usability of observation and modeling data sets and their roles and synergies, based upon applications selected by the European Marine Environment Strategy; and 3) prioritizing the needs to optimize the overall monitoring infrastructure (in situ and satellite data collection and assembling, data management and networking, modeling and forecasting, geo-infrastructure) and releasing recommendations for evolutions to better meet the application requirements in view of sustainable Blue Growth. The assessment is designed for: institutional stakeholders, for decision making on observation and monitoring systems; data providers and producers, to know how their data, collected once for a given purpose, could fit other user needs; and end-users interested in a regional status and possible uses of existing monitoring data. Selected end-user applications are of paramount importance for: (i) the blue economy sector (offshore industries, fisheries); (ii) marine environment variability and change (eutrophication, river inputs and ocean climate change impacts); (iii) emergency management (oil spills); and (iv) preservation of natural resources and biodiversity (Marine Protected Areas). End-user applications generate innovative products based on the existing observation landscape. The fitness-for-use assessment is made by comparing the expected product specifications with the quality of the product derived from the selected data. This involves the development of checkpoint information and indicators based on data quality and metadata standards for geographic information (ISO 19157 and ISO 19115, respectively). The fitness for use of the input datasets is assessed using two categories of criteria to determine how these datasets fit the user requirements that drive users to select one data source rather than another, and to show performance and gaps of the present monitoring systems: data appropriateness (what is made available to the user?) and data availability (how is it made available to the user?). All information is stored in a GIS platform and made available through two types of interfaces: front-end interfaces with users, to present the input data used by all challenges, the innovative products generated by the challenges and the assessment indicators; and back-end interfaces to partners, to store the checkpoint descriptors of input data, the specifications to generate targeted products, and catalogue information of products with associated checkpoint indicators linked to the input data. The validation of the records is done at three levels, the technical level (GIS), the challenge level (use), and the sea basin level (synthesis of monitoring data adequacy including expert comments), ending with the production of a yearly Data Adequacy Report.
SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.
Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani
2016-01-01
Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained devices resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture system is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and supported by Cloud infrastructure and services to allow high scalability, availability of processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed architecture's objectives, including resource awareness, smart data integration and visualization, cost reduction, and performance guarantee. Copyright © 2015 Elsevier Ltd. All rights reserved.
De Beer, T R M; Allesø, M; Goethals, F; Coppens, A; Heyden, Y Vander; De Diego, H Lopez; Rantanen, J; Verpoort, F; Vervaet, C; Remon, J P; Baeyens, W R G
2007-11-01
The aim of the present study was to propose a strategy for the implementation of a Process Analytical Technology system in freeze-drying processes. Mannitol solutions, some of them supplied with NaCl, were used as models to freeze-dry. Noninvasive and in-line Raman measurements were continuously performed during lyophilization of the solutions to monitor real time the mannitol solid state, the end points of the different process steps (freezing, primary drying, secondary drying), and physical phenomena occurring during the process. At-line near-infrared (NIR) and X-ray powder diffractometry (XRPD) measurements were done to confirm the Raman conclusions and to find out additional information. The collected spectra during the processes were analyzed using principal component analysis and multivariate curve resolution. A two-level full factorial design was used to study the significant influence of process (freezing rate) and formulation variables (concentration of mannitol, concentration of NaCl, volume of freeze-dried sample) upon freeze-drying. Raman spectroscopy was able to monitor (i) the mannitol solid state (amorphous, alpha, beta, delta, and hemihydrate), (ii) several process step end points (end of mannitol crystallization during freezing, primary drying), and (iii) physical phenomena occurring during freeze-drying (onset of ice nucleation, onset of mannitol crystallization during the freezing step, onset of ice sublimation). NIR proved to be a more sensitive tool to monitor sublimation than Raman spectroscopy, while XRPD helped to unravel the mannitol hemihydrate in the samples. The experimental design results showed that several process and formulation variables significantly influence different aspects of lyophilization and that both are interrelated. Raman spectroscopy (in-line) and NIR spectroscopy and XRPD (at-line) not only allowed the real-time monitoring of mannitol freeze-drying processes but also helped (in combination with experimental design) us to understand the process.
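As a rough illustration of the chemometric step (principal component analysis applied to a time series of spectra so that step end points and phase transitions appear as changes in the score trajectory), the sketch below runs PCA on simulated two-band spectra. The spectra, preprocessing, and detection threshold are assumptions and do not reproduce the study's Raman data or its multivariate curve resolution analysis.

```python
# Minimal sketch: PCA on a time series of simulated spectra; a transition from
# a "solution" band to a "crystalline" band shows up as a jump in the PC1 score.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_times, n_wavenumbers = 60, 500
band_a = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 8.0) ** 2)  # "solution" band
band_b = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 320) / 8.0) ** 2)  # "crystalline" band
weight = np.clip((np.arange(n_times) - 20) / 10.0, 0, 1)               # transition after t=20
spectra = np.outer(1 - weight, band_a) + np.outer(weight, band_b)
spectra += 0.01 * rng.standard_normal(spectra.shape)                   # measurement noise

scores = PCA(n_components=2).fit_transform(spectra)
onset = int(np.argmax(np.abs(np.diff(scores[:, 0])) > 0.1))            # crude onset flag
print("detected transition near time index:", onset)
```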
CCSDS Spacecraft Monitor and Control Service Framework
NASA Technical Reports Server (NTRS)
Merri, Mario; Schmidt, Michael; Ercolani, Alessandro; Dankiewicz, Ivan; Cooper, Sam; Thompson, Roger; Symonds, Martin; Oyake, Amalaye; Vaughs, Ashton; Shames, Peter
2004-01-01
This CCSDS paper presents a reference architecture and service framework for spacecraft monitoring and control. It has been prepared by the Spacecraft Monitoring and Control working group of the CCSDS Mission Operations and Information Management Systems (MOIMS) area. In this context, Spacecraft Monitoring and Control (SM&C) refers to end-to-end services between on- board or remote applications and ground-based functions responsible for mission operations. The scope of SM&C includes: 1) Operational Concept: definition of an operational concept that covers a set of standard operations activities related to the monitoring and control of both ground and space segments. 2) Core Set of Services: definition of an extensible set of services to support the operational concept together with its information model and behaviours. This includes (non exhaustively) ground systems such as Automatic Command and Control, Data Archiving and Retrieval, Flight Dynamics, Mission Planning and Performance Evaluation. 3) Application-layer information: definition of the standard information set to be exchanged for SM&C purposes.
Development of a Pre-Prototype Power Assisted Glove End Effector for Extravehicular Activity
NASA Technical Reports Server (NTRS)
1986-01-01
The purpose of this program was to develop an EVA power tool which is capable of performing a variety of functions while at the same time increasing the EVA crewmember's effectiveness by reducing hand fatigue associated with gripping tools through a pressurized EMU glove. The Power Assisted Glove End Effector (PAGE) preprototype hardware met or exceeded all of its technical requirements and has incorporated acoustic feedback to allow the EVA crewmember to monitor motor loading and speed. If this tool is to be developed for flight use, several issues need to be addressed. These issues are listed.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 9 2011-07-01 2011-07-01 false Back-end process provisions-monitoring... Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.497 Back-end process... limitations. (a) An owner or operator complying with the residual organic HAP limitations in § 63.494(a)(1...
40 CFR 60.185 - Monitoring of operations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Primary Lead... reverberatory furnace, or sintering machine discharge end. The span of this system shall be set at 80 to 100... discharged into the atmosphere from any sintering machine, electric furnace or converter subject to § 60.183...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Conrad, Ryan C.; Keller, Daniel T.
The International Atomic Energy Agency (IAEA) deploys unattended monitoring systems to provide continuous monitoring of nuclear material within safeguarded facilities around the world. As the number of unattended monitoring instruments increases, the IAEA is challenged to become more efficient in the implementation of those systems. In 2010, the IAEA initiated the Front-End Electronics for Unattended Measurement (FEUM) project with the goals of greater flexibility in the interfaces to various sensors and data acquisition systems, and improved capabilities for remotely located sensors (e.g., where sensor and front-end electronics might be separated by tens of meters). In consultation with the IAEA, a technical evaluation of a candidate FEUM device produced by a commercial vendor is being performed. This evaluation is assessing the device against the IAEA’s original technical specifications and a broad range of important parameters that included sensor types, cable types, and industrial electromagnetic noise that can degrade signals from remotely located detectors. Testing has been performed in a laboratory and also in environments representative of IAEA deployments. The results are expected to inform the IAEA about where and how FEUM devices might be implemented in the field. Data and preliminary findings from the testing performed to date are presented.
Auinger, Andreas; Riedl, René; Kindermann, Harald; Helfert, Markus; Ocenasek, Helmuth
2017-01-01
Research has shown that physical activity is essential in the prevention and treatment of chronic diseases like cardiovascular disease (CVD). Smart wearables (e.g., smartwatches) are increasingly used to foster and monitor human behaviour, including physical activity. However, despite this increased usage, little evidence is available on the effects of smart wearables in behaviour change. The little research which is available typically focuses on the behaviour of healthy individuals rather than patients. In this study, we investigate the effects of using smart wearables by patients undergoing cardiac rehabilitation. A field experiment involving 29 patients was designed and participants were either assigned to the study group (N = 13 patients who finished the study and used a self-tracking device) or the control group (N = 16 patients who finished the study and did not use a device). For both groups data about physiological performance during cardiac stress test was collected at the beginning (baseline), in the middle (in week 6, at the end of the rehabilitation in the organized rehabilitation setting), and at the end of the study (after 12 weeks, at the end of the rehabilitation, including the organized rehabilitation plus another 6 weeks of self-organized rehabilitation). Comparing the physiological performance of both groups, the data showed significant differences. The participants in the study group not only maintained the same performance level as during the midterm examination in week 6, they improved performance even further during the six weeks that followed. The results presented in this paper provide evidence for positive effects of digital self-tracking by patients undergoing cardiac rehabilitation on performance of the cardiovascular system. In this way, our study provides novel insight about the effects of the use of smart wearables by CVD patients. Our findings have implications for the design of self-management approaches in a patient rehabilitation setting. In essence, the use of smart wearables can prolong the success of the rehabilitation outside of the organized rehabilitation setting. PMID:29020079
The Deep Space Network information system in the year 2000
NASA Technical Reports Server (NTRS)
Markley, R. W.; Beswick, C. A.
1992-01-01
The Deep Space Network (DSN), the largest, most sensitive scientific communications and radio navigation network in the world, is considered. Focus is made on the telemetry processing, monitor and control, and ground data transport architectures of the DSN ground information system envisioned for the year 2000. The telemetry architecture will be unified from the front-end area to the end user. It will provide highly automated monitor and control of the DSN, automated configuration of support activities, and a vastly improved human interface. Automated decision support systems will be in place for DSN resource management, performance analysis, fault diagnosis, and contingency management.
Algebraic Approaches for Scalable End-to-End Monitoring and Diagnosis
NASA Astrophysics Data System (ADS)
Zhao, Yao; Chen, Yan
The rigidity of the Internet architecture has led to a flourishing of research on end-to-end based systems. In this chapter, we describe a linear algebra-based end-to-end monitoring and diagnosis system. We first propose a tomography-based overlay monitoring system (TOM). Given n end hosts, TOM selectively monitors a basis set of O(n log n) paths out of all n(n - 1) end-to-end paths. Any end-to-end path can be written as a unique linear combination of paths in the basis set. Consequently, by monitoring loss rates for the paths in the basis set, TOM infers loss rates for all end-to-end paths. Furthermore, leveraging on the scalable measurements from the TOM system, we propose the Least-biased End-to-End Network Diagnosis (in short, LEND) system. We define a minimal identifiable link sequence (MILS) as a link sequence of minimal length whose properties can be uniquely identified from end-to-end measurements. LEND applies an algebraic approach to find out the MILSes and infers the properties of the MILSes efficiently. This also means the LEND system achieves the finest diagnosis granularity under the least biased statistical assumptions.
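The linear-algebraic core can be made concrete with a toy topology: after transforming loss to the additive metric x_l = -log(1 - loss_l) per link, each path's metric is a row of the routing matrix G times x, so monitoring a set of paths whose rows span G's row space determines every other path. The sketch below, with an assumed three-host topology, is illustrative only and is not the TOM or LEND implementation.

```python
# Toy illustration: infer an unmonitored path's loss rate from a basis of
# monitored paths, using the additive metric x = -log(1 - loss) per link.
import numpy as np

# Rows: paths A-B, B-C, A-C ; columns: links L1 (A-B) and L2 (B-C).
G = np.array([[1, 0],
              [0, 1],
              [1, 1]], dtype=float)

link_loss = np.array([0.02, 0.05])
x = -np.log(1 - link_loss)                 # additive link metrics
path_metrics = G @ x                       # what end-to-end probing would measure

basis_rows = [0, 1]                        # rank(G) = 2, so two monitored paths suffice
x_hat, *_ = np.linalg.lstsq(G[basis_rows], path_metrics[basis_rows], rcond=None)
inferred_ac_loss = 1 - np.exp(-(G[2] @ x_hat))
print(f"inferred A-C loss rate: {inferred_ac_loss:.4f}")   # ~= 1 - (1-0.02)*(1-0.05)
```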
Influence of tip end-plate on noise of small axial fan
NASA Astrophysics Data System (ADS)
Mao, Hongya; Wang, Yanping; Lin, Peifeng; Jin, Yingzi; Setoguchi, Toshiaki; Kim, Heuy Dong
2017-02-01
In this work, a tip end-plate is used to improve the noise performance of small axial fans. Both numerical simulations and experimental methods were adopted to study the fluid flow and noise level of axial fans. Four modified models and the prototype are simulated. Influences of the tip end-plate on the static characteristics, internal flow field and noise of small axial fans are analyzed. The results show that, relative to the prototype, the model with a tip end-plate of 2 mm width and changed length achieved the best noise performance. The overall sound pressure level of the model with the tip end-plate of 2 mm width and changed length is 2.4 dB less than that of the prototype at the monitoring point in the specified far field. It is found that the mechanism of noise reduction is the decrease of vorticity variation on the surface of the blades caused by the tip end-plate. Compared with the prototype, the static pressure of the model with the tip end-plate of 2 mm width and changed length at the design flow rate decreases by 2 Pa and the efficiency decreases by 0.8%. It is concluded that the method of adding a tip end-plate to the impeller blades has a positive influence on reducing noise, but it may diminish the static characteristics of a small axial fan to some extent.
Performance of an implantable impedance spectroscopy monitor using ZigBee
NASA Astrophysics Data System (ADS)
Bogónez-Franco, P.; Bayés-Genís, A.; Rosell, J.; Bragós, R.
2010-04-01
This paper presents the characterization measurements of an implantable bioimpedance monitor with ZigBee. Such measurements are done over RC networks, performing short- and long-term measurements, with and without mismatch in the electrodes, and varying the temperature and the RF range. The bioimpedance monitor will be used in organ monitoring through electrical impedance spectroscopy in the 100 Hz - 200 kHz range. The specific application is the study of the viability and evolution of engineered tissue in cardiac regeneration in an experimental protocol with pig models. The bioimpedance monitor includes a ZigBee transceiver to transmit the measured data outside the animal chest. The bioimpedance monitor is based on the 12-bit Impedance Converter and Network Analyzer AD5933, improved with an analog front-end that implements a 4-electrode measurement structure and allows small impedances to be measured. In the debugging prototype, the system autonomy exceeds 1 month when a 14-frequency impedance spectrum is acquired every 5 minutes. The receiver side consists of a ZigBee transceiver connected to a PC to process the received data. In the current implementation, the effective range of the RF link was a few centimeters, requiring a range extender placed close to the animal. We have increased it by using an antenna with higher gain. Basic errors in the phantom circuit parameter estimation after model fitting are below 1%.
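One plausible way to summarize each 14-frequency spectrum, sketched below, is to fit a simple two-resistor, one-capacitor tissue model, Z(w) = Rinf + (R0 - Rinf)/(1 + j*w*tau), to the measured magnitudes. The model choice, frequency grid, and noise level are assumptions for illustration; they are not the monitor's documented processing chain.

```python
# Minimal sketch: fit a 2R-1C dispersion model to simulated impedance
# magnitudes over the 100 Hz - 200 kHz band (14 points). Values are invented.
import numpy as np
from scipy.optimize import curve_fit

def z_mag(f_hz, r0, r_inf, tau):
    w = 2 * np.pi * f_hz
    return np.abs(r_inf + (r0 - r_inf) / (1 + 1j * w * tau))

freqs = np.logspace(2, np.log10(2e5), 14)              # 100 Hz .. 200 kHz
true = z_mag(freqs, r0=120.0, r_inf=40.0, tau=1e-5)
noisy = true * (1 + 0.01 * np.random.default_rng(1).standard_normal(freqs.size))

params, _ = curve_fit(z_mag, freqs, noisy, p0=[100.0, 50.0, 5e-6])
print("fitted R0, Rinf, tau:", params)
```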
Helium gas purity monitor based on low frequency acoustic resonance
NASA Astrophysics Data System (ADS)
Kasthurirengan, S.; Jacob, S.; Karunanithi, R.; Karthikeyan, A.
1996-05-01
Monitoring gas purity is an important aspect of gas recovery stations where air is usually one of the major impurities. Purity monitors of Katherometric type are commercially available for this purpose. Alternatively, we discuss here a helium gas purity monitor based on acoustic resonance of a cavity at audio frequencies. It measures the purity by monitoring the resonant frequency of a cylindrical cavity filled with the gas under test and excited by conventional telephone transducers fixed at the ends. The use of the latter simplifies the design considerably. The paper discusses the details of the resonant cavity and the electronic circuit along with temperature compensation. The unit has been calibrated with helium gas of known purities. The unit has a response time of the order of 10 minutes and measures the gas purity to an accuracy of 0.02%. The unit has been installed in our helium recovery system and is found to perform satisfactorily.
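The measurement principle lends itself to a short worked example: the cavity's resonant frequency scales with the speed of sound, c = sqrt(gamma*R*T/M), which shifts as air dilutes the helium. The cavity length, mode number, and mixture property values below are assumptions for illustration, not the instrument's calibration.

```python
# Worked example: how the fundamental resonance of a gas-filled cavity shifts
# with helium purity. Cavity length and mode are assumed; mixture properties
# use mole-fraction-weighted molar heat capacities and masses.
import math

R = 8.314            # J/(mol K)
T = 293.15           # K
L = 0.30             # m, assumed cavity length
MODE = 1             # fundamental longitudinal mode, f = c / (2 L)

# (molar mass kg/mol, molar Cp J/(mol K)) for helium and air
GASES = {"He": (4.0026e-3, 20.8), "air": (28.97e-3, 29.1)}

def sound_speed(x_he):
    cp = x_he * GASES["He"][1] + (1 - x_he) * GASES["air"][1]
    m = x_he * GASES["He"][0] + (1 - x_he) * GASES["air"][0]
    gamma = cp / (cp - R)
    return math.sqrt(gamma * R * T / m)

def resonant_freq(x_he):
    return MODE * sound_speed(x_he) / (2 * L)

for purity in (1.00, 0.999, 0.99):
    print(f"He fraction {purity:.3f}: f = {resonant_freq(purity):.1f} Hz")
```

With these assumed numbers, a 1% air impurity shifts the resonance by tens of hertz, which is consistent with the instrument being able to resolve purity changes on the order of 0.02%.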
Travel reliability inventory for Chicago.
DOT National Transportation Integrated Search
2013-04-01
The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance : of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from : ...
NASA Astrophysics Data System (ADS)
Wang, Hong; Cao, Xiaojian; Jia, Ke; Chai, Xueting; Lu, Hua; Lu, Zuhong
2001-10-01
A fiber optic fluorescence biosensor for choline is introduced in this paper. Choline is an important neurotransmitter in mammals. Due to the growing need for on-site clinical monitoring of choline, much effort has been devoted to developing choline biosensors. Fiber-optic fluorescence biosensors have many advantages, including miniaturization, flexibility, and lack of electrical contact and interference. The choline fiber-optic biosensor we designed implements a bifurcated fiber to perform fluorescence measurements. The light of a blue LED is coupled into one end of the fiber as excitation and the emission spectrum from the sensing film is monitored by a fiber spectrometer (S2000, Ocean Optics) through the other end of the fiber. The sensing end of the fiber is coated with a Nafion film dispersed with choline oxidase and an oxygen-sensitive luminescent Ru(II) complex (Tris(2,2'-bipyridyl)dichlororuthenium(II), hexahydrate). Choline oxidase catalyzes the oxidation of choline to betaine and hydrogen peroxide while consuming oxygen. The fluorescence intensity of the oxygen-sensitive Ru(II) is related to the choline concentration. The response of the fiber-optic sensor in choline solution is presented and discussed. The result indicates a low-cost, high-performance, portable choline biosensor.
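One common calibration model for this kind of Ru(II) oxygen optode, assumed here rather than stated in the paper, is Stern-Volmer quenching, I0/I = 1 + Ksv*[O2], combined with a pre-recorded choline-versus-steady-state-oxygen curve. The sketch below shows that mapping with invented constants; the actual sensor characterization is not reproduced.

```python
# Hypothetical calibration chain: measured intensity -> dissolved oxygen via an
# assumed Stern-Volmer relation -> choline concentration via a recorded curve.
import numpy as np

KSV = 4.0          # 1/(mM O2), assumed Stern-Volmer constant
I0 = 1000.0        # intensity at zero oxygen (arbitrary units)
O2_AIR = 0.26      # mM, approximate dissolved oxygen of an air-saturated buffer

def intensity_from_o2(o2_mm):
    return I0 / (1 + KSV * o2_mm)

def o2_from_intensity(intensity):
    return (I0 / intensity - 1) / KSV

# Hypothetical calibration: each mM of choline lowers steady-state O2 by ~0.05 mM
choline_cal = np.array([0.0, 0.5, 1.0, 2.0, 3.0])          # mM
o2_cal = O2_AIR - 0.05 * choline_cal

measured = 540.0                                            # a new intensity reading
o2 = o2_from_intensity(measured)
choline = np.interp(o2, o2_cal[::-1], choline_cal[::-1])    # invert monotone calibration
print(f"estimated choline: {choline:.2f} mM")
```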
NASA Astrophysics Data System (ADS)
Bogena, H. R.; Huisman, S.; Rosenbaum, U.; Wuethen, A.; Vereecken, H.
2009-04-01
Wireless sensor network technology allows near real-time monitoring of soil properties with a high spatial and temporal resolution for observing hydrological processes in small watersheds. The novel wireless sensor network SoilNet uses the low-cost ZigBee radio network for communication and a hybrid topology with a mixture of underground end devices, each wired to several soil sensors, and aboveground router devices. The SoilNet sensor network consists of soil water content, salinity and temperature sensors attached to end devices by cables, router devices and a coordinator device. The end devices are buried in the soil and linked wirelessly with nearby aboveground router devices. This ZigBee network design considers channel errors, delays, packet losses, and power and topology constraints. In order to conserve battery power, a reactive routing protocol is used that determines a new route only when it is required. The sensor network is also able to react to external influences, e.g. the occurrence of precipitation. The SoilNet coordinator, router and end devices have been developed by Forschungszentrum Juelich and will be marketed through external companies. Simultaneously, we have also developed a data management and visualisation system. Recently, the small forest catchment Wüstebach (27 ha) was instrumented with 50 end devices and more than 400 soil sensors within the framework of the TERENO-RUR hydrological observatory. We will present the first results of this large sensor network, both in terms of spatio-temporal variations in soil water content and the performance of the sensor network (e.g. network stability and power use).
Acceptance Testing of a Satellite SCADA Photovoltaic-Diesel Hybrid System
NASA Technical Reports Server (NTRS)
Kalu, A.; Emrich, C.; Ventre, G.; Wilson, W.; Acosta, Roberto (Technical Monitor)
2000-01-01
Satellite Supervisory Control and Data Acquisition (SCADA) of a Photovoltaic (PV)/diesel hybrid system was tested using NASA's Advanced Communication Technology Satellite (ACTS) and Ultra Small Aperture Terminal (USAT) ground stations. The setup consisted of a custom-designed PV/diesel hybrid system, located at the Florida Solar Energy Center (FSEC), which was controlled and monitored at a "remote" hub via a Ka-band satellite link connecting two 1/4 Watt USATs in a SCADA arrangement. The robustness of the communications link was tested for remote monitoring of the health and performance of a PV/diesel hybrid system, and for investigating load control and battery charging strategies to maximize battery capacity and lifetime and minimize the loss-of-critical-load probability. Baseline hardware performance test results demonstrated that continuous two-second data transfers can be accomplished under clear sky conditions with an error rate of less than 1%. The delay introduced by the satellite (1/4 sec) was transparent to the synchronization of the satellite modems as well as to the PV/diesel hybrid computer. End-to-end communications link recovery times were less than 36 seconds for loss of power and less than one second for loss of link. The system recovered by resuming operation without any manual intervention, which is important since the 4 dB margin is not sufficient to prevent loss of the satellite link during moderate to heavy rain. Hybrid operations during loss of communications link continued seamlessly, but real-time monitoring was interrupted. For this sub-tropical region, the estimated amount of time that the signal fade will exceed the 4 dB margin is about 10%. These results suggest that data rates of 4800 bps and a link margin of 4 dB with a 1/4 Watt transmitter are sufficient for end-to-end operation in this SCADA application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lumsdaine, Andrew
2013-03-08
The main purpose of the Coordinated Infrastructure for Fault Tolerance in Systems initiative has been to conduct research with the goal of providing end-to-end fault tolerance on a system-wide basis for applications and other system software. While fault tolerance has been an integral part of most high-performance computing (HPC) system software developed over the past decade, it has been treated mostly as a collection of isolated stovepipes. Visibility and response to faults have typically been limited to the particular hardware and software subsystems in which they are initially observed. Little fault information is shared across subsystems, allowing little flexibility or control on a system-wide basis and making it practically impossible to provide cohesive end-to-end fault tolerance in support of scientific applications. As an example, consider faults such as communication link failures that can be seen by a network library but are not directly visible to the job scheduler, or consider faults related to node failures that can be detected by system monitoring software but are not inherently visible to the resource manager. If information about such faults could be shared by the network libraries or monitoring software, then other system software, such as a resource manager or job scheduler, could ensure that failed nodes or failed network links were excluded from further job allocations and that further diagnosis could be performed. As a founding member and one of the lead developers of the Open MPI project, our efforts over the course of this project have been focused on making Open MPI more robust to failures by supporting various fault tolerance techniques, and on using fault information exchange and coordination between MPI and the HPC system software stack, from the application, numeric libraries, and programming language runtime to other common system components such as job schedulers, resource managers, and monitoring tools.
Ultrasound monitoring of the treatment of clinically significant knee osteoarthritis.
Vojtassak, J; Vojtassak, J
2014-01-01
The study presented ultrasound (US) monitoring of treatment as a new imaging US method, together with the results of therapy of clinically significant knee osteoarthritis. X-ray is widely used for knee osteoarthritis classification, but it does not allow evaluation of the soft tissue. High-frequency, high-resolution US of joints (arthrosonography, echoarthrography) assesses not only morphologic but also functional changes in the knee joint. In this prospective study, 110 patients with clinically significant knee osteoarthritis were treated non-operatively. US examination and US monitoring of therapy were performed during the 24-week therapy period. Remission of pathomorphologic (marginal osteophytes) and pathophysiologic (effusion in the anterior knee and Baker's cyst) attributes was evaluated according to the US classification. The pathomorphologic attributes showed a static state, without remission or progression. The pathophysiologic attributes showed remission during the study period. The greatest remission occurred in the first three weeks: 60% for anterior knee effusion and 62% for Baker's cyst. At the end of the study, no change from the initial US grade was observed in 16% of anterior knee effusions and 22% of Baker's cysts. A therapy-resistant Baker's cyst was present at the end of the study in 36%. We demonstrated a new method, US monitoring of therapy, which can objectively document the efficiency of treatment of clinically significant knee osteoarthritis. We recommend US monitoring of therapy for routine use in orthopedic clinical practice (Tab. 6, Graph 3, Fig. 3, Ref. 15).
Monitoring of computing resource use of active software releases at ATLAS
NASA Astrophysics Data System (ADS)
Limosani, Antonio; ATLAS Collaboration
2017-10-01
The LHC is the world's most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has the demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage, in terms of CPU and RAM, in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows, from Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multi-process mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processes in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted, auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is, however, preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.
Hybrid wireless sensor network for rescue site monitoring after earthquake
NASA Astrophysics Data System (ADS)
Wang, Rui; Wang, Shuo; Tang, Chong; Zhao, Xiaoguang; Hu, Weijian; Tan, Min; Gao, Bowei
2016-07-01
This paper addresses the design of a low-cost, low-complexity, and rapidly deployable wireless sensor network (WSN) for rescue site monitoring after earthquakes. The system structure of the hybrid WSN is described. Specifically, the proposed hybrid WSN consists of two kinds of wireless nodes, i.e., the monitor node and the sensor node. The mechanism and the system configuration of the wireless nodes are then detailed. A transmission control protocol (TCP)-based request-response scheme is proposed to allow several monitor nodes to communicate with the monitoring center. UDP-based image transmission algorithms with fast recovery have been developed to meet the requirements of in-time delivery of on-site monitor images. In addition, the monitor node contains a ZigBee module that is used to communicate with the sensor nodes, which are designed with small dimensions to monitor the environment by sensing different physical properties in narrow spaces. By building a WSN using these wireless nodes, the monitoring center can display real-time monitor images of the monitoring area and visualize all collected sensor data on geographic information systems. Finally, field experiments were performed at the Training Base of Emergency Seismic Rescue Troops of China, and the experimental results demonstrate the feasibility and effectiveness of the monitoring system.
Passive and Active Monitoring on a High Performance Research Network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Warren
2001-05-01
The bold network challenges described in ''Internet End-to-end Performance Monitoring for the High Energy and Nuclear Physics Community'', presented at PAM 2000, have been tackled by the intrepid administrators and engineers providing the network services. After less than a year, the BaBar collaboration has collected almost 100 million particle collision events in a database approaching 165 TB (tera = 10^12). Around 20 TB has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, for processing, and around 40 TB of simulated events have been imported to SLAC from Lawrence Livermore National Laboratory (LLNL). An unforeseen challenge has arisen due to recent events and highlighted security concerns at DoE funded labs. New rules and regulations suggest it is only a matter of time before many active performance measurements may not be possible between many sites. Yet, at the same time, the importance of understanding every aspect of the network and eradicating packet loss for high throughput data transfers has become apparent. Work at SLAC to employ passive monitoring using netflow and OC3MON is underway, and techniques to supplement and possibly replace the active measurements are being considered. This paper will detail the special needs and traffic characterization of a remarkable research project, and how the networking hurdles have been resolved (or not!) to achieve the required high data throughput. Results from active and passive measurements will be compared, and methods for achieving high throughput and the effect on the network will be assessed along with tools that directly measure throughput and applications used to actually transfer data.
Optimized Two-Party Video Chat with Restored Eye Contact Using Graphics Hardware
NASA Astrophysics Data System (ADS)
Dumont, Maarten; Rogmans, Sammy; Maesen, Steven; Bekaert, Philippe
We present a practical system prototype to convincingly restore eye contact between two video chat participants, with a minimal amount of constraints. The proposed six-fold camera setup is easily integrated into the monitor frame and is used to interpolate an image as if its virtual camera captured the image through a transparent screen. The peer user has a large freedom of movement, resulting in system specifications that enable genuine practical usage. Our software framework thereby harnesses the powerful computational resources inside graphics hardware and maximizes arithmetic intensity to achieve better-than-real-time performance of up to 42 frames per second for 800×600 resolution images. Furthermore, an optimal set of fine-tuned parameters is presented that optimizes the end-to-end performance of the application to achieve high subjective visual quality, while still allowing for further algorithmic advancement without losing its real-time capabilities.
SAR calibration technology review
NASA Technical Reports Server (NTRS)
Walker, J. L.; Larson, R. W.
1981-01-01
Synthetic Aperture Radar (SAR) calibration technology is reviewed, including a general description of the primary calibration techniques and some of the factors which affect the performance of calibrated SAR systems. The use of reference reflectors for measurement of the total system transfer function, along with an on-board calibration signal generator for monitoring the temporal variations of the receiver to processor output, is a practical approach for SAR calibration. However, preliminary error analysis and previous experimental measurements indicate that reflectivity measurement accuracies of better than 3 dB will be difficult to achieve. This is not adequate for many applications and, therefore, improved end-to-end SAR calibration techniques are required.
Front-end Electronics for Unattended Measurement (FEUM). Results of Prototype Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conrad, Ryan C.; Keller, Daniel T.; Morris, Scott J.
2015-07-01
The International Atomic Energy Agency (IAEA) deploys unattended monitoring systems to provide continuous monitoring of nuclear material within safeguarded facilities around the world. As the number of unattended monitoring instruments increases, the IAEA is challenged to become more efficient in the implementation of those systems. In 2010, the IAEA initiated the Front-End Electronics for Unattended Measurement (FEUM) project with the goals of greater flexibility in the interfaces to various sensors and data acquisition systems, and improved capabilities for remotely located sensors (e.g., where sensor and front-end electronics might be separated by tens of meters). In consultation with the IAEA, a technical evaluation of a candidate FEUM device produced by a commercial vendor has been performed. This evaluation assessed the device against the IAEA's original technical specifications and a broad range of important parameters that include sensor types, cable lengths and types, industrial electromagnetic noise that can degrade signals from remotely located detectors, and high radiation fields. Testing data, interpretation, findings and recommendations are provided.
Sealing ability of MTA, CPM, and MBPc as root-end filling materials: a bacterial leakage study.
Medeiros, Paulo Leal; Bernardineli, Norberti; Cavenago, Bruno Cavalini; Torres, Sérgio Aparecido; Duarte, Marco Antonio Hungaro; Bramante, Clovis Monteiro; Marciano, Marina Angélica
2016-04-01
Objectives: To evaluate the sealing ability of three root-end filling materials (white MTA, CPM, and MBPc) using an Enterococcus faecalis leakage model. Material and Methods: Seventy single-root extracted human teeth were instrumented and root-ends were resected to prepare 3 mm depth cavities. Root-end preparations were filled with white MTA, CPM, and MBPc cements. Enterococcus faecalis was coronally introduced and the apical portion was immersed in BHI culture medium with phenol red indicator. The bacterial leakage was monitored every 24 h for 4 weeks. The statistical analysis was performed using the Wilcoxon-Gehan test (p<0.05). Results: All cements showed bacterial leakage after 24 hours, except for the negative control group. The MBPc showed significantly less bacterial leakage compared with the MTA group (p<0.05). No significant differences were found between the CPM and the other groups. Conclusions: The epoxy resin-based cement MBPc had lower bacterial leakage compared with the calcium silicate-based cements MTA and CPM.
Issues in implementing a knowledge-based ECG analyzer for personal mobile health monitoring.
Goh, K W; Kim, E; Lavanya, J; Kim, Y; Soh, C B
2006-01-01
Advances in sensor technology, personal mobile devices, and wireless broadband communications are enabling the development of an integrated personal mobile health monitoring system that can provide patients with a useful tool to assess their own health and manage their personal health information anytime and anywhere. Personal mobile devices, such as PDAs and mobile phones, are becoming more powerful integrated information management tools and play a major role in many people's lives. We focus on designing a health-monitoring system for people who suffer from cardiac arrhythmias. We have developed computer simulation models to evaluate the performance of appropriate electrocardiogram (ECG) analysis techniques that can be implemented on personal mobile devices. This paper describes an ECG analyzer to perform ECG beat and episode detection and classification. We have obtained promising preliminary results from our study. Also, we discuss several key considerations when implementing a mobile health monitoring solution. The mobile ECG analyzer would become a front-end patient health data acquisition module, which is connected to the Personal Health Information Management System (PHIMS) for data repository.
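For context on the kind of computation such an analyzer performs on-device, here is a minimal, generic energy-threshold beat detector. It is not the knowledge-based ECG analyzer described above; the sampling rate, window length, and threshold rule are assumptions chosen for illustration.

```python
# Minimal sketch of threshold-based QRS (beat) detection on a single-lead ECG,
# in the spirit of lightweight analysis suitable for a mobile device. This is a
# generic energy-threshold detector, not the knowledge-based analyzer described
# in the paper; the sampling rate and thresholds are assumptions.
import numpy as np

def detect_beats(ecg, fs=250.0, refractory_s=0.25):
    """Return sample indices of detected beats."""
    # Emphasize the QRS slope: differentiate, square, then smooth (~150 ms window).
    diff = np.diff(ecg, prepend=ecg[0])
    energy = diff ** 2
    win = max(1, int(0.150 * fs))
    smoothed = np.convolve(energy, np.ones(win) / win, mode="same")

    threshold = 0.5 * smoothed.max()          # crude fixed threshold for the sketch
    refractory = int(refractory_s * fs)       # ignore re-triggers within 250 ms
    beats, last = [], -refractory
    for i, v in enumerate(smoothed):
        if v > threshold and i - last >= refractory:
            beats.append(i)
            last = i
    return np.array(beats)

# Example on a synthetic ECG-like trace: one sharp "R wave" per second plus noise
fs = 250.0
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.random.randn(t.size)
ecg[(t % 1.0) < (1 / fs)] += 1.0
print(len(detect_beats(ecg, fs)))             # expect ~10 beats
```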
Service Management Database for DSN Equipment
NASA Technical Reports Server (NTRS)
Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed
2009-01-01
This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
Cho, Eunsoo; Capin, Philip; Roberts, Greg; Vaughn, Sharon
2017-07-01
Within multitiered instructional delivery models, progress monitoring is a key mechanism for determining whether a child demonstrates an adequate response to instruction. One measure commonly used to monitor the reading progress of students is oral reading fluency (ORF). This study examined the extent to which ORF slope predicts reading comprehension outcomes for fifth-grade struggling readers ( n = 102) participating in an intensive reading intervention. Quantile regression models showed that ORF slope significantly predicted performance on a sentence-level fluency and comprehension assessment, regardless of the students' reading skills, controlling for initial ORF performance. However, ORF slope was differentially predictive of a passage-level comprehension assessment based on students' reading skills when controlling for initial ORF status. Results showed that ORF explained unique variance for struggling readers whose posttest performance was at the upper quantiles at the end of the reading intervention, but slope was not a significant predictor of passage-level comprehension for students whose reading problems were the most difficult to remediate.
Lopes Antunes, Ana Carolina; Dórea, Fernanda; Halasa, Tariq; Toft, Nils
2016-05-01
Surveillance systems are critical for accurate, timely monitoring and effective disease control. In this study, we investigated the performance of univariate process monitoring control algorithms in detecting changes in seroprevalence for endemic diseases. We also assessed the effect of sample size (number of sentinel herds tested in the surveillance system) on the performance of the algorithms. Three univariate process monitoring control algorithms were compared: the Shewhart p chart (PSHEW), the cumulative sum (CUSUM) and the exponentially weighted moving average (EWMA). Increases in seroprevalence were simulated from 0.10 to 0.15 and 0.20 over 4, 8, 24, 52 and 104 weeks. Each epidemic scenario was run with 2000 iterations. The cumulative sensitivity (CumSe) and timeliness were used to evaluate the algorithms' performance with a 1% false alarm rate. Using these performance evaluation criteria, it was possible to assess the accuracy and timeliness of the surveillance system working in real time. The results showed that EWMA and PSHEW had higher CumSe (compared with CUSUM) from week 1 until the end of the period for all simulated scenarios. Changes in seroprevalence from 0.10 to 0.20 were more easily detected (higher CumSe) than changes from 0.10 to 0.15 for all three algorithms. Similar results were found with EWMA and PSHEW, based on the median time to detection. Changes in the seroprevalence were detected later with CUSUM, compared to EWMA and PSHEW, for the different scenarios. Increasing the sample size 10-fold halved the time to detection (CumSe = 1), whereas increasing the sample size 100-fold reduced the time to detection by a factor of 6. This study investigated the performance of three univariate process monitoring control algorithms in monitoring endemic diseases. It was shown that automated systems based on these detection methods identified changes in seroprevalence at different times. Increasing the number of tested herds would lead to faster detection; however, the practical implications of increasing the sample size (such as the costs associated with the disease) should also be taken into account. Copyright © 2016 Elsevier B.V. All rights reserved.
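As a concrete illustration of one of the three algorithms compared here, the sketch below applies an EWMA control chart to simulated weekly seroprevalence estimates from sentinel herds. The smoothing weight, control-limit width, herd count, and drift scenario are illustrative assumptions, not the study's parameters.

```python
# Minimal sketch of an EWMA control chart applied to weekly seroprevalence
# estimates. The smoothing weight, control limit width, and simulated scenario
# are illustrative choices, not the parameters used by the authors.
import numpy as np

def ewma_alarms(p_hat, baseline, sigma, lam=0.2, L=2.7):
    """Return indices where the EWMA statistic exceeds its upper control limit."""
    z, alarms = baseline, []
    ucl = baseline + L * sigma * np.sqrt(lam / (2 - lam))   # steady-state limit
    for t, x in enumerate(p_hat):
        z = lam * x + (1 - lam) * z
        if z > ucl:
            alarms.append(t)
    return alarms

# Simulated scenario: prevalence drifts from 0.10 to 0.20 over 24 weeks,
# estimated each week from n sentinel herds (binomial sampling noise).
rng = np.random.default_rng(1)
n, weeks = 100, 52
true_p = np.concatenate([np.linspace(0.10, 0.20, 24), np.full(weeks - 24, 0.20)])
p_hat = rng.binomial(n, true_p) / n

sigma0 = np.sqrt(0.10 * 0.90 / n)            # sampling std at the baseline prevalence
alarms = ewma_alarms(p_hat, baseline=0.10, sigma=sigma0)
print("first alarm at week:", alarms[0] if alarms else None)
```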
De Beer, T R M; Vercruysse, P; Burggraeve, A; Quinten, T; Ouyang, J; Zhang, X; Vervaet, C; Remon, J P; Baeyens, W R G
2009-09-01
The aim of the present study was to examine the complementary properties of Raman and near-infrared (NIR) spectroscopy as PAT tools for the fast, noninvasive, nondestructive and in-line process monitoring of a freeze-drying process. Therefore, Raman and NIR probes were built into the freeze-dryer chamber, allowing simultaneous process monitoring. A 5% (w/v) mannitol solution was used as the model for freeze drying. Raman and NIR spectra were continuously collected during freeze drying (one Raman and one NIR spectrum per minute) and the spectra were analyzed using principal component analysis (PCA) and multivariate curve resolution (MCR). Raman spectroscopy was able to supply information about (i) the mannitol solid state throughout the entire process, (ii) the endpoint of freezing (endpoint of mannitol crystallization), and (iii) several physical and chemical phenomena occurring during the process (onset of ice nucleation, onset of mannitol crystallization). NIR spectroscopy proved to be a more sensitive tool to monitor the critical aspects during drying: (i) the endpoint of ice sublimation and (ii) the release of hydrate water during storage. Furthermore, NIR spectroscopy confirmed some of the Raman observations: the start of ice nucleation, the end of mannitol crystallization and the solid-state characteristics of the end product. When Raman and NIR monitoring were performed on the same vial, the Raman signal was saturated during the freezing step, caused by reflected NIR light reaching the Raman detector. Therefore, NIR and Raman measurements were done on different vials. The importance of the position of the probes (Raman probe above the vial and NIR probe at the bottom of the sidewall of the vial) for obtaining all required critical information is also outlined. Combining Raman and NIR spectroscopy for the simultaneous monitoring of freeze drying allows almost all critical freeze-drying process aspects to be monitored. The two techniques not only complement each other, they also provide mutual confirmation of specific conclusions.
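To make the in-line trend extraction concrete, the sketch below runs a plain PCA (via SVD) on a synthetic stack of time-ordered spectra and flags the point where the leading score starts to move, a crude stand-in for detecting an event such as the end of crystallization. The synthetic spectra, component shapes, and change threshold are assumptions for illustration only, not the authors' processing.

```python
# Sketch of trend extraction from time-ordered in-line spectra: PCA scores on the
# first component are tracked over the run to flag a spectral change. The data
# below are synthetic; shapes and thresholds are assumptions, not the paper's
# processing pipeline.
import numpy as np

def pca_scores(spectra, n_components=2):
    """Mean-center the spectra (rows = time points) and return PCA scores."""
    X = spectra - spectra.mean(axis=0, keepdims=True)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :n_components] * S[:n_components]      # one row of scores per spectrum

# Synthetic run: 120 one-per-minute spectra over 500 channels in which a second
# spectral component grows in halfway through (a crude stand-in for a solid-state
# change such as crystallization).
rng = np.random.default_rng(0)
channels = np.linspace(0, 1, 500)
comp_a = np.exp(-((channels - 0.3) / 0.05) ** 2)
comp_b = np.exp(-((channels - 0.6) / 0.05) ** 2)
weights = np.clip(np.linspace(-1, 2, 120), 0, 1)       # component B appears mid-run
spectra = np.outer(1 - weights, comp_a) + np.outer(weights, comp_b)
spectra += 0.01 * rng.standard_normal(spectra.shape)

scores = pca_scores(spectra)
onset = int(np.argmax(np.abs(np.diff(scores[:, 0])) > 0.05))   # crude change point
print("score-1 change detected near spectrum:", onset)
```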
A Procedural Electroencephalogram Simulator for Evaluation of Anesthesia Monitors.
Petersen, Christian Leth; Görges, Matthias; Massey, Roslyn; Dumont, Guy Albert; Ansermino, J Mark
2016-11-01
Recent research and advances in the automation of anesthesia are driving the need to better understand electroencephalogram (EEG)-based anesthesia end points and to test the performance of anesthesia monitors. This effort is currently limited by the need to collect raw EEG data directly from patients. A procedural method to synthesize EEG signals was implemented in a mobile software application. The application is capable of sending the simulated signal to an anesthesia depth of hypnosis monitor. Systematic sweeps of the simulator generate functional monitor response profiles reminiscent of how network analyzers are used to test electronic components. Three commercial anesthesia monitors (Entropy, NeuroSENSE, and BIS) were compared with this new technology, and significant response and feature variations between the monitor models were observed; this includes reproducible, nonmonotonic apparent multistate behavior and significant hysteresis at light levels of anesthesia. Anesthesia monitor response to a procedural simulator can reveal significant differences in internal signal processing algorithms. The ability to synthesize EEG signals at different anesthetic depths potentially provides a new method for systematically testing EEG-based monitors and automated anesthesia systems with all sensor hardware fully operational before human trials.
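A procedural EEG can be sketched as band-limited noise whose band weights are interpolated along a depth parameter; sweeping that parameter is what lets a simulator trace out a monitor's response profile. The band weights, sampling rate, and delta/beta summary used below are illustrative assumptions, not the profiles implemented by the application described here.

```python
# Minimal sketch of procedural EEG synthesis: band-limited noise whose spectral
# weights interpolate between an "awake" and a "deep anesthesia" profile as a
# depth parameter moves from 0 to 1. The band weights and sampling rate are
# assumptions, not the simulator's actual profiles.
import numpy as np

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
AWAKE = {"delta": 0.2, "theta": 0.3, "alpha": 0.8, "beta": 1.0}   # assumed weights
DEEP  = {"delta": 1.0, "theta": 0.6, "alpha": 0.2, "beta": 0.1}

def synth_eeg(depth, seconds=10, fs=256, rng=None):
    """Synthesize an EEG-like trace for a depth-of-anesthesia parameter in [0, 1]."""
    rng = rng or np.random.default_rng()
    n = int(seconds * fs)
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    spectrum = np.zeros_like(freqs)
    for name, (lo, hi) in BANDS.items():
        weight = (1 - depth) * AWAKE[name] + depth * DEEP[name]
        spectrum[(freqs >= lo) & (freqs < hi)] = weight
    # Random phases give a different realization each call with the same spectrum.
    phases = np.exp(2j * np.pi * rng.random(freqs.size))
    return np.fft.irfft(spectrum * phases, n=n)

# Sweep the simulated depth and report the delta/beta power ratio, which should
# grow as the synthetic signal moves toward "deeper" anesthesia.
for depth in (0.0, 0.5, 1.0):
    x = synth_eeg(depth, rng=np.random.default_rng(0))
    f = np.fft.rfftfreq(x.size, d=1 / 256)
    p = np.abs(np.fft.rfft(x)) ** 2
    ratio = p[(f >= 0.5) & (f < 4)].sum() / p[(f >= 13) & (f < 30)].sum()
    print(f"depth={depth:.1f}  delta/beta power ratio={ratio:.2f}")
```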
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events are generated by the system components during execution or interaction with external objects (e.g., users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning, and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application for obtaining debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event filtering mechanisms that support monitoring of LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance, and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss the limitations of existing event filtering mechanisms and outline how our architecture improves key aspects of event filtering.
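The core of such an event filtering mechanism can be sketched as a subscription table of predicates evaluated close to where events are generated, so that only matching events are forwarded to the end-point management tools. The event fields and example predicates below are hypothetical and simply illustrate the idea; they are not the interface of the architecture described above.

```python
# Sketch of subscription-based event filtering: management tools register
# predicates, and only events that match some subscription are forwarded,
# reducing the event traffic that reaches the monitoring end-points. The event
# fields and predicates are illustrative, not the IRI/monitoring architecture's
# actual interface.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Event:
    source: str
    kind: str
    payload: dict

@dataclass
class EventFilter:
    # subscriber name -> list of predicates; an event is delivered if any matches
    subscriptions: Dict[str, List[Callable[[Event], bool]]] = field(default_factory=dict)

    def subscribe(self, subscriber: str, predicate: Callable[[Event], bool]) -> None:
        self.subscriptions.setdefault(subscriber, []).append(predicate)

    def dispatch(self, event: Event) -> List[str]:
        """Return the subscribers this event should be forwarded to (others never see it)."""
        return [name for name, preds in self.subscriptions.items()
                if any(p(event) for p in preds)]

# Example: a debugger only wants errors from one component; a tuning tool only
# wants latency samples above a threshold. Everything else is filtered out locally.
f = EventFilter()
f.subscribe("debugger", lambda e: e.kind == "error" and e.source == "audio-mixer")
f.subscribe("tuner", lambda e: e.kind == "latency" and e.payload.get("ms", 0) > 100)

print(f.dispatch(Event("audio-mixer", "error", {"code": 7})))      # ['debugger']
print(f.dispatch(Event("whiteboard", "latency", {"ms": 35})))      # []
```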
NASA Astrophysics Data System (ADS)
Seyler, F.; Bonnet, M.-P.; Calmant, S.; Cauhopé, M.; Cazenave, A.; Cochonneau, G.; Divol, J.; Do-Minh, K.; Frappart, F.; Gennero, M.-C.; Guyenne-Blin, K.; Huynh, F.; Leon, J. G.; Mangeas, M.; Mercier, F.; Rocquelain, GH.; Tocqueville, L.; Zanifé, O.-Z.
2006-07-01
CASH (« Contribution of spatial altimetry to hydrology ») aims at the definition of global, standard, fast and long-term access to a set of hydrological data concerning the largest river basins in the world. The key questions to be answered are: what are the conditions for monitoring river water stages from altimetric radar data, and how is it possible to combine altimetric data with other spatial sources and/or in-situ data in order to deliver useful parameters to the hydrology community, both scientific and end users. The CASH project ends in mid-May 2006, and many tasks remain to be performed before altimetric heights of continental water bodies become part of the day-to-day practice of scientific and end-user hydrologists. The project has nevertheless delineated the way this use could be improved in the near future, and opened very interesting perspectives for ungauged or poorly gauged large basins in the world.
Long-term performance monitoring of hardwood timber bridges in Pennsylvania
James P. Wacker; Carlito Calil; Lola E. Hislop; Paula D. Hilbrich Lee; James A. Kainz
2004-01-01
Several hardwood timber bridges were constructed in Pennsylvania during the early 1990s. This report summarizes the long-term field performance of seven stress-laminated deck bridges over a 4-year period beginning August 1997 and ending July 2001. Data collected include lumber moisture content, static load test deflection measurements, and bridge condition assessments...
High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.
2017-12-01
The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
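The basic measurement behind such noise source maps can be sketched as follows: cross-correlate pre-processed records from a station pair and quantify the causal/acausal energy asymmetry with a logarithmic amplitude ratio. The synthetic traces, lag window, and sign conventions below are illustrative assumptions and not the project's actual processing chain.

```python
# Sketch of the measurement underlying noise source imaging: cross-correlate two
# station records and compare the energy of the causal and acausal branches via
# a logarithmic amplitude ratio (an asymmetric source distribution skews energy
# toward one branch). Synthetic data; parameters are assumptions.
import numpy as np

def cross_correlate(a, b, max_lag):
    """Full cross-correlation of two equal-length traces, trimmed to +/- max_lag samples."""
    cc = np.correlate(a, b, mode="full")
    mid = len(a) - 1                      # index of zero lag in the 'full' output
    return cc[mid - max_lag: mid + max_lag + 1]

def log_amplitude_ratio(cc):
    """log of positive-lag branch energy over negative-lag branch energy (zero lag excluded)."""
    mid = len(cc) // 2
    return np.log(np.sum(cc[mid + 1:] ** 2) / np.sum(cc[:mid] ** 2))

# Synthetic example: a common noise source reaches station B later than station A,
# which shows up as an asymmetric, lag-shifted correlation peak.
rng = np.random.default_rng(0)
delay = 12                                # samples (assumed)
source = rng.standard_normal(6000)
sta_a = source + 0.3 * rng.standard_normal(6000)
sta_b = np.roll(source, delay) + 0.3 * rng.standard_normal(6000)

cc = cross_correlate(sta_a, sta_b, max_lag=50)
print(f"lag of peak: {np.argmax(np.abs(cc)) - 50} samples, "
      f"log amplitude ratio: {log_amplitude_ratio(cc):.2f}")
```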
HAZBOT - A hazardous materials emergency response mobile robot
NASA Technical Reports Server (NTRS)
Stone, H. W.; Edmonds, G.
1992-01-01
The authors describe the progress that has been made towards the development of a mobile robot that can be used by hazardous materials emergency response teams to perform a variety of tasks including incident localization and characterization, hazardous material identification/classification, site surveillance and monitoring, and ultimately incident mitigation. In September of 1991, the HAZBOT II vehicle performed its first end-to-end demonstration involving a scenario in which the vehicle: navigated to the incident location from a distant (150-200 ft.) deployment site; entered a building through a door with thumb latch style handle and door closer; located and navigated to the suspected incident location (a chemical storeroom); unlocked and opened the storeroom's door; climbed over the storeroom's 12 in. high threshold to enter the storeroom; and located and identified a broken container of benzene.
Detecting Abnormal Machine Characteristics in Cloud Infrastructures
NASA Technical Reports Server (NTRS)
Bhaduri, Kanishka; Das, Kamalika; Matthews, Bryan L.
2011-01-01
In the cloud computing environment resources are accessed as services rather than as a product. Monitoring this system for performance is crucial because of the typical pay-per-use packages bought by the users for their jobs. With the huge number of machines currently in the cloud system, it is often extremely difficult for system administrators to keep track of all machines using distributed monitoring programs such as Ganglia, which lacks system health assessment and summarization capabilities. To overcome this problem, we propose a technique for automated anomaly detection using machine performance data in the cloud. Our algorithm is entirely distributed and runs locally on each computing machine in the cloud in order to rank the machines in order of their anomalous behavior for given jobs. There is no need to centralize any of the performance data for the analysis, and at the end of the analysis our algorithm generates error reports, thereby allowing the system administrators to take corrective actions. Experiments performed on real data sets collected for different jobs validate the fact that our algorithm has a low overhead for tracking anomalous machines in a cloud infrastructure.
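A minimal sketch of the distributed idea, each machine scoring its own metrics locally and sharing only a scalar score for ranking, is given below. The z-score rule, metric set, and injected fault are illustrative assumptions, not the algorithm evaluated in the paper.

```python
# Sketch of locally computed anomaly scores used to rank machines: each node
# scores its own performance metrics against its recent history (no raw data is
# centralized), and only the scalar scores are exchanged for ranking. The metric
# names and the simple z-score rule are illustrative, not the paper's algorithm.
import numpy as np

def local_anomaly_score(history, current):
    """Mean absolute z-score of the current metric vector vs. this machine's history."""
    mu = history.mean(axis=0)
    sigma = history.std(axis=0) + 1e-9           # avoid division by zero
    return float(np.mean(np.abs((current - mu) / sigma)))

# Each machine evaluates itself; only (hostname, score) pairs need to be shared.
rng = np.random.default_rng(0)
machines = {}
for host in ("node01", "node02", "node03"):
    history = rng.normal(loc=[0.4, 0.5, 30.0], scale=[0.05, 0.05, 5.0], size=(200, 3))
    current = history[-1].copy()
    if host == "node02":                          # inject a CPU/memory/IO anomaly on one node
        current = np.array([0.95, 0.92, 80.0])
    machines[host] = local_anomaly_score(history, current)

ranking = sorted(machines.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)                                    # node02 should rank first
```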
Disaggregating Hot Water Use and Predicting Hot Water Waste in Five Test Homes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henderson, H.; Wade, J.
2014-04-01
While it is important to make the equipment (or 'plant') in a residential hot water system more efficient, the hot water distribution system also affects overall system performance and energy use. Energy wasted in heating water that is not used is estimated to be on the order of 10 to 30 percent of total domestic hot water (DHW) energy use. This field monitoring project installed temperature sensors on the distribution piping (on trunks and near fixtures) and programmed a data logger to collect data at 5-second intervals whenever there was a hot water draw. These data were used to assign hot water draws to specific end uses in the home as well as to determine the portion of each hot water draw that was deemed useful (i.e., above a temperature threshold at the fixture). Five houses near Syracuse, NY, were monitored. Overall, the procedures to assign water draws to each end use were able to successfully assign about 50% of the water draws, but these assigned draws accounted for about 95% of the total hot water use in each home. The amount of hot water deemed useful ranged from a low of 75% at one house to a high of 91% at another. At three of the houses, new water heaters and distribution improvements were implemented during the monitoring period, and the impact of these improvements on hot water use and delivery efficiency was evaluated.
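As an illustration of the "useful hot water" accounting described above, the sketch below classifies the 5-second fixture temperature samples of a single draw against a usefulness threshold and reports the useful volume fraction. The 105 F threshold, constant flow rate, and sample data are assumptions for illustration, not the project's actual criteria.

```python
# Sketch of the post-processing step described above: given 5-second temperature
# samples at a fixture during a hot water draw, count the fraction of the draw
# volume delivered above a usefulness threshold. The threshold, constant flow
# assumption, and sample data are illustrative, not the project's values.
import numpy as np

USEFUL_F = 105.0          # assumed "useful" delivery temperature threshold (deg F)

def useful_fraction(fixture_temps_f, interval_s=5.0, flow_gpm=1.5):
    """Fraction of the draw's hot water volume delivered at or above the threshold."""
    temps = np.asarray(fixture_temps_f, dtype=float)
    vol_per_sample = flow_gpm * interval_s / 60.0          # gallons per 5 s sample
    total = vol_per_sample * temps.size
    useful = vol_per_sample * np.count_nonzero(temps >= USEFUL_F)
    return useful / total if total else 0.0

# Example draw: the first ~30 s purges cooled water sitting in the branch piping.
draw = [72, 78, 85, 93, 101, 106, 112, 116, 118, 119, 119, 120]
print(f"useful fraction of this draw: {useful_fraction(draw):.0%}")
```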
Monitoring of Microbes in Drinking Water
Internationally there is a move towards managing the provision of safe drinking water by direct assessment of the performance of key pathogen barriers (critical control points), rather than end-point testing (i.e. in drinking water). For fecal pathogens that break through the vari...
Enabling end-user network monitoring via the multicast consolidated proxy monitor
NASA Astrophysics Data System (ADS)
Kanwar, Anshuman; Almeroth, Kevin C.; Bhattacharyya, Supratik; Davy, Matthew
2001-07-01
The debugging of problems in IP multicast networks relies heavily on an eclectic set of stand-alone tools. These tools traditionally neither provide a consistent interface nor generate readily interpretable results. We propose the "Multicast Consolidated Proxy Monitor" (MCPM), an integrated system for collecting, analyzing and presenting multicast monitoring results to both the end user and the network operator at the user's Internet Service Provider (ISP). The MCPM accesses network state information not normally visible to end users and acts as a proxy for disseminating this information. Functionally, through this architecture, we aim to a) provide a view of the multicast network at varying levels of granularity, b) provide end users with a limited ability to query the multicast infrastructure in real time, and c) protect the infrastructure from an overwhelming monitoring load through load control. Operationally, our scheme allows scaling to the ISP's dimensions, adaptability to new protocols (introduced as multicast evolves), threshold detection for crucial parameters, and an access-controlled, customizable interface design. Although the multicast scenario is used to illustrate the benefits of consolidated monitoring, the ultimate aim is to scale the scheme to unicast IP networks.
Baranowski, Jacek; Delshad, Baz; Ahn, Henrik
2017-01-01
After implantation of a continuous-flow left ventricular assist device (LVAD), left atrial pressure (LAP) monitoring allows for the precise management of intravascular volume, inotropic therapy, and pump speed. In this case series of 4 LVAD recipients, we report the first clinical use of this wireless pressure sensor for the long-term monitoring of LAP during LVAD support. A wireless microelectromechanical system pressure sensor (Titan, ISS Inc., Ypsilanti, MI) was placed in the left atrium in four patients at the time of LVAD implantation. Titan sensor LAP was measured in all four patients on the intensive care unit and in three patients at home. Ramped speed tests were performed using LAP and echocardiography in three patients. The left ventricular end-diastolic diameter (cm), flow (L/min), power consumption (W), and blood pressure (mm Hg) were measured at each step. Measurements were performed over 36, 84, 137, and 180 days, respectively. The three discharged patients had equipment at home and were able to perform daily recordings. There were significant correlations between sensor pressure and pump speed, LV and LA size and pulmonary capillary wedge pressure, respectively (r = 0.92–0.99, p < 0.05). There was no device failure, and there were no adverse consequences of its use. PMID:27676410
4-Pyridoxic Acid in the Spent Dialysate: Contribution to Fluorescence and Optical Monitoring
Kalle, Sigrid; Tanner, Risto; Arund, Jürgen; Tomson, Ruth; Luman, Merike; Fridolin, Ivo
2016-01-01
Aim: In this work we estimated the contribution of the fluorescence of 4-pyridoxic acid (4-PA) to the total fluorescence of spent dialysate with the aim of evaluating the on-line monitoring of removal of this vitamin B-6 metabolite from the blood of patients with end-stage renal disease (ESRD). Methods: Spectrofluorometric analysis of spent dialysate, collected from hemodialysis and hemodiafiltration sessions of 10 patients receiving regular pyridoxine injections after dialysis treatment, was performed in the range of Ex/Em 220–500 nm. 4-PA in dialysate samples was identified and quantified using HPLC with fluorescent and MS/MS detection. Results: The averaged HPLC chromatogram of spent dialysate had many peaks in the wavelength region of Ex320/Em430 nm, where 4-PA was the highest peak with a contribution of 42.2±17.0% at the beginning and 47.7±18.0% at the end of the dialysis. High correlation (R = 0.88–0.95) between 4-PA concentration and fluorescence intensity of spent dialysate was found in the region of Ex310-330/Em415-500 nm. Conclusion: 4-PA elimination from the blood of ESRD patients can potentially be followed by monitoring the fluorescence of the spent dialysate during dialysis treatments. PMID:27598005
A configurable electronics system for the ESS-Bilbao beam position monitors
NASA Astrophysics Data System (ADS)
Muguira, L.; Belver, D.; Etxebarria, V.; Varnasseri, S.; Arredondo, I.; del Campo, M.; Echevarria, P.; Garmendia, N.; Feuchtwanger, J.; Jugo, J.; Portilla, J.
2013-09-01
A versatile and configurable system has been developed to monitor the beam position and to meet all the requirements of the future ESS-Bilbao Linac. At the same time, the design has been conceived to be open and configurable so that it could eventually be used in different kinds of accelerators, independent of the charged particle, with minimal change. The design of the Beam Position Monitor (BPM) system includes a test bench for both button-type pick-ups (PU) and striplines (SL), the electronic units and the control system. The electronic units consist of two main parts. The first part is an Analog Front-End (AFE) unit where the RF signals are filtered, conditioned and converted to base-band. The second part is a Digital Front-End (DFE) unit based on an FPGA board, where the base-band signals are sampled in order to calculate the beam position, the amplitude and the phase. To manage the system, a Multipurpose Controller (MC) developed at ESSB has been used. It includes the FPGA management, the EPICS integration and Archiver Instances. The system is fully described, and the performance of the PU and SL BPM designs measured with this electronics system is compared and discussed.
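For orientation, the quantity a BPM digital front-end ultimately computes is usually the difference-over-sum of opposing electrode signals, scaled by a calibration constant, together with amplitude and phase from I/Q demodulation. The sketch below is a generic illustration under assumed sensitivity constants and an assumed intermediate frequency; it is not the ESS-Bilbao firmware.

```python
# Sketch of the standard difference-over-sum position estimate computed from the
# four electrode amplitudes, plus a simple I/Q demodulation for amplitude and
# phase. The sensitivity constants kx, ky and the frequencies are illustrative;
# in practice they come from the pick-up calibration on the test bench.
import numpy as np

def beam_position(a_right, a_left, a_top, a_bottom, kx=10.0, ky=10.0):
    """Horizontal/vertical beam offsets (mm) from the four electrode amplitudes."""
    x = kx * (a_right - a_left) / (a_right + a_left)
    y = ky * (a_top - a_bottom) / (a_top + a_bottom)
    return x, y

def beam_phase_and_amplitude(samples, fs, f_if):
    """I/Q demodulation of one digitized channel at an assumed IF frequency f_if."""
    t = np.arange(samples.size) / fs
    iq = samples * np.exp(-2j * np.pi * f_if * t)
    return np.angle(iq.mean()), 2 * np.abs(iq.mean())

# Example: a beam slightly right of and below the chamber center
print(beam_position(1.05, 0.95, 0.98, 1.02))         # ~(+0.5 mm, -0.2 mm)

# Example: recover amplitude 0.8 and phase 0.3 rad from a sampled tone
fs, f_if = 100e6, 10e6                                # assumed ADC rate and IF
n = np.arange(200)
tone = 0.8 * np.cos(2 * np.pi * f_if * n / fs + 0.3)
print(beam_phase_and_amplitude(tone, fs, f_if))       # ~(0.3 rad, 0.8)
```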
A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Michelle M.; Wu, Chase Q.
2013-11-07
Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. This system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.
Microtechnology in Space: NASA's Lab-on-a-Chip Applications Development Program
NASA Technical Reports Server (NTRS)
Monaco, Lisa; Spearing, Scott; Jenkins, Andy; Symonds, Wes; Mayer, Derek; Gouldie, Edd; Wainwright, Norm; Fries, Marc; Maule, Jake; Toporski, Jan
2004-01-01
NASA's Marshall Space Flight Center (MSFC) Lab-on-a-Chip Application Development (LOCAD) team has worked with microfluidic technology for the past few years in an effort to support NASA's mission. In that time, such microfluidic-based Lab-on-a-Chip (LOC) systems have become common technology in clinical and diagnostic laboratories. The approach is most attractive due to its highly miniaturized platform and ability to perform reagent handling (i.e., dilution, mixing, separation) and diagnostics for multiple reactions in an integrated fashion. LOCAD, along with Caliper Life Sciences, has successfully developed the first LOC device for macromolecular crystallization using a workstation acquired specifically for designing custom chips, the Caliper 42. LOCAD uses this, along with a novel MSFC-designed and -built workstation, for microfluidic development. The team has a cadre of LOC devices that can be used to perform initial feasibility testing to determine the efficacy of the LOC approach for a specific application. Once applicability has been established, the LOCAD team, along with the Army's Aviation and Missile Command microfabrication facility, can then begin to custom-design and fabricate a device per the user's specifications. This presentation highlights the LOCAD team's proven and unique expertise, which has been utilized to provide end-to-end capabilities associated with applying microfluidics to applications that include robotic life detection instrumentation, crew health monitoring, and microbial and environmental monitoring for human exploration.
Pikkemaat, Robert; Lundin, Stefan; Stenqvist, Ola; Hilgers, Ralf-Dieter; Leonhardt, Steffen
2014-07-01
Currently, the monitoring of cardiac output (CO) and stroke volume (SV) is mainly performed using invasive techniques. Therefore, performing CO monitoring noninvasively by means of electrical impedance tomography (EIT) would be advantageous for intensive care. Our hypothesis was that, by means of EIT, it is possible to assess heart rate (HR) and to quantify changes in SV due to changes in ventilator settings. CO (HR and SV) of 14 pigs (32-40 kg body weight) was changed by incremental increases in positive end-expiratory pressure levels (0, 5, 10, 15, and 20 cm·H2O; ramp maneuver). This ramp maneuver was applied 4 times in each animal, yielding 43 evaluable single experiments. At each positive end-expiratory pressure level, SV was assessed by transpulmonary thermodilution using a PiCCO device. EIT data were acquired using a Dräger EIT Evaluation Kit 2. The EIT-based SV-related signal, Z(SV) (in [AU]), showed only a weak correlation (after excluding 2 measurements) with SV(TTD) of r = 0.58 (95% confidence interval, 0.43-0.71). If Z(SV) is calibrated against the reference once for each experiment (defined as SV(EIT)), the correlation is approximately 0.85 (95% confidence interval, 0.78-0.90). A possible reason for the moderate correlation is the unexpected scaling pattern, leading to amplification of the cardiac impedance signal, found in some animals. The scaling is probably due to the imperfect reconstruction (i.e., a change of sensitivity) of the EIT images or to a change in the position of the heart. The hypothesis that EIT can be used to monitor CO and SV was confirmed, but further studies are required before this technique can be applied in clinical practice. HR was determined robustly and accurately. For SV monitoring, promising results were obtained in 80% of the experiments. However, unexpected scaling of the cardiac EIT signal causing inaccurate estimation of SV remains an issue. Before robust assessment of SV by EIT is suitable for clinical practice, the cause of and compensation for undesired scaling effects need to be investigated.
Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping
2014-01-01
EEG-based Brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time prediction of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implement a pilot system by employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line EEG-BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capability to our system. PMID:24917804
Molecular Profiles for Lung Cancer Pathogenesis and Detection in US Veterans
2012-10-01
will be further strengthened via Multiple Reaction Monitoring (MRM) performed on the remaining samples by the Vanderbilt group. MRM using mass...proteomics detects all protein changes in the sample in an unfocused fashion, MRM is targeted and highly selective, allowing us to specifically look for...proteins of interest. To this end, we have generated a list of candidate proteins for MRM utilizing shotgun proteomic, mRNA array, and miRNA array
Bright, Molly G; Murphy, Kevin
2013-12-01
Cerebrovascular reactivity (CVR) can be mapped using BOLD fMRI to provide a clinical insight into vascular health that can be used to diagnose cerebrovascular disease. Breath-holds are a readily accessible method for producing the required arterial CO2 increases but their implementation into clinical studies is limited by concerns that patients will demonstrate highly variable performance of breath-hold challenges. This study assesses the repeatability of CVR measurements despite poor task performance, to determine if and how robust results could be achieved with breath-holds in patients. Twelve healthy volunteers were scanned at 3 T. Six functional scans were acquired, each consisting of 6 breath-hold challenges (10, 15, or 20 s duration) interleaved with periods of paced breathing. These scans simulated the varying breath-hold consistency and ability levels that may occur in patient data. Uniform ramps, time-scaled ramps, and end-tidal CO2 data were used as regressors in a general linear model in order to measure CVR at the grey matter, regional, and voxelwise level. The intraclass correlation coefficient (ICC) quantified the repeatability of the CVR measurement for each breath-hold regressor type and scale of interest across the variable task performances. The ramp regressors did not fully account for variability in breath-hold performance and did not achieve acceptable repeatability (ICC<0.4) in several regions analysed. In contrast, the end-tidal CO2 regressors resulted in "excellent" repeatability (ICC=0.82) in the average grey matter data, and resulted in acceptable repeatability in all smaller regions tested (ICC>0.4). Further analysis of intra-subject CVR variability across the brain (ICCspatial and voxelwise correlation) supported the use of end-tidal CO2 data to extract robust whole-brain CVR maps, despite variability in breath-hold performance. We conclude that the incorporation of end-tidal CO2 monitoring into scanning enables robust, repeatable measurement of CVR that makes breath-hold challenges suitable for routine clinical practice. © 2013.
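To ground the regressor comparison, the sketch below shows the voxel-level step the end-tidal CO2 trace feeds into: a two-column GLM whose CO2 coefficient, normalized by the mean signal, gives CVR in percent signal change per mmHg. The TR, bulk delay, and simulated breath-hold blocks are assumptions for illustration, not the study's acquisition or analysis settings.

```python
# Sketch of a voxelwise CVR estimate: regress the BOLD time series on a delayed
# end-tidal CO2 (PETCO2) trace plus a constant, and report the CO2 coefficient
# as percent signal change per mmHg. The TR, delay, and simulated data are
# assumptions, not the study's parameters.
import numpy as np

def cvr_percent_per_mmhg(bold, petco2, tr=2.0, delay_s=8.0):
    """CVR from a two-column GLM: [shifted PETCO2, constant]."""
    shift = int(round(delay_s / tr))
    co2 = np.roll(petco2, shift)                  # crude bulk-delay correction
    X = np.column_stack([co2, np.ones_like(co2)])
    beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
    return 100.0 * beta[0] / bold.mean()          # % signal change per mmHg

# Synthetic voxel: 3 breath-hold blocks raise PETCO2 by ~8 mmHg; BOLD responds at
# 0.3 %/mmHg with a 4-volume lag and additive noise.
rng = np.random.default_rng(0)
n = 180
petco2 = np.full(n, 40.0)
for start in (30, 90, 150):
    petco2[start:start + 10] += 8.0
bold = 1000.0 * (1 + 0.003 * (np.roll(petco2, 4) - 40.0)) + rng.normal(0, 2.0, n)

print(f"estimated CVR: {cvr_percent_per_mmhg(bold, petco2):.2f} %/mmHg")
```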
International Space Station Major Constituent Analyzer On-Orbit Performance
NASA Technical Reports Server (NTRS)
Gardner, Ben D.; Erwin, Phillip M.; Thoresen, Souzan; Wiedemann, Rachel; Matty, Chris
2015-01-01
The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. Improvements to ion pump operation and ion source tuning have improved lifetime performance of the current ORU 02 design. The most recent ORU 02 analyzer assemblies, as well as ORU 08, have operated nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance and logistical support. Monitoring several key parameters provides the capacity to monitor ORU health and properly anticipate end of life.
International Space Station Major Constituent Analyzer On-Orbit Performance
NASA Technical Reports Server (NTRS)
Gardner, Ben D.; Erwin, Phillip M.; Thoresen, Souzan; Granahan, John; Matty, Chris
2012-01-01
The Major Constituent Analyzer is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic changeout, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. Over the past two years, two ORU 02 analyzer assemblies have operated nominally while two others have experienced premature on-orbit failures. These failures as well as nominal performances demonstrate that ORU 02 performance remains a key determinant of MCA performance and logistical support. It can be shown that monitoring several key parameters can maximize the capacity to monitor ORU health and properly anticipate end of life. Improvements to ion pump operation and ion source tuning are expected to improve lifetime performance of the current ORU 02 design.
Haire, Julia Christine Lydia; Ferguson, Sally Anne; Tilleard, James D; Negus, Paul; Dorrian, Jillian; Thomas, Matthew Jw
2012-06-01
To evaluate the effect of working consecutive night shifts on sleep time, prior wakefulness, perceived levels of fatigue and psychomotor performance in a group of Australian emergency registrars. A prospective observational study with a repeated within-subjects component was conducted. Sleep time was determined using sleep diaries and activity monitors. Subjective fatigue levels and reciprocal reaction times were evaluated before and after day and night shifts. A total of 11 registrars participated in the study with 120 shifts analysed. Sleep time was found to be similar during consecutive night and day shifts. The mean number of hours spent awake before the end of a night shift was 14.33. Subjective fatigue scores were worst at the end of a night shift. There was no difference in reciprocal reaction time between the end of night shift and the start of day shift. Registrars sleep a similar amount of time surrounding night and day shifts. Despite reporting the highest levels of fatigue at the end of a night shift, there is no significant difference in reaction times at the end of night shift compared with the beginning of day shift. This correlates with the finding that at the end of night shift the registrars have been awake for less than 16 h, which is the point at which psychomotor performance is expected to decline. © 2012 The Authors. EMA © 2012 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
Ten Haaf, Twan; van Staveren, Selma; Iannetta, Danilo; Roelands, Bart; Meeusen, Romain; Piacentini, Maria F; Foster, Carl; Koenderman, Leo; Daanen, Hein A M; de Koning, Jos J
2018-04-01
Reaction time has been proposed as a training monitoring tool, but to date, results are equivocal. Therefore, it was investigated whether reaction time can be used as a monitoring tool to establish overreaching. The study included 30 subjects (11 females and 19 males, age: 40.8 [10.8] years, VO2max: 51.8 [6.3] mL/kg/min) who participated in an 8-day cycling event. The external exercise load increased approximately 900% compared with the preparation period. Performance was measured before and after the event using a maximal incremental cycling test. Subjects with decreased performance after the event were classified as functionally overreached (FOR) and others as acutely fatigued (AF). A choice reaction time test was performed 2 weeks before (pre), 1 week after (post), and 5 weeks after (follow-up), as well as at the start and end of the event. A total of 14 subjects were classified as AF and 14 as FOR (2 subjects were excluded). During the event, reaction time at the end was 68 ms (95% confidence interval, 46-89) faster than at the start. Reaction time post event was 41 ms (95% confidence interval, 12-71) faster than pre event and follow-up was 55 ms faster (95% confidence interval, 26-83). The time by class interaction was not significant during (P = .26) and after (P = .43) the event. Correlations between physical performance and reaction time were not significant (all Ps > .30). No differences in choice reaction time between AF and FOR subjects were observed. It is suggested that choice reaction time is not valid for early detection of overreaching in the field.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farnham, Irene
Corrective Action Unit (CAU) 98: Frenchman Flat on the Nevada National Security Site was the location of 10 underground nuclear tests. CAU 98 underwent a series of investigations and actions in accordance with the Federal Facility Agreement and Consent Order to assess contamination of groundwater by radionuclides from the tests. A Closure Report completed that process in 2016 and called for long-term monitoring, use restrictions (URs), and institutional controls to protect the public and environment from potential exposure to contaminated groundwater. Three types of monitoring are performed for CAU 98: water quality, water level, and institutional control. These are evaluated to determine whether the UR boundaries remain protective of human health and the environment, and to ensure that the regulatory boundary objectives are being met. Additionally, monitoring data are used to evaluate consistency with the groundwater flow and contaminant transport models because the contaminant boundaries (CBs) calculated with the models are the primary basis of the UR boundaries. In summary, the monitoring results from 2016 indicate the regulatory controls on the closure of CAU 98 remain effective in protection of human health and the environment. Recommendations resulting from this first year of monitoring activities include formally incorporating wells UE-5 PW-1, UE-5 PW-2, and UE-5 PW-3 into the groundwater-level monitoring network given their strategic location in the basin; and early development of a basis for trigger levels for the groundwater-level monitoring given the observed trends. Additionally, it is recommended to improve the Real Estate/Operations Permit process for capturing information important for evaluating the impact of activities on groundwater resources, and to shift the reporting requirement for this annual report from the second quarter of the federal fiscal year (end of March) to the second quarter of the calendar year (end of June).
Electronic nicotine delivery systems: adult use and awareness of the 'e-cigarette' in the USA.
Regan, Annette K; Promoff, Gabbi; Dube, Shanta R; Arrazola, Rene
2013-01-01
Electronic nicotine delivery systems (ENDS), also referred to as electronic cigarettes or e-cigarettes, were introduced into the US market in 2007. Despite concerns regarding the long-term health impact of this product, there is little known about awareness and use of ENDS among adults in the USA. A consumer-based mail-in survey (ConsumerStyles) was completed by 10,587 adults (≥ 18 years) in 2009 and 10,328 adults in 2010. Data from these surveys were used to monitor awareness, ever use and past month use of ENDS from 2009 to 2010 and to assess demographic characteristics and tobacco use of ENDS users. In this US sample, awareness of ENDS doubled from 16.4% in 2009 to 32.2% in 2010 and ever use more than quadrupled from 2009 (0.6%) to 2010 (2.7%). Ever use of ENDS was most common among women and those with lower education, although these were not the groups who had heard of ENDS most often. Current smokers and tobacco users were most likely to try ENDS. However, current smokers who had tried ENDS did not say they planned to quit smoking more often than smokers who had never tried them. Given the large increase in awareness and ever use of ENDS during this 1-year period and the unknown impact of ENDS use on cigarette smoking behaviours and long-term health, continued monitoring of these products is needed.
Single Common Powertrain Lubricant (SCPL) Development. Part 3
2015-02-01
completed to assess the condition of the piston skirts, ring faces, and cylinder liners. This provides quasi-real time monitoring of the oil's performance in...was most likely attributed to further scuffing of cylinder 2R. During the teardown and ratings at the end of testing, liner 2R was found to be 90...monitored and compared. The evaluated parameters of the test included the piston ring wear, cylinder liner wear, lead bearing corrosion, along with lubricant
Blood pressure self-monitoring in pregnancy: examining feasibility in a prospective cohort study.
Tucker, Katherine L; Taylor, Kathryn S; Crawford, Carole; Hodgkinson, James A; Bankhead, Clare; Carver, Tricia; Ewers, Elizabeth; Glogowska, Margaret; Greenfield, Sheila M; Ingram, Lucy; Hinton, Lisa; Khan, Khalid S; Locock, Louise; Mackillop, Lucy; McCourt, Christine; Pirie, Alexander M; Stevens, Richard; McManus, Richard J
2017-12-28
Raised blood pressure (BP) affects approximately 10% of pregnancies worldwide, and a high proportion of affected women develop pre-eclampsia. This study aimed to evaluate the feasibility of self-monitoring of BP in pregnancy in women at higher risk of pre-eclampsia. This prospective cohort study of self-monitoring BP in pregnancy was carried out in two hospital trusts in Birmingham and Oxford and thirteen primary care practices in Oxfordshire. Eligible women were those defined by the UK National Institute for Health and Care Excellence (NICE) guidelines as at higher risk of pre-eclampsia. A total of 201 participants were recruited between 12 and 16 weeks of pregnancy and were asked to take two BP readings twice daily three times a week through their pregnancy. Primary outcomes were recruitment, retention and persistence of self-monitoring. Study recruitment and retention were analysed with descriptive statistics. Survival analysis was used to evaluate the persistence of self-monitoring and the performance of self-monitoring in the early detection of gestational hypertension, compared to clinic BP monitoring. Secondary outcomes were the mean clinic and self-monitored BP readings and the performance of self-monitoring in the detection of gestational hypertension and pre-eclampsia compared to clinic BP. Of 201 women recruited, 161 (80%) remained in the study at 36 weeks or to the end of their pregnancy, 162 (81%) provided any home readings suitable for analysis, 148 (74%) continued to self-monitor at 20 weeks and 107 (66%) at 36 weeks. Self-monitored readings were similar in value to contemporaneous matched clinic readings for both systolic and diastolic BP. Of the 23 who developed gestational hypertension or pre-eclampsia and self-monitored, 9 (39%) had a raised home BP prior to a raised clinic BP. Self-monitoring of BP in pregnancy is feasible and has potential to be useful in the early detection of gestational hypertensive disorders but maintaining self-monitoring throughout pregnancy requires support and probably enhanced training.
Nanoimprinting on optical fiber end faces for chemical sensing
NASA Astrophysics Data System (ADS)
Kostovski, G.; White, D. J.; Mitchell, A.; Austin, M. W.; Stoddart, P. R.
2008-04-01
Optical fiber surface-enhanced Raman scattering (SERS) sensors offer a potential solution to monitoring low chemical concentrations in-situ or in remote sensing scenarios. We demonstrate the use of nanoimprint lithography to fabricate SERS-compatible nanoarrays on the end faces of standard silica optical fibers. The antireflective nanostructure found on cicada wings was used as a convenient template for the nanoarray, as high sensitivity SERS substrates have previously been demonstrated on these surfaces. Coating the high fidelity replicas with silver creates a dense array of regular nanoscale plasmonic resonators. A monolayer of thiophenol was used as a low concentration analyte, from which strong Raman spectra were collected using both direct endface illumination and through-fiber interrogation. This unique combination of nanoscale replication with optical fibers demonstrates a high-resolution, low-cost approach to fabricating high-performance optical fiber chemical sensors.
Structural Performance of Aluminum and Stainless Steel Pyramidal Truss Core Sandwich Panels
2009-07-01
U.S. Army Research Laboratory, Aberdeen Proving Ground, MD (report ARL-TR-4867). ...Instron-Satec 4-post hydraulic test frame, with a capacity of 1 million lb. The samples were sandwiched between hardened end plates to protect the
DESIGN OF MEDICAL RADIOMETER FRONT-END FOR IMPROVED PERFORMANCE
Klemetsen, Ø.; Birkelund, Y.; Jacobsen, S. K.; Maccarini, P. F.; Stauffer, P. R.
2011-01-01
We have investigated the possibility of building a single-band Dicke radiometer that is inexpensive, small-sized, stable, highly sensitive, and which consists of readily available microwave components. The selected frequency band is at 3.25–3.75 GHz, which provides a reasonable compromise between spatial resolution (antenna size) and sensing depth for radiometry applications in lossy tissue. Foreseen applications of the instrument are non-invasive temperature monitoring for breast cancer detection and temperature monitoring during heating. We have found off-the-shelf microwave components that are sufficiently small (< 5 mm × 5 mm) and which offer satisfactory overall sensitivity. Two different Dicke radiometers have been realized: one is a conventional design with the Dicke switch at the front-end to select either the antenna or noise reference channels for amplification. The second design places a matched pair of low noise amplifiers in front of the Dicke switch to reduce system noise figure. Numerical simulations were performed to test the design concepts before building prototype PCB front-end layouts of the radiometer. Both designs provide an overall power gain of approximately 50 dB over a 500 MHz bandwidth centered at 3.5 GHz. No stability problems were observed despite using triple-cascaded amplifier configurations to boost the thermal signals. The prototypes were tested for sensitivity after calibration in two different water baths. Experiments showed superior sensitivity (36% higher) when implementing the low noise amplifier before the Dicke switch (close to the antenna) compared to the other design with the Dicke switch in front. Radiometer performance was also tested in a multilayered phantom during alternating heating and radiometric reading. Empirical tests showed that for the configuration with the Dicke switch first, the switch had to be locked in the reference position during application of microwave heating to avoid damage to the active components (amplifiers and power meter). For the configuration with a low noise amplifier up front, damage would occur to the active components of the radiometer if used in the presence of the microwave heating antenna. Nevertheless, this design showed significantly improved sensitivity of measured temperatures and merits further investigation to determine methods of protecting the radiometer for amplifier-first front ends. PMID:21779411
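The sensitivity advantage of placing the low-noise amplifier ahead of the Dicke switch follows from the Friis cascade formula and the ideal Dicke radiometer equation. The sketch below works through that arithmetic with assumed component noise figures, gains, and a ~290 K scene temperature; these values are illustrative, not the prototypes' measured figures.

```python
import math

def friis_noise_figure(stages):
    """Cascade noise figure (dB) via the Friis formula; stages = [(NF_dB, gain_dB), ...]."""
    f_total, g_prod = 0.0, 1.0
    for i, (nf_db, g_db) in enumerate(stages):
        f_lin, g_lin = 10 ** (nf_db / 10), 10 ** (g_db / 10)
        f_total += f_lin if i == 0 else (f_lin - 1.0) / g_prod
        g_prod *= g_lin
    return 10 * math.log10(f_total)

def dicke_resolution(t_sys_k, bandwidth_hz, tau_s):
    """Ideal Dicke radiometer resolution: dT = 2 * T_sys / sqrt(B * tau)."""
    return 2.0 * t_sys_k / math.sqrt(bandwidth_hz * tau_s)

# Assumed component values (noise figure dB, gain dB), for illustration only.
switch = (0.8, -0.8)     # lossy Dicke switch
lna = (1.5, 20.0)        # low-noise amplifier

for label, chain in [("switch first", [switch, lna]), ("LNA first", [lna, switch])]:
    nf = friis_noise_figure(chain)
    t_sys = 290.0 * (10 ** (nf / 10) - 1.0) + 290.0   # receiver noise + ~290 K scene
    print(f"{label}: NF = {nf:.2f} dB, "
          f"dT = {dicke_resolution(t_sys, 500e6, 1.0) * 1e3:.0f} mK")
```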
Early Detection of Physical Activity for People With Type 1 Diabetes Mellitus.
Dasanayake, Isuru S; Bevier, Wendy C; Castorino, Kristin; Pinsker, Jordan E; Seborg, Dale E; Doyle, Francis J; Dassau, Eyal
2015-06-30
Early detection of exercise in individuals with type 1 diabetes mellitus (T1DM) may allow changes in therapy to prevent hypoglycemia. Currently there is limited experience with automated methods that detect the onset and end of exercise in this population. We sought to develop a novel method to quickly and reliably detect the onset and end of exercise in these individuals before significant changes in blood glucose (BG) occur. Sixteen adults with T1DM were studied as outpatients using a diary, accelerometer, heart rate monitor, and continuous glucose monitor for 2 days. These data were used to develop a principal component analysis based exercise detection method. Subjects also performed 60 and 30 minute exercise sessions at 30% and 50% predicted heart rate reserve (HRR), respectively. The detection method was applied to the exercise sessions to determine how quickly the detection of start and end of exercise occurred relative to change in BG. Mild 30% HRR and moderate 50% HRR exercise onset was identified in 6 ± 3 and 5 ± 2 (mean ± SD) minutes, while completion was detected in 3 ± 8 and 6 ± 5 minutes, respectively. BG change from start of exercise to detection time was 1 ± 6 and -1 ± 3 mg/dL, and, from the end of exercise to detection time was 6 ± 4 and -17 ± 13 mg/dL, respectively, for the 2 exercise sessions. False positive and negative ratios were 4 ± 2% and 21 ± 22%. The novel method for exercise detection identified the onset and end of exercise in approximately 5 minutes, with an average BG change of only -6 mg/dL. © 2015 Diabetes Technology Society.
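A minimal sketch of one way a PCA-based detector of this kind can work: fit principal components on resting-state sensor windows and flag exercise when the residual (squared prediction error) of a new window exceeds a baseline threshold. The feature set, window length, and threshold rule here are assumptions for illustration, not the exact model reported above.

```python
import numpy as np

def fit_pca(baseline, n_components=2):
    """Fit principal axes on resting-state feature windows (rows = windows)."""
    mu = baseline.mean(axis=0)
    sigma = baseline.std(axis=0) + 1e-12
    _, _, vt = np.linalg.svd((baseline - mu) / sigma, full_matrices=False)
    return mu, sigma, vt[:n_components]

def spe(windows, mu, sigma, axes):
    """Squared prediction error: residual variance outside the PCA subspace."""
    z = (windows - mu) / sigma
    recon = (z @ axes.T) @ axes
    return np.sum((z - recon) ** 2, axis=1)

# Features per 1-minute window: [activity counts, heart rate, CGM slope].
# The feature set and the 99th-percentile threshold are illustrative choices.
rng = np.random.default_rng(1)
rest = rng.normal([30, 70, 0.0], [10, 5, 0.3], size=(200, 3))
mu, sigma, axes = fit_pca(rest)
threshold = np.percentile(spe(rest, mu, sigma, axes), 99)

new_window = np.array([[320.0, 115.0, -1.2]])     # exercise-like window
print("exercise detected:", bool(spe(new_window, mu, sigma, axes)[0] > threshold))
```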
Feelings of helplessness increase ERN amplitudes in healthy individuals
Pfabigan, D.M.; Pintzinger, N.M.; Siedek, D.R.; Lamm, C.; Derntl, B.; Sailer, U.
2013-01-01
Experiencing feelings of helplessness has repeatedly been reported to contribute to depressive symptoms and negative affect. In turn, depression and negative affective states are associated, among others, with impairments in performance monitoring. Thus, the question arises whether performance monitoring is also affected by feelings of helplessness. To this end, after the induction of feelings of helplessness via an unsolvable reasoning task, 37 participants (20 females) performed a modified version of a Flanker task. Based on a previously validated questionnaire, 17 participants were classified as helpless and 20 as not-helpless. Behavioral measures revealed no differences between helpless and not-helpless individuals. However, we observed enhanced Error-Related Negativity (ERN) amplitude differences between erroneous and correct responses in the helpless compared to the not-helpless group. Furthermore, correlational analysis revealed that higher scores of helplessness were associated with increased ERN difference scores. No influence of feelings of helplessness on later stages of performance monitoring was observed as indicated by Error-Positivity (Pe) amplitude. The present study is the first to demonstrate that feelings of helplessness modulate the neuronal correlates of performance monitoring. Thus, even a short-lasting subjective state manipulation can lead to ERN amplitude variation, probably via modulation of mesencephalic dopamine activity. PMID:23267824
Periodization and physical performance in elite female soccer players.
Mara, Jocelyn K; Thompson, Kevin G; Pumpa, Kate L; Ball, Nick B
2015-07-01
To investigate the variation in training demands, physical performance, and player well-being across a women's soccer season. Seventeen elite female players wore GPS tracking devices during every training session (N=90) throughout 1 national-league season. Intermittent high-speed-running capacity and 5-, 15-, and 25-m-sprint testing were conducted at the beginning of preseason, end of preseason, midseason, and end of season. In addition, subjective well-being measures were self-reported daily by players over the course of the season. Time over 5 m was lowest at the end of preseason (mean 1.148 s, SE 0.017 s) but then progressively deteriorated to the end of the season (P<.001). Sprint performance over 15 m improved by 2.8% (P=.013) after preseason training, while 25-m-sprint performance peaked at midseason, with a 3.1% (P=.05) improvement from the start of preseason, before declining at the end of season (P=.023). Training demands varied between phases, with total distance and high-speed distance greatest during preseason before decreasing (P<.001) during the early- and late-season phases. Endurance capacity and well-being measures did not change across training phases. Monitoring training demands and subsequent physical performance in elite female soccer players allow coaches to ensure that training periodization goals are being met and related positive training adaptations are being elicited.
Maritime microwave radar and electro-optical data fusion for homeland security
NASA Astrophysics Data System (ADS)
Seastrand, Mark J.
2004-09-01
US Customs is responsible for monitoring all incoming air and maritime traffic, including the island of Puerto Rico as a US territory. Puerto Rico offers potentially obscure points of entry to drug smugglers. This environment sets forth a formula for an illegal drug trade based relatively near the continental US. The US Customs Caribbean Air and Marine Operations Center (CAMOC), located in Puntas Salinas, has the charter to monitor maritime and Air Traffic Control (ATC) radars. The CAMOC monitors ATC radars and advises the Air and Marine Branch of US Customs of suspicious air activity. In turn, the US Coast Guard and/or US Customs will launch air and sea assets as necessary. The addition of a coastal radar and camera system provides US Customs a maritime monitoring capability for the northwestern end of Puerto Rico (Figure 1). Command and control of the radar and camera is executed at the CAMOC, located 75 miles away. The Maritime Microwave Surveillance Radar performs search, primary target acquisition and target tracking, while the Midwave Infrared (MWIR) camera performs target identification. This wide area surveillance, using a combination of radar and MWIR camera, offers the CAMOC a cost- and manpower-effective approach to monitor, track and identify maritime targets.
Morawska, Lidia; Thai, Phong K; Liu, Xiaoting; Asumadu-Sakyi, Akwasi; Ayoko, Godwin; Bartonova, Alena; Bedini, Andrea; Chai, Fahe; Christensen, Bryce; Dunbabin, Matthew; Gao, Jian; Hagler, Gayle S W; Jayaratne, Rohan; Kumar, Prashant; Lau, Alexis K H; Louie, Peter K K; Mazaheri, Mandana; Ning, Zhi; Motta, Nunzio; Mullins, Ben; Rahman, Md Mahmudur; Ristovski, Zoran; Shafiei, Mahnaz; Tjondronegoro, Dian; Westerdahl, Dane; Williams, Ron
2018-07-01
Over the past decade, a range of sensor technologies became available on the market, enabling a revolutionary shift in air pollution monitoring and assessment. With their cost of up to three orders of magnitude lower than standard/reference instruments, many avenues for applications have opened up. In particular, broader participation in air quality discussion and utilisation of information on air pollution by communities has become possible. However, many questions have been also asked about the actual benefits of these technologies. To address this issue, we conducted a comprehensive literature search including both the scientific and grey literature. We focused upon two questions: (1) Are these technologies fit for the various purposes envisaged? and (2) How far have these technologies and their applications progressed to provide answers and solutions? Regarding the former, we concluded that there is no clear answer to the question, due to a lack of: sensor/monitor manufacturers' quantitative specifications of performance, consensus regarding recommended end-use and associated minimal performance targets of these technologies, and the ability of the prospective users to formulate the requirements for their applications, or conditions of the intended use. Numerous studies have assessed and reported sensor/monitor performance under a range of specific conditions, and in many cases the performance was concluded to be satisfactory. The specific use cases for sensors/monitors included outdoor in a stationary mode, outdoor in a mobile mode, indoor environments and personal monitoring. Under certain conditions of application, project goals, and monitoring environments, some sensors/monitors were fit for a specific purpose. Based on analysis of 17 large projects, which reached applied outcome stage, and typically conducted by consortia of organizations, we observed that a sizable fraction of them (~ 30%) were commercial and/or crowd-funded. This fact by itself signals a paradigm change in air quality monitoring, which previously had been primarily implemented by government organizations. An additional paradigm-shift indicator is the growing use of machine learning or other advanced data processing approaches to improve sensor/monitor agreement with reference monitors. There is still some way to go in enhancing application of the technologies for source apportionment, which is of particular necessity and urgency in developing countries. Also, there has been somewhat less progress in wide-scale monitoring of personal exposures. However, it can be argued that with a significant future expansion of monitoring networks, including indoor environments, there may be less need for wearable or portable sensors/monitors to assess personal exposure. Traditional personal monitoring would still be valuable where spatial variability of pollutants of interest is at a finer resolution than the monitoring network can resolve. Copyright © 2018 Elsevier Ltd. All rights reserved.
Data Intensive Computing on Amazon Web Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magana-Zook, S. A.
The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as "big data" tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster ("Cluster A") was set up as a collaboration between GMP and Livermore Computing (LC).
Detection and Processing Techniques of FECG Signal for Fetal Monitoring
2009-01-01
The fetal electrocardiogram (FECG) signal contains potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis is in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advanced methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms for FECG signal detection and analysis to provide efficient and effective ways of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper focuses on some of the hardware implementations using electrical signals for monitoring the fetal heart rate. This paper opens up a passage for researchers, physicians, and end users to advocate an excellent understanding of the FECG signal and its analysis procedures for fetal heart rate monitoring systems. PMID:19495912
Low energy physical activity recognition system on smartphones.
Soria Morillo, Luis Miguel; Gonzalez-Abril, Luis; Ortega Ramirez, Juan Antonio; de la Concepcion, Miguel Angel Alvarez
2015-03-03
An innovative approach to physical activity recognition based on the use of discrete variables obtained from accelerometer sensors is presented. The system first performs a discretization process for each variable, which allows efficient recognition of activities performed by users using as little energy as possible. To this end, an innovative discretization and classification technique is presented based on the χ2 distribution. Furthermore, the entire recognition process is executed on the smartphone, which determines not only the activity performed, but also the frequency at which it is carried out. These techniques and the new classification system presented reduce energy consumption caused by the activity monitoring system. The energy saved increases smartphone usage time to more than 27 h without recharging while maintaining accuracy.
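As an illustration of chi-square-driven discretization of accelerometer features, the sketch below implements a standard ChiMerge-style merge of adjacent intervals. The cited system's actual criterion and parameters are not reproduced here, so the significance level, feature, and toy data are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def chimerge(values, labels, alpha=0.05):
    """ChiMerge-style bottom-up discretization: merge adjacent intervals while
    the chi-square statistic of their label distributions stays below the
    critical value at significance `alpha`. Returns interval lower bounds."""
    values, labels = np.asarray(values), np.asarray(labels)
    classes = np.unique(labels)
    cuts = np.unique(values)                      # one interval per distinct value
    counts = np.array([[np.sum((values == v) & (labels == c)) for c in classes]
                       for v in cuts], dtype=float)
    crit = chi2.ppf(1.0 - alpha, df=len(classes) - 1)

    while len(counts) > 1:
        stats = []
        for i in range(len(counts) - 1):
            obs = counts[i:i + 2]                 # 2 x n_classes contingency table
            exp = obs.sum(axis=1, keepdims=True) * obs.sum(axis=0) / obs.sum()
            with np.errstate(divide="ignore", invalid="ignore"):
                stats.append(np.sum(np.where(exp > 0, (obs - exp) ** 2 / exp, 0.0)))
        i = int(np.argmin(stats))
        if stats[i] >= crit:
            break
        counts[i] += counts[i + 1]
        counts = np.delete(counts, i + 1, axis=0)
        cuts = np.delete(cuts, i + 1)
    return cuts

# Toy accelerometer-magnitude feature vs. activity class (0 = rest, 1 = walk).
rng = np.random.default_rng(2)
mag = np.round(np.concatenate([rng.normal(1.0, 0.1, 300), rng.normal(2.5, 0.3, 300)]), 1)
act = np.concatenate([np.zeros(300, int), np.ones(300, int)])
print("interval lower bounds:", chimerge(mag, act))
```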
Double Modification of Polymer End Groups through Thiolactone Chemistry.
Driessen, Frank; Martens, Steven; Meyer, Bernhard De; Du Prez, Filip E; Espeel, Pieter
2016-06-01
A straightforward synthetic procedure for the double modification and polymer-polymer conjugation of telechelic polymers is performed through amine-thiol-ene conjugation. Thiolactone end-functionalized polymers are prepared via two different methods, through controlled radical polymerization of a thiolactone-containing initiator, or by modification of available end-functionalized polymers. Next, these different linear polymers are treated with a variety of amine/acrylate-combinations in a one-pot procedure, creating a library of tailored end-functionalized polymers. End group conversions are monitored via SEC, NMR, and MALDI-TOF analysis, confirming the quantitative modification after each step. Finally, this strategy is applied for the synthesis of block copolymers via polymer-polymer conjugation and the successful outcome is analyzed via LCxSEC measurements. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Storage element performance optimization for CMS analysis jobs
NASA Astrophysics Data System (ADS)
Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.
2012-12-01
Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Having a good CPU efficiency for the end users' analysis jobs requires that the performance of the storage system is able to scale with I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on the work on improving the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework CMS uses the JobRobot, which sends 100 analysis jobs to each site every four hours. CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the performance due to site configuration changes, since the analysis workflow is kept the same for all sites and for months in time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done on other CMS Tier sites, since on average the CPU efficiency for CMSSW jobs has increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O performance is good.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lingerfelt, Eric J; Messer, II, Otis E
2017-01-02
The Bellerophon software system supports CHIMERA, a production-level HPC application that simulates the evolution of core-collapse supernovae. Bellerophon enables CHIMERA's geographically dispersed team of collaborators to perform job monitoring and real-time data analysis from multiple supercomputing resources, including platforms at OLCF, NERSC, and NICS. Its multi-tier architecture provides an encapsulated, end-to-end software solution that enables the CHIMERA team to quickly and easily access highly customizable animated and static views of results from anywhere in the world via a cross-platform desktop application.
Lymeus, Freddie; Lindberg, Per; Hartig, Terry
2018-03-01
Mindfulness courses conventionally use effortful, focused meditation to train attention. In contrast, natural settings can effortlessly support state mindfulness and restore depleted attention resources, which could facilitate meditation. We performed two studies that compared conventional training with restoration skills training (ReST) that taught low-effort open monitoring meditation in a garden over five weeks. Assessments before and after meditation on multiple occasions showed that ReST meditation increasingly enhanced attention performance. Conventional meditation enhanced attention initially but increasingly incurred effort, reflected in performance decrements toward the course end. With both courses, attentional improvements generalized in the first weeks of training. Against established accounts, the generalized improvements thus occurred before any effort was incurred by the conventional exercises. We propose that restoration rather than attention training can account for early attentional improvements with meditation. ReST holds promise as an undemanding introduction to mindfulness and as a method to enhance restoration in nature contacts. Copyright © 2018 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Montane, Angelica; Chesterfield, Ray
2005-01-01
This document summarizes the results obtained by the AprenDes project in 2004, the project's first year of implementation. It provides the principal findings on program performance from a baseline in May 2004 to the end of the school year (late October 2004). Progress on a number of project objectives related to decentralized school- and…
78 FR 59317 - Federal Acquisition Regulation; Ending Trafficking in Persons
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-26
..., emergency preparedness, safety equipment, sanitation and access to food and water. (g) A monitoring... supplies acquired, or the services required to be performed, outside the United States exceeds $500,000... involve services or supplies susceptible to trafficking. The compliance plan must be provided to the...
Monitoring of vapor phase polycyclic aromatic hydrocarbons
Vo-Dinh, Tuan; Hajaligol, Mohammad R.
2004-06-01
An apparatus for monitoring vapor phase polycyclic aromatic hydrocarbons in a high-temperature environment has an excitation source producing electromagnetic radiation, an optical path having an optical probe optically communicating the electromagnetic radiation received at a proximal end to a distal end, a spectrometer or polychromator, a detector, and a positioner coupled to the first optical path. The positioner can slidably move the distal end of the optical probe to maintain the distal end position with respect to an area of a material undergoing combustion. The emitted wavelength can be directed to a detector in a single optical probe 180° backscattered configuration, in a dual optical probe 180° backscattered configuration or in a dual optical probe 90° side scattered configuration. The apparatus can be used to monitor an emitted wavelength of energy from a polycyclic aromatic hydrocarbon as it fluoresces in a high temperature environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, T; Cho, M; Kang, S
Purpose: To improve the setup accuracy of the thermoplastic mask, we developed a new monitoring method based on force sensing technology and evaluated its feasibility. Methods: The thermoplastic mask setup monitoring system consists of a force sensing resistor (FSR) sensor unit, a signal transport device, a control PC and in-house software. The system is designed to monitor pressure variation between the mask and patient in real time. It also provides a warning to the user when there is a possibility of movement. A preliminary study was performed to evaluate the reliability of the sensor unit and developed monitoring system with a head phantom. Then, a simulation study with volunteers was conducted to evaluate the feasibility of the monitoring system. Note that the sensor unit can have multiple end-sensors and every end-sensor was confirmed to be within 2% reliability in pressure reading through a screening test. Results: To evaluate the reproducibility of the proposed monitoring system in practice, we simulated a mask setup with the head phantom. FSR sensors were attached on the face of the head phantom and pressure was monitored. For 3 repeated mask setups on the phantom, the variation of the pressure was less than 3% (only 1% larger than the 2% potential uncertainty confirmed in the screening test). In the volunteer study, we intended to verify that the system could detect patient movements within the mask. Thus, volunteers were asked to turn their head or lift their chin. The system was able to detect movements effectively, confirming the clinical feasibility of the monitoring system developed. Conclusion: Through the proposed setup monitoring method, it is possible to monitor patient motion inside a mask in real time, which has never been possible with most commonly used systems using non-radiographic technology such as infrared camera systems and surface imaging systems. This work was supported by the Radiation Technology R&D program (No. 2013M2A2A7043498) and the Mid-career Researcher Program (2014R1A2A1A10050270) through the National Research Foundation of Korea funded by the Ministry of Science, ICT&Future Planning.
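A minimal sketch of the kind of warning logic such a system could use: compare live force-sensing-resistor readings against the values captured at mask fitting and raise a flag when any channel drifts beyond a tolerance. The channel count, ADC scale, and 3% tolerance are illustrative assumptions, not the published system's parameters.

```python
import numpy as np

def movement_warning(baseline_counts, live_counts, tolerance=0.03):
    """Flag possible intra-mask movement when any force-sensing-resistor channel
    drifts more than `tolerance` (fractional change) from the values captured
    at mask fitting. Channel count, ADC scale, and tolerance are assumptions."""
    baseline = np.asarray(baseline_counts, dtype=float)
    live = np.asarray(live_counts, dtype=float)
    drift = np.abs(live - baseline) / np.maximum(baseline, 1e-9)
    return bool(np.any(drift > tolerance)), drift

baseline = [412.0, 398.0, 405.0, 420.0]     # four FSR channels at setup (ADC counts)
warn, drift = movement_warning(baseline, [410.0, 352.0, 407.0, 419.0])
print("movement warning:", warn, "| per-channel drift:", np.round(drift, 3))
```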
Kuzay, Tuncer M.; Shu, Deming
1995-01-01
A photon beam position monitor for use in the front end of a beamline of a high heat flux and high energy photon source such as a synchrotron radiation storage ring detects and measures the position and, when a pair of such monitors are used in tandem, the slope of a photon beam emanating from an insertion device such as a wiggler or an undulator inserted in the straight sections of the ring. The photon beam position monitor includes a plurality of spaced blades for precisely locating the photon beam, with each blade comprised of chemical vapor deposition (CVD) diamond with an outer metal coating of a photon sensitive metal such as tungsten, molybdenum, etc., which combination emits electrons when a high energy photon beam is incident upon the blade. Two such monitors are contemplated for use in the front end of the beamline, with the two monitors having vertically and horizontally offset detector blades to avoid blade "shadowing". Provision is made for aligning the detector blades with the photon beam and limiting detector blade temperature during operation.
Aquarius Whole Range Calibration: Celestial Sky, Ocean, and Land Targets
NASA Technical Reports Server (NTRS)
Dinnat, Emmanuel P.; Le Vine, David M.; Bindlish, Rajat; Piepmeier, Jeffrey R.; Brown, Shannon T.
2014-01-01
Aquarius is a spaceborne instrument that uses L-band radiometers to monitor sea surface salinity globally. Other applications of its data over land and the cryosphere are being developed. Combining its measurements with existing and upcoming L-band sensors will allow for long-term studies. For that purpose, the radiometers' calibration is critical. Aquarius measurements are currently calibrated over the oceans. They have been found too cold at the low end (celestial sky) of the brightness temperature scale, and too warm at the warm end (land and ice). We assess the impact of the antenna pattern model on the biases and propose a correction. We re-calibrate Aquarius measurements using the corrected antenna pattern and measurements over the sky and oceans. The performance of the new calibration is evaluated using measurements over well-instrumented land sites.
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; ÓLaighin, Gearóid
2017-01-01
Aims: The study aims were to evaluate the validity of two commercially available swimming activity monitors for quantifying temporal and kinematic swimming variables. Methods: Ten national level swimmers (5 male, 5 female; 15.3±1.3 years; 164.8±12.9 cm; 62.4±11.1 kg; 425±66 FINA points) completed a set protocol comprising 1,500 m of swimming involving all four competitive swimming strokes. Swimmers wore the Finis Swimsense and the Garmin Swim activity monitors throughout. The devices automatically identified stroke type, swim distance, lap time, stroke count, stroke rate, stroke length and average speed. Video recordings were also obtained and used as a criterion measure to evaluate performance. Results: A significant positive correlation was found between the monitors and video for the identification of each of the four swim strokes (Garmin: χ2(3) = 31.292, p<0.05; Finis: χ2(3) = 33.004, p<0.05). No significant differences were found for swim distance measurements. Swimming laps performed in the middle of a swimming interval showed no significant difference from the criterion (Garmin: bias -0.065, 95% confidence intervals -3.828 to 6.920; Finis: bias -0.02, 95% confidence intervals -3.095 to 3.142). However, laps performed at the beginning and end of an interval were not as accurately timed. Additionally, a statistical difference was found for stroke count measurements on all but two occasions (p<0.05). These differences affect the accuracy of stroke rate, stroke length and average speed scores reported by the monitors, as all of these are derived from lap times and stroke counts. Conclusions: Both monitors were found to operate with a relatively similar performance level and appear suited for recreational use. However, issues with feature detection accuracy may be related to individual variances in stroke technique. It is reasonable to expect that this level of error would increase when the devices are used by recreational swimmers rather than elite swimmers. Further development to improve accuracy of feature detection algorithms, specifically for lap time and stroke count, would also increase their suitability within competitive settings. PMID:28178301
Mooney, Robert; Quinlan, Leo R; Corley, Gavin; Godfrey, Alan; Osborough, Conor; ÓLaighin, Gearóid
2017-01-01
The study aims were to evaluate the validity of two commercially available swimming activity monitors for quantifying temporal and kinematic swimming variables. Ten national level swimmers (5 male, 5 female; 15.3±1.3 years; 164.8±12.9 cm; 62.4±11.1 kg; 425±66 FINA points) completed a set protocol comprising 1,500 m of swimming involving all four competitive swimming strokes. Swimmers wore the Finis Swimsense and the Garmin Swim activity monitors throughout. The devices automatically identified stroke type, swim distance, lap time, stroke count, stroke rate, stroke length and average speed. Video recordings were also obtained and used as a criterion measure to evaluate performance. A significant positive correlation was found between the monitors and video for the identification of each of the four swim strokes (Garmin: χ2(3) = 31.292, p<0.05; Finis: χ2(3) = 33.004, p<0.05). No significant differences were found for swim distance measurements. Swimming laps performed in the middle of a swimming interval showed no significant difference from the criterion (Garmin: bias -0.065, 95% confidence intervals -3.828 to 6.920; Finis: bias -0.02, 95% confidence intervals -3.095 to 3.142). However, laps performed at the beginning and end of an interval were not as accurately timed. Additionally, a statistical difference was found for stroke count measurements on all but two occasions (p<0.05). These differences affect the accuracy of stroke rate, stroke length and average speed scores reported by the monitors, as all of these are derived from lap times and stroke counts. Both monitors were found to operate with a relatively similar performance level and appear suited for recreational use. However, issues with feature detection accuracy may be related to individual variances in stroke technique. It is reasonable to expect that this level of error would increase when the devices are used by recreational swimmers rather than elite swimmers. Further development to improve accuracy of feature detection algorithms, specifically for lap time and stroke count, would also increase their suitability within competitive settings.
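The lap-time agreement figures quoted above (a bias with a 95% interval) are the kind of output a Bland-Altman style analysis produces. The sketch below shows that calculation on made-up device-versus-video lap times; the numbers are not from the study.

```python
import numpy as np

def bland_altman(device, criterion):
    """Mean bias and 95% limits of agreement between device and criterion lap times."""
    diff = np.asarray(device, float) - np.asarray(criterion, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Hypothetical mid-interval lap times in seconds (wearable vs. video criterion).
video = np.array([32.1, 33.4, 31.8, 34.0, 32.7, 33.1])
watch = np.array([32.0, 33.6, 31.7, 34.3, 32.5, 33.0])
bias, (lo, hi) = bland_altman(watch, video)
print(f"bias = {bias:+.2f} s, 95% limits of agreement = ({lo:+.2f}, {hi:+.2f}) s")
```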
Data analysis of photon beam position at PLS-II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ko, J.; Shin, S., E-mail: tlssh@postech.ac.kr; Huang, Jung-Yun
In third-generation light sources, photon beam position stability is a critical issue for user experiments. Photon beam position monitors are generally developed to detect the real photon beam position, and the position is controlled by a feedback system in order to hold the reference photon beam position. At PLS-II, photon beam position stability of less than 1 μm rms has been achieved during the user service period for the front end of a particular beamline in which a photon beam position monitor is installed. Nevertheless, detailed analysis of the photon beam position data is necessary to demonstrate the performance of the photon beam position monitor, since the data can suffer from various unknown noise sources (for instance, background contamination due to upstream or downstream dipole radiation, undulator gap dependence, etc.). In this paper, we describe the start-to-end study of photon beam position stability and the Singular Value Decomposition (SVD) analysis used to demonstrate the reliability of the photon beam position data.
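A sketch of the kind of SVD (model-independent) analysis referred to above: beam-position samples are arranged as a time-by-channel matrix and decomposed so that correlated orbit drift separates from uncorrelated channel noise. The channel layout, sampling rate, and drift component below are synthetic assumptions, not PLS-II data.

```python
import numpy as np

# Beam-position samples arranged as a time-by-channel matrix; the channels and
# the slow drift term below are synthetic stand-ins for real measurements.
rng = np.random.default_rng(3)
t = np.arange(0, 600, 0.1)                        # 0.1 s sampling, 10 minutes
drift = 0.5 * np.sin(2 * np.pi * t / 300.0)       # correlated slow drift (um)
data = np.column_stack([
    drift + rng.normal(0, 0.05, t.size),          # photon BPM, X blade pair
    0.8 * drift + rng.normal(0, 0.05, t.size),    # photon BPM, Y blade pair
    rng.normal(0, 0.05, t.size),                  # uncorrelated channel (noise only)
])

centered = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
mode_fraction = s ** 2 / np.sum(s ** 2)
print("variance captured per spatial mode:", np.round(mode_fraction, 3))
# One dominant mode shared across channels points to common orbit or thermal
# drift; variance spread evenly across modes points to channel noise or
# background contamination rather than real beam motion.
```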
Belliato, Mirko; Degani, Antonella; Buffa, Antonino; Sciutti, Fabio; Pagani, Michele; Pellegrini, Carlo; Iotti, Giorgio Antonio
2017-10-01
Monitoring veno-venous extracorporeal membrane oxygenation (vvECMO) during 76 days of continuous support in a 42-year-old patient with end-stage pulmonary disease, listed for double-lung transplantation. Applying a new monitor (Landing®, Eurosets, Medolla, Italy) and describing how measured and calculated parameters can be used to understand the variable interdependency between the artificial membrane lung (ML) and the patient's native lung (NL). During vvECMO, in order to understand how the respiratory function is shared between ML and NL, ideally we should obtain data about oxygen transfer and CO2 removal, both by ML and NL. Measurements for NL can be made on the mechanical ventilator. Measurements for ML are typically made from gas analysis on blood samples drawn from the ECMO system before and after the oxygenator, and therefore are non-continuous. In contrast, the Landing monitor provides a continuous measurement of the oxygen transfer from the ML, combined with hemoglobin level, saturation of drained blood and saturation of reinfused blood. Moreover, the Landing monitor provides hemodynamic data about circulation through the ECMO system, with blood flow, pre-oxygenator pressure and post-oxygenator pressure. Of note, measurements include the drain negative pressure, whose monitoring may be particularly useful to prevent hemolysis. Real-time monitoring of vvECMO provides data helpful for understanding the complex picture of a patient with severely damaged lungs on one side and an artificial lung on the other side. Data from vvECMO monitoring may help to adapt the settings of both the mechanical ventilator and the vvECMO. Data about oxygen transfer by the oxygenator are important to evaluate the performance of the device and may help to avoid unnecessary replacements, thus reducing risks and costs.
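The membrane-lung oxygen transfer such a monitor reports can be reproduced from blood flow, hemoglobin, and the pre/post-oxygenator saturations with the standard oxygen-content relation. The sketch below uses that textbook calculation with assumed example values; it is not the Landing monitor's internal algorithm.

```python
def o2_content(hb_g_dl, sat_fraction, po2_mmhg=0.0):
    """Blood oxygen content in mL O2/dL: 1.34 mL O2 per g of saturated hemoglobin
    plus dissolved O2 at 0.003 mL/dL per mmHg."""
    return 1.34 * hb_g_dl * sat_fraction + 0.003 * po2_mmhg

def membrane_lung_vo2(flow_l_min, hb_g_dl, sat_pre, sat_post,
                      po2_pre_mmhg=40.0, po2_post_mmhg=300.0):
    """Oxygen transferred by the membrane lung (mL/min): blood flow times the
    pre/post-oxygenator content difference (1 L = 10 dL)."""
    delta = (o2_content(hb_g_dl, sat_post, po2_post_mmhg)
             - o2_content(hb_g_dl, sat_pre, po2_pre_mmhg))
    return 10.0 * flow_l_min * delta

# Example values (4 L/min flow, Hb 10 g/dL, 70% drained / 100% reinfused
# saturation) are illustrative, not taken from the reported case.
print(f"membrane lung VO2 ~ {membrane_lung_vo2(4.0, 10.0, 0.70, 1.00):.0f} mL/min")
```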
Development and Comparison of Technical Solutions for Electricity Monitoring Equipment
NASA Astrophysics Data System (ADS)
Potapovs, A.; Obushevs, A.
2017-12-01
The paper focuses on the elaboration of a demand-side management platform for optimal energy management strategies, specifically the description and comparison of the electricity monitoring and control equipment developed for it. The article describes two versions, based on Atmega328 and STM32 microcontrollers, that differ in measurement precision and other performance parameters. At the end of the article, test results for the two types of equipment are given and compared.
NASA Astrophysics Data System (ADS)
Gulde, S. T.; Kolm, M. G.; Smith, D. J.; Maurer, R.; Bazalgette Courrèges-Lacoste, G.; Sallusti, M.; Bagnasco, G.
2017-11-01
SENTINEL 4 is an imaging UVN (UV-VIS-NIR) spectrometer, developed by Airbus Defence and Space under ESA contract in the frame of the joint European Union (EU)/ESA COPERNICUS program. The mission objective is the operational monitoring of trace gas concentrations for atmospheric chemistry and climate applications. To this end SENTINEL 4 will provide accurate measurements of key atmospheric constituents such as ozone, nitrogen dioxide, sulfur dioxide, formaldehyde, as well as aerosol and cloud properties.
Fiber-optic epoxy composite cure sensor. II. Performance characteristics
NASA Astrophysics Data System (ADS)
Lam, Kai-Yuen; Afromowitz, Martin A.
1995-09-01
The performance of a fiber-optic epoxy composite cure sensor, as previously proposed, depends on the optical properties and the reaction kinetics of the epoxy. The reaction kinetics of a typical epoxy system are presented. It is a third-order autocatalytic reaction with a peak observed in each isothermal reaction-rate curve. A model is derived to describe the performance characteristics of the epoxy cure sensor. If a composite coupon is cured at an isothermal temperature, the sensor signal can be used to predict the time when the gel point occurs and to monitor the cure process. The sensor is also shown to perform well in nonstoichiometric epoxy matrices. In addition the sensor can detect the end of the cure without calibration.
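To make the isothermal rate peak mentioned above concrete, the sketch below integrates a Kamal-type autocatalytic cure model with m = 1 and n = 2 (third order overall). The rate constants and cure duration are assumed values chosen only to show the peaked rate curve, not parameters fitted to the epoxy system studied.

```python
import numpy as np
from scipy.integrate import solve_ivp

def cure_rate(t, alpha, k1, k2, m, n):
    """Kamal-type autocatalytic cure kinetics:
    d(alpha)/dt = (k1 + k2 * alpha**m) * (1 - alpha)**n.
    With m = 1 and n = 2 the overall order is three and the isothermal rate
    curve shows a peak; k1 and k2 below are assumed, not fitted, values."""
    a = np.clip(alpha, 0.0, 1.0)
    return (k1 + k2 * a ** m) * (1.0 - a) ** n

k1, k2, m, n = 2e-5, 3e-3, 1, 2                    # assumed isothermal rate constants (1/s)
t_end = 7200.0                                     # 2 h isothermal cure
sol = solve_ivp(cure_rate, (0.0, t_end), [1e-4], args=(k1, k2, m, n),
                t_eval=np.linspace(0.0, t_end, 500))
alpha = sol.y[0]
rate = cure_rate(sol.t, alpha, k1, k2, m, n)
print(f"peak reaction rate at t = {sol.t[np.argmax(rate)]:.0f} s, "
      f"final degree of cure = {alpha[-1]:.2f}")
```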
New mainstream double-end carbon dioxide capnograph for human respiration
NASA Astrophysics Data System (ADS)
Yang, Jiachen; An, Kun; Wang, Bin; Wang, Lei
2010-11-01
Most current respiratory devices for monitoring CO2 concentration use a side-stream structure. In this work, we design a new double-end mainstream device for monitoring the CO2 concentration of gas breathed out of the human body. The device can accurately monitor cardiopulmonary status during anesthesia and mechanical ventilation in real time. Meanwhile, to decrease the negative influence of device noise and the low sampling precision caused by temperature drift, wavelet packet denoising and temperature drift compensation are used. Clinical trials show the new capnograph to be helpful in improving the accuracy of capnography.
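A minimal sketch of wavelet-packet denoising as it might be applied to a noisy capnogram, using PyWavelets: decompose, soft-threshold the terminal nodes with a universal threshold, and reconstruct. The wavelet, decomposition level, threshold rule, and synthetic waveform are assumptions, not the device's implementation.

```python
import numpy as np
import pywt

def wp_denoise(signal, wavelet="db4", level=4):
    """Wavelet-packet denoising sketch: decompose, soft-threshold every terminal
    node with a universal threshold (MAD noise estimate), then reconstruct."""
    wp = pywt.WaveletPacket(signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="freq")
    sigma = np.median(np.abs(nodes[-1].data)) / 0.6745   # noise from highest band
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    for node in nodes:
        node.data = pywt.threshold(node.data, thr, mode="soft")
    return wp.reconstruct(update=True)[: len(signal)]

# Synthetic square-ish capnogram (mmHg) plus sensor noise, for illustration only.
t = np.linspace(0.0, 10.0, 2000)                   # ~10 s of breathing
co2 = 38.0 * (np.sin(2 * np.pi * 0.25 * t) > 0)
noisy = co2 + np.random.default_rng(4).normal(0.0, 2.0, t.size)
denoised = wp_denoise(noisy)
print("residual noise std:", np.round(np.std(denoised - co2), 2))
```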
Forensic analysis of online marketing for electronic nicotine delivery systems.
Cobb, Nathan K; Brookover, Jody; Cobb, Caroline O
2015-03-01
Electronic nicotine delivery systems (ENDS) are growing in awareness and use in the USA. They are currently unregulated as the Food and Drug Administration has yet to assert jurisdiction under its tobacco authority over these products, and a US Court of Appeals held they cannot be regulated as drugs/delivery devices if they are not marketed for a therapeutic purpose. Observation of the current online marketplace suggests ENDS, like some nutraceutical products, are being promoted using affiliate marketing techniques using claims concerning purported health benefits. This study performed a forensic analysis to characterise the relationships between online ENDS affiliate advertisements and ENDS sellers, and evaluated descriptive content on advertisements and websites to inform future policy and regulatory efforts. A purposive sampling strategy was used to identify three forms of ENDS advertising. Web proxy software recorded identifiable objects and their ties to each other. Network analysis of these ties followed, as well as analysis of descriptive content on advertisements and websites identified. The forensic analysis included four ENDS advertisements, two linked affiliate websites, and two linked seller websites, and demonstrated a multilevel relationship between advertisements and sellers with multiple layers of redirection. Descriptive analysis indicated that advertisements and affiliates, but not linked sellers, included smoking cessation claims. Results suggest that ENDS sellers may be trying to distance marketing efforts containing unsubstantiated claims from sales. A separate descriptive analysis of 20 ENDS seller web pages indicated that the use of affiliate marketing by sellers may be widespread. These findings support increased monitoring and regulation of ENDS marketing to prevent deceptive marketing tactics and ensure consumer safety. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Chen, Ting; Zhang, Miao; Jabbour, Salma; Wang, Hesheng; Barbee, David; Das, Indra J; Yue, Ning
2018-04-10
Through-plane motion introduces uncertainty in three-dimensional (3D) motion monitoring when using single-slice on-board imaging (OBI) modalities such as cine MRI. We propose a principal component analysis (PCA)-based framework to determine the optimal imaging plane to minimize the through-plane motion for single-slice imaging-based motion monitoring. Four-dimensional computed tomography (4DCT) images of eight thoracic cancer patients were retrospectively analyzed. The target volumes were manually delineated at different respiratory phases of 4DCT. We performed automated image registration to establish the 4D respiratory target motion trajectories for all patients. PCA was conducted using the motion information to define the three principal components of the respiratory motion trajectories. Two imaging planes were determined perpendicular to the second and third principal component, respectively, to avoid imaging with the primary principal component of the through-plane motion. Single-slice images were reconstructed from 4DCT in the PCA-derived orthogonal imaging planes and were compared against the traditional AP/Lateral image pairs on through-plane motion, residual error in motion monitoring, absolute motion amplitude error and the similarity between target segmentations at different phases. We evaluated the significance of the proposed motion monitoring improvement using paired t test analysis. The PCA-determined imaging planes had overall less through-plane motion compared against the AP/Lateral image pairs. For all patients, the average through-plane motion was 3.6 mm (range: 1.6-5.6 mm) for the AP view and 1.7 mm (range: 0.6-2.7 mm) for the Lateral view. With PCA optimization, the average through-plane motion was 2.5 mm (range: 1.3-3.9 mm) and 0.6 mm (range: 0.2-1.5 mm) for the two imaging planes, respectively. The absolute residual error of the reconstructed max-exhale-to-inhale motion averaged 0.7 mm (range: 0.4-1.3 mm, 95% CI: 0.4-1.1 mm) using optimized imaging planes, averaged 0.5 mm (range: 0.3-1.0 mm, 95% CI: 0.2-0.8 mm) using an imaging plane perpendicular to the minimal motion component only and averaged 1.3 mm (range: 0.4-2.8 mm, 95% CI: 0.4-2.3 mm) in AP/Lateral orthogonal image pairs. The root-mean-square error of reconstructed displacement was 0.8 mm for optimized imaging planes, 0.6 mm for imaging plane perpendicular to the minimal motion component only, and 1.6 mm for AP/Lateral orthogonal image pairs. When using the optimized imaging planes for motion monitoring, there was no significant absolute amplitude error of the reconstructed motion (P = 0.0988), while AP/Lateral images had significant error (P = 0.0097) with a paired t test. The average surface distance (ASD) between overlaid two-dimensional (2D) tumor segmentation at end-of-inhale and end-of-exhale for all eight patients was 0.6 ± 0.2 mm in optimized imaging planes and 1.4 ± 0.8 mm in AP/Lateral images. The Dice similarity coefficient (DSC) between overlaid 2D tumor segmentation at end-of-inhale and end-of-exhale for all eight patients was 0.96 ± 0.03 in optimized imaging planes and 0.89 ± 0.05 in AP/Lateral images. Both ASD (P = 0.034) and DSC (P = 0.022) were significantly improved in the optimized imaging planes. Motion monitoring using imaging planes determined by the proposed PCA-based framework had significantly improved performance. Single-slice image-based motion tracking can be used for clinical implementations such as MR image-guided radiation therapy (MR-IGRT). 
© 2018 American Association of Physicists in Medicine.
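A minimal sketch of the PCA step described in the preceding abstract, assuming the 4D target trajectory is available as an (n_phases x 3) array of centroid positions in millimetres; the array and function names (e.g., motion_xyz) are illustrative, not from the paper.

```python
# Minimal sketch: choose single-slice imaging planes from respiratory motion PCA.
# Assumes `motion_xyz` is an (n_phases, 3) array of target centroid positions (mm)
# extracted from 4DCT registration; all names here are illustrative.
import numpy as np

def pca_imaging_planes(motion_xyz):
    """Return unit normals of two imaging planes perpendicular to the 2nd and
    3rd principal components, so the dominant motion stays in-plane."""
    centered = motion_xyz - motion_xyz.mean(axis=0)
    # Principal components of the 3D trajectory (rows of vt, ordered by variance).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc1, pc2, pc3 = vt  # pc1 = dominant motion direction
    # Imaging plane normals: through-plane motion is the projection onto the normal.
    return pc1, pc2, pc3

def through_plane_amplitude(motion_xyz, plane_normal):
    """Peak-to-peak motion along the plane normal (the residual through-plane motion)."""
    proj = (motion_xyz - motion_xyz.mean(axis=0)) @ plane_normal
    return proj.max() - proj.min()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    phases = np.linspace(0, 2 * np.pi, 10, endpoint=False)
    # Synthetic trajectory: mostly superior-inferior motion with smaller LR/AP components.
    motion_xyz = np.column_stack([2.0 * np.sin(phases),      # LR
                                  4.0 * np.sin(phases),      # AP
                                  12.0 * np.sin(phases)])    # SI
    motion_xyz += rng.normal(scale=0.2, size=motion_xyz.shape)
    pc1, n_a, n_b = pca_imaging_planes(motion_xyz)
    print("dominant motion direction:", np.round(pc1, 2))
    print("through-plane motion, plane A: %.2f mm" % through_plane_amplitude(motion_xyz, n_a))
    print("through-plane motion, plane B: %.2f mm" % through_plane_amplitude(motion_xyz, n_b))
```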
An explorative study of school performance and antipsychotic medication.
van der Schans, J; Vardar, S; Çiçek, R; Bos, H J; Hoekstra, P J; de Vries, T W; Hak, E
2016-09-21
Antipsychotic therapy can reduce severe symptoms of psychiatric disorders; however, data on school performance among children on such treatment are lacking. The objective was to explore school performance among children using antipsychotic drugs at the end of primary education. A cross-sectional study was conducted using the University of Groningen pharmacy database linked to academic achievement scores at the end of primary school (Dutch Cito-test) obtained from Statistics Netherlands. Mean Cito-test scores and standard deviations were obtained for children on antipsychotic therapy and reference children, and statistically compared using analyses of covariance. In addition, differences in subgroups such as boys versus girls, ethnicity, household income, and late starters (start date within 12 months of the Cito-test) versus early starters (start date > 12 months before the Cito-test) were tested. In all, data from 7994 children could be linked to Cito-test scores. At the time of the Cito-test, 45 (0.6 %) were on treatment with antipsychotics. Children using antipsychotics scored on average 3.6 points lower than the reference peer group (534.5 ± 9.5). Scores differed across gender and levels of household income (p < 0.05). Scores of early starters were significantly higher than those of late starters (533.7 ± 1.7 vs. 524.1 ± 2.6). This first exploration showed that children on antipsychotic treatment have lower school performance than the reference peer group at the end of primary school. This was most noticeable for girls, but early starters were less affected than later starters. Due to the observational cross-sectional nature of this study, no causality can be inferred, but the results indicate that school performance should be closely monitored and that the causes of underperformance despite treatment warrant more research.
49 CFR 238.445 - Automated monitoring.
Code of Federal Regulations, 2010 CFR
2010-10-01
... performance of the following systems or components: (1) Reception of cab signals and train control signals; (2) Truck hunting; (3) Dynamic brake status; (4) Friction brake status; (5) Fire detection systems; (6) Head end power status; (7) Alerter or deadman control; (8) Horn and bell; (9) Wheel slide; (10) Tilt system...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ueno, Kohta
There are many existing buildings with load-bearing mass masonry walls whose energy performance could be improved with the retrofit of insulation. However, adding insulation to the interior side of the walls of such masonry buildings in cold (and wet) climates may cause performance and durability problems. Some concerns, such as condensation and freeze-thaw, have known solutions. But wood members embedded in the masonry structure will be colder (and potentially wetter) after an interior insulation retrofit. Moisture content and relative humidity were monitored at joist ends in historic mass brick masonry walls retrofitted with interior insulation in a cold climate (Zone 5A); data were collected from 2012-2015. Eleven joist ends were monitored in all four orientations. One limitation of these results is that the renovation is still ongoing, with limited wintertime construction heating and no permanent occupancy to date. Measurements show that many joist ends remain at high moisture contents, especially at north- and east-facing orientations, with constant 100% RH conditions in the worst cases. These high moisture levels are not conducive to wood durability, but no evidence of actual structural damage has been observed. Insulated vs. non-insulated joist pockets do not show large differences. South-facing joists have safe (10-15%) moisture contents. Given the uncertainty pointed out by research, definitive guidance on the vulnerability of embedded wood members is difficult to formulate. In high-risk situations, or when a very conservative approach is warranted, the embedded wood member condition can be eliminated entirely by supporting the joist ends outside of the masonry pocket.
Aviation safety research and transportation/hazard avoidance and elimination
NASA Technical Reports Server (NTRS)
Sonnenschein, C. M.; Dimarzio, C.; Clippinger, D.; Toomey, D.
1976-01-01
Data collected by the Scanning Laser Doppler Velocimeter System (SLDVS) were analyzed to determine the feasibility of the SLDVS for monitoring aircraft wake vortices in an airport environment. Data were collected on atmospheric vortices and analyzed. Over 1600 landings were monitored at Kennedy International Airport, and by the end of the test period 95 percent of the runs with large aircraft were producing usable results in real time. Vortex transport was determined in real time and in post-analysis using algorithms that computed centroids of the highest-amplitude region of the thresholded spectrum. Making use of other parameters of the spectrum, vortex flow fields were studied along with the time histories of peak velocities and amplitudes. Post-analysis of the data was accomplished with a CDC-6700 computer using several programs developed for LDV data analysis.
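The centroid-on-thresholded-spectrum step mentioned above can be illustrated with a short sketch; the threshold fraction, array names and synthetic spectrum below are assumptions, not the SLDVS processing chain itself.

```python
# Illustrative sketch of the centroid-on-thresholded-spectrum idea used to track
# vortex transport: keep only spectral bins above a threshold relative to the peak
# and take the amplitude-weighted mean velocity. Array names are assumptions.
import numpy as np

def spectral_centroid_velocity(velocity_bins, amplitudes, threshold_frac=0.5):
    """Amplitude-weighted centroid of the spectrum restricted to bins whose
    amplitude exceeds threshold_frac * peak amplitude."""
    amplitudes = np.asarray(amplitudes, dtype=float)
    mask = amplitudes >= threshold_frac * amplitudes.max()
    return np.sum(velocity_bins[mask] * amplitudes[mask]) / np.sum(amplitudes[mask])

if __name__ == "__main__":
    v = np.linspace(-15.0, 15.0, 301)                          # m/s velocity bins
    spectrum = np.exp(-0.5 * ((v - 6.0) / 1.2) ** 2)           # synthetic vortex return
    spectrum += 0.05 * np.random.default_rng(1).random(v.size) # noise floor
    print("estimated peak velocity: %.2f m/s" % spectral_centroid_velocity(v, spectrum))
```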
Processing and Quality Monitoring for the ATLAS Tile Hadronic Calorimeter Data
NASA Astrophysics Data System (ADS)
Burghgrave, Blake; ATLAS Collaboration
2017-10-01
An overview is presented of Data Processing and Data Quality (DQ) Monitoring for the ATLAS Tile Hadronic Calorimeter. Calibration runs are monitored from a data quality perspective and used as a cross-check for physics runs. Data quality in physics runs is monitored extensively and continuously. Any problems are reported and immediately investigated. The DQ efficiency achieved was 99.6% in 2012 and 100% in 2015, after the detector maintenance in 2013-2014. Changes to detector status or calibrations are entered into the conditions database (DB) during a brief calibration loop between the end of a run and the beginning of bulk processing of data collected in it. Bulk processed data are reviewed and certified for the ATLAS Good Run List if no problem is detected. Experts maintain the tools used by DQ shifters and the calibration teams during normal operation, and prepare new conditions for data reprocessing and Monte Carlo (MC) production campaigns. Conditions data are stored in 3 databases: Online DB, Offline DB for data and a special DB for Monte Carlo. Database updates can be performed through a custom-made web interface.
Pilot study of long-term anaesthesia in broiler chickens.
O'Kane, Peter M; Connerton, Ian F; White, Kate L
2016-01-01
To provide stable anaesthesia of long duration in broiler chickens in order to perform a terminal caecal ligated loop procedure. Prospective experimental study. Seven clinically healthy broiler chickens (Gallus domesticus) aged 27-36 days, weighing 884-2000 g. Anaesthesia was induced and maintained with isoflurane in oxygen. All birds underwent intermittent positive pressure ventilation for the duration. End-tidal carbon dioxide, peripheral haemoglobin oxygen saturation, heart rate and oesophageal temperature were monitored continuously. All birds received intraosseous fluids. Butorphanol (2 mg kg-1) was administered intramuscularly at two-hourly intervals. Euthanasia by parenteral pentobarbitone was performed at the end of the procedure. Stable anaesthesia was maintained in four chickens for durations ranging from 435 to 510 minutes. One bird died and one was euthanized after 130 and 330 minutes, respectively, owing to surgical complications, and another died from an anaesthetic complication after 285 minutes. Long-term, stable anaesthesia is possible in clinically healthy chickens, provided complications such as hypothermia and hypoventilation are addressed and vital signs are carefully monitored. There are no known previous reports describing monitored, controlled anaesthesia of this duration in chickens. © 2015 The Authors. Veterinary Anaesthesia and Analgesia published by John Wiley & Sons Ltd on behalf of the Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesia and Analgesia.
OAO-3 end of mission tests report
NASA Technical Reports Server (NTRS)
Kalil, F.; Kull, F. J.; Mcintosh, R.; Ollendorf, S.; Margolies, D. L.; Gemmell, J.; Tasevoli, C. M.; Polidan, R. S.; Kochevar, H.; Chapman, C.
1981-01-01
Twelve engineering-type tests were performed on several subsystems and experiment(s) of the OAO 3 spacecraft near its end of mission. The systems tested included the Princeton experiment package (PEP), fine error system guidance, inertial reference unit, star trackers, heat pipes, thermal control coatings, command and data handling, solar array, batteries, and onboard processor/power boost regulator. Generally, the systems performed well over the 8 1/2-year life of OAO 3, although some degradation was noted in the sensitivity of the PEP and in the absorptivity of the skin coatings. Battery life was prolonged during the mission in large part by carefully monitoring the charge-discharge cycle to avoid overcharging.
A multi-agent approach to intelligent monitoring in smart grids
NASA Astrophysics Data System (ADS)
Vallejo, D.; Albusac, J.; Glez-Morcillo, C.; Castro-Schez, J. J.; Jiménez, L.
2014-04-01
In this paper, we propose a scalable multi-agent architecture to give support to smart grids, paying special attention to the intelligent monitoring of distribution substations. The data gathered by multiple sensors are used by software agents that are responsible for monitoring different aspects or events of interest, such as normal voltage values or unbalanced current (intensity) values that can end up blowing fuses and decreasing the quality of service for end consumers. The knowledge bases of these agents have been built by means of a formal model for normality analysis that has been successfully used in other surveillance domains. The architecture facilitates the integration of new agents and can be easily configured and deployed to monitor different environments. The experiments have been conducted over a power distribution network.
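As an illustration of how one such monitoring agent might score a sensor reading, the sketch below uses a simple interval-based normality rule; this is only a stand-in for the formal normality model cited in the paper, and all thresholds and names are assumptions.

```python
# Minimal sketch of a monitoring agent that flags abnormal sensor readings.
# The interval-based "normality" rule below is a simple stand-in for the formal
# normality model cited in the paper; thresholds and names are assumptions.
from dataclasses import dataclass

@dataclass
class VoltageAgent:
    nominal: float = 230.0        # nominal line-to-neutral voltage (V)
    tolerance: float = 0.10       # +/- 10% considered fully normal

    def normality(self, reading: float) -> float:
        """Return a degree of normality in [0, 1]: 1 inside the tolerance band,
        decreasing linearly to 0 at twice the tolerance."""
        deviation = abs(reading - self.nominal) / self.nominal
        if deviation <= self.tolerance:
            return 1.0
        return max(0.0, 1.0 - (deviation - self.tolerance) / self.tolerance)

    def assess(self, reading: float) -> str:
        return "normal" if self.normality(reading) > 0.5 else "alarm"

if __name__ == "__main__":
    agent = VoltageAgent()
    for v in (231.5, 248.0, 195.0):
        print(v, agent.assess(v), round(agent.normality(v), 2))
```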
EARLINET: potential operationality of a research network
NASA Astrophysics Data System (ADS)
Sicard, M.; D'Amico, G.; Comerón, A.; Mona, L.; Alados-Arboledas, L.; Amodeo, A.; Baars, H.; Baldasano, J. M.; Belegante, L.; Binietoglou, I.; Bravo-Aranda, J. A.; Fernández, A. J.; Fréville, P.; García-Vizcaíno, D.; Giunta, A.; Granados-Muñoz, M. J.; Guerrero-Rascado, J. L.; Hadjimitsis, D.; Haefele, A.; Hervo, M.; Iarlori, M.; Kokkalis, P.; Lange, D.; Mamouri, R. E.; Mattis, I.; Molero, F.; Montoux, N.; Muñoz, A.; Muñoz Porcar, C.; Navas-Guzmán, F.; Nicolae, D.; Nisantzi, A.; Papagiannopoulos, N.; Papayannis, A.; Pereira, S.; Preißler, J.; Pujadas, M.; Rizi, V.; Rocadenbosch, F.; Sellegri, K.; Simeonov, V.; Tsaknakis, G.; Wagner, F.; Pappalardo, G.
2015-11-01
In the framework of ACTRIS (Aerosols, Clouds, and Trace Gases Research Infrastructure Network) summer 2012 measurement campaign (8 June-17 July 2012), EARLINET organized and performed a controlled exercise of feasibility to demonstrate its potential to perform operational, coordinated measurements and deliver products in near-real time. Eleven lidar stations participated in the exercise which started on 9 July 2012 at 06:00 UT and ended 72 h later on 12 July at 06:00 UT. For the first time, the single calculus chain (SCC) - the common calculus chain developed within EARLINET for the automatic evaluation of lidar data from raw signals up to the final products - was used. All stations sent in real-time measurements of a 1 h duration to the SCC server in a predefined netcdf file format. The pre-processing of the data was performed in real time by the SCC, while the optical processing was performed in near-real time after the exercise ended. 98 and 79 % of the files sent to SCC were successfully pre-processed and processed, respectively. Those percentages are quite large taking into account that no cloud screening was performed on the lidar data. The paper draws present and future SCC users' attention to the most critical parameters of the SCC product configuration and their possible optimal value but also to the limitations inherent to the raw data. The continuous use of SCC direct and derived products in heterogeneous conditions is used to demonstrate two potential applications of EARLINET infrastructure: the monitoring of a Saharan dust intrusion event and the evaluation of two dust transport models. The efforts made to define the measurements protocol and to configure properly the SCC pave the way for applying this protocol for specific applications such as the monitoring of special events, atmospheric modeling, climate research and calibration/validation activities of spaceborne observations.
NASA Astrophysics Data System (ADS)
Dong, Hancheng; Jin, Xiaoning; Lou, Yangbing; Wang, Changhong
2014-12-01
Lithium-ion batteries are used as the main power source in many electronic and electrical devices. In particular, with the growth in battery-powered electric vehicle development, the lithium-ion battery plays a critical role in the reliability of vehicle systems. In order to provide timely maintenance and replacement of battery systems, it is necessary to develop a reliable and accurate battery health diagnostic that takes a prognostic approach. Therefore, this paper focuses on two main methods to determine a battery's health: (1) Battery State-of-Health (SOH) monitoring and (2) Remaining Useful Life (RUL) prediction. Both of these are calculated by using a filter algorithm known as the Support Vector Regression-Particle Filter (SVR-PF). Models for battery SOH monitoring based on SVR-PF are developed with novel capacity degradation parameters introduced to determine battery health in real time. Moreover, the RUL prediction model is proposed, which is able to provide the RUL value and update the RUL probability distribution to the End-of-Life cycle. Results for both methods are presented, showing that the proposed SOH monitoring and RUL prediction methods have good performance and that the SVR-PF has better monitoring and prediction capability than the standard particle filter (PF).
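A hedged sketch of the particle-filter portion of such an approach is given below. The paper couples the filter with a support vector regression model (SVR-PF); here a simple exponential capacity-fade model stands in for that learned component, so the code only illustrates the filtering, SOH estimation and RUL prediction loop, with all parameter values assumed.

```python
# Hedged sketch of particle-filter-based SOH tracking and RUL prediction.
# A simple exponential capacity-fade model replaces the SVR component of the
# paper's SVR-PF, so this is an illustration of the filter/prediction loop only.
import numpy as np

rng = np.random.default_rng(0)
C_NOMINAL, EOL_FRACTION = 2.0, 0.8          # Ah; end of life at 80% of nominal

def fade_model(capacity, decay, noise_scale=0.002):
    """One-cycle capacity fade with process noise."""
    return capacity * np.exp(-decay) + rng.normal(0.0, noise_scale, capacity.shape)

def particle_filter_soh(measurements, n_particles=2000, meas_noise=0.01):
    capacity = np.full(n_particles, C_NOMINAL) + rng.normal(0, 0.01, n_particles)
    decay = rng.uniform(1e-4, 2e-3, n_particles)     # unknown fade rate per cycle
    for z in measurements:
        capacity = fade_model(capacity, decay)
        weights = np.exp(-0.5 * ((z - capacity) / meas_noise) ** 2) + 1e-12
        weights /= weights.sum()
        idx = rng.choice(n_particles, n_particles, p=weights)    # resample
        capacity, decay = capacity[idx], decay[idx]
    return capacity, decay

def predict_rul(capacity, decay, max_cycles=5000):
    """Cycles until each particle crosses the end-of-life capacity threshold."""
    rul = np.full(capacity.shape, max_cycles)
    cap = capacity.copy()
    for k in range(max_cycles):
        cap = cap * np.exp(-decay)
        hit = (cap <= EOL_FRACTION * C_NOMINAL) & (rul == max_cycles)
        rul[hit] = k + 1
    return rul

if __name__ == "__main__":
    true_decay, cycles = 1.2e-3, 150
    truth = C_NOMINAL * np.exp(-true_decay * np.arange(1, cycles + 1))
    meas = truth + rng.normal(0, 0.01, cycles)       # synthetic capacity measurements
    cap, dec = particle_filter_soh(meas)
    rul = predict_rul(cap, dec)
    print("SOH estimate: %.1f%%" % (100 * cap.mean() / C_NOMINAL))
    print("RUL median / 90%% interval: %d cycles, [%d, %d]"
          % (np.median(rul), np.percentile(rul, 5), np.percentile(rul, 95)))
```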
Self-monitoring high voltage transmission line suspension insulator
Stemler, Gary E.; Scott, Donald N.
1981-01-01
A high voltage transmission line suspension insulator (18 or 22) which monitors its own dielectric integrity. A dielectric rod (10) has one larger diameter end fitting attachable to a transmission line and another larger diameter end fitting attachable to a support tower. The rod is enclosed in a dielectric tube (14) which is hermetically sealed to the rod's end fittings such that a liquidtight space (20) is formed between the rod and the tube. A pressurized dielectric liquid is placed within that space. A discoloring dye placed within this space is used to detect the loss of the pressurized liquid.
Using Deep Learning Model for Meteorological Satellite Cloud Image Prediction
NASA Astrophysics Data System (ADS)
Su, X.
2017-12-01
A satellite cloud image contains much weather information, such as precipitation information. Short-term cloud movement forecasting is important for precipitation forecasting and is the primary means of typhoon monitoring. Traditional methods mostly use cloud feature matching and linear extrapolation to predict cloud movement, so nonstationary processes such as inversion and deformation during cloud movement are largely ignored. Predicting cloud movement promptly and accurately remains a hard task. Because deep learning models perform well in learning spatiotemporal features, we regard cloud image prediction as a spatiotemporal sequence forecasting problem and introduce a deep learning model to solve it. In this research, we use a variant of the Gated Recurrent Unit (GRU) with convolutional structures to handle spatiotemporal features and build an end-to-end model for this forecasting problem. In this model, both the input and the output are spatiotemporal sequences. Compared with the Convolutional LSTM (ConvLSTM) model, this model has fewer parameters. We apply this model to GOES satellite data and the model performs well.
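A minimal ConvGRU cell of the kind described (a GRU whose gates are computed with 2D convolutions so hidden states keep their spatial layout) can be sketched as follows in PyTorch; layer sizes and names are illustrative and not the paper's exact architecture.

```python
# Minimal ConvGRU cell sketch (PyTorch): a GRU whose gates are computed with
# 2D convolutions so hidden states keep their spatial layout. This illustrates
# the model family described above, not the paper's exact network.
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    def __init__(self, in_channels, hidden_channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2
        # Update and reset gates computed jointly from [input, hidden].
        self.gates = nn.Conv2d(in_channels + hidden_channels, 2 * hidden_channels,
                               kernel_size, padding=padding)
        # Candidate hidden state.
        self.candidate = nn.Conv2d(in_channels + hidden_channels, hidden_channels,
                                   kernel_size, padding=padding)
        self.hidden_channels = hidden_channels

    def forward(self, x, h):
        if h is None:
            h = torch.zeros(x.size(0), self.hidden_channels, x.size(2), x.size(3),
                            device=x.device, dtype=x.dtype)
        z, r = torch.chunk(torch.sigmoid(self.gates(torch.cat([x, h], dim=1))), 2, dim=1)
        h_tilde = torch.tanh(self.candidate(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde

if __name__ == "__main__":
    cell = ConvGRUCell(in_channels=1, hidden_channels=16)
    seq = torch.randn(8, 1, 64, 64)            # 8 past cloud-image frames (toy size)
    h = None
    for t in range(seq.size(0)):
        h = cell(seq[t].unsqueeze(0), h)       # batch of 1
    print(h.shape)                             # torch.Size([1, 16, 64, 64])
```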
Results from Solar Reflective Band End-to-End Testing for VIIRS F1 Sensor Using T-SIRCUS
NASA Technical Reports Server (NTRS)
McIntire, Jeff; Moyer, David; McCarthy, James K.; DeLuccia, Frank; Xiong, Xiaoxiong; Butler, James J.; Guenther, Bruce
2011-01-01
Verification of the Visible Infrared Imager Radiometer Suite (VIIRS) End-to-End (E2E) sensor calibration is highly recommended before launch, to identify any anomalies and to improve our understanding of the sensor on-orbit calibration performance. E2E testing of the Reflective Solar Bands (RSB) calibration cycle was performed pre-launch for the VIIRS Flight 1 (F1) sensor at the Ball Aerospace facility in Boulder, CO, in March 2010. The VIIRS reflective band calibration cycle is very similar to that of the heritage MODIS sensor in that solar illumination, via a diffuser, is used to correct for temporal variations in the instrument responsivity. Monochromatic light from the NIST T-SIRCUS was used to illuminate both the Earth View (EV), via an integrating sphere, and the Solar Diffuser (SD) view, through a collimator. The collimator illumination was cycled through a series of angles intended to simulate the range of possible angles for which solar radiation will be incident on the solar attenuation screen on-orbit. Ideally, the measured instrument responsivity (defined here as the ratio of the detector response to the at-sensor radiance) should be the same whether the EV or SD view is illuminated. The ratio of the measured responsivities was determined at each collimator angle and wavelength. In addition, the Solar Diffuser Stability Monitor (SDSM), a ratioing radiometer designed to track the temporal variation in the SD BRF by direct comparison to solar radiation, was illuminated by the collimator. The measured SDSM ratio was compared to the predicted ratio. An uncertainty analysis was also performed on both the SD and SDSM calibrations.
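The responsivity-ratio check described above reduces to a small calculation: responsivity is detector response divided by at-sensor radiance, computed separately for the EV and SD illumination paths and compared per collimator angle and wavelength. The sketch below illustrates that calculation with made-up numbers; all values and names are assumptions.

```python
# Sketch of the responsivity-ratio check: responsivity = detector response /
# at-sensor radiance, computed for Earth-view and solar-diffuser-view illumination
# and compared per collimator angle (and, in practice, per wavelength).
import numpy as np

def responsivity(dn_response, at_sensor_radiance):
    return np.asarray(dn_response, float) / np.asarray(at_sensor_radiance, float)

def ev_sd_ratio(ev_dn, ev_radiance, sd_dn, sd_radiance):
    """Ideally ~1 at every collimator angle / wavelength if both paths agree."""
    return responsivity(ev_dn, ev_radiance) / responsivity(sd_dn, sd_radiance)

if __name__ == "__main__":
    angles = np.array([10.0, 20.0, 30.0])                  # collimator angles (deg)
    ev_dn, ev_rad = np.array([1200, 1185, 1210]), np.array([30.0, 29.6, 30.2])
    sd_dn, sd_rad = np.array([1195, 1190, 1202]), np.array([29.9, 29.8, 30.1])
    for a, r in zip(angles, ev_sd_ratio(ev_dn, ev_rad, sd_dn, sd_rad)):
        print("angle %.0f deg: EV/SD responsivity ratio = %.3f" % (a, r))
```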
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, C.S.; af Ekenstam, G.; Sallstrom, M.
1995-07-01
The Swedish Nuclear Power Inspectorate (SKI) and the US Department of Energy (DOE) sponsored work on a Remote Monitoring System (RMS) that was installed in August 1994 at the Barseback Works north of Malmo, Sweden. The RMS was designed to test the front end detection concept that would be used for unattended remote monitoring activities. Front end detection reduces the number of video images recorded and provides additional sensor verification of facility operations. The function of any safeguards Containment and Surveillance (C/S) system is to collect information, primarily images, that verifies operations at a nuclear facility. Barseback is ideal for testing the concept of front end detection since the main activity of safeguards interest is the movement of spent fuel, which occurs once a year. The RMS at Barseback uses a network of nodes to collect data from microwave motion detectors placed to detect the entrance and exit of spent fuel casks through a hatch. A video system using digital compression collects digital images and stores them on a hard drive and a digital optical disk. Data and images from the storage area are remotely monitored via telephone from Stockholm, Sweden, and Albuquerque, NM, USA. These remote monitoring stations, operated by SKI and SNL respectively, can retrieve data and images from the RMS computer at the Barseback Facility. The data and images are encrypted before transmission. This paper presents details of the RMS and test results of this approach to front end detection of safeguards activities.
A Battery Health Monitoring Framework for Planetary Rovers
NASA Technical Reports Server (NTRS)
Daigle, Matthew J.; Kulkarni, Chetan Shrikant
2014-01-01
Batteries have seen increased use as the primary energy source in electric ground and air vehicles for commercial, military, and space applications. An important aspect of using batteries in such contexts is battery health monitoring. Batteries must be carefully monitored so that battery health can be determined and end-of-discharge and end-of-usable-life events can be accurately predicted. For planetary rovers, battery health estimation and prediction is critical to mission planning and decision-making. We develop a model-based approach utilizing computationally efficient and accurate electrochemistry models of batteries. An unscented Kalman filter yields state estimates, which are then used to predict the future behavior of the batteries and, specifically, end of discharge. The prediction algorithm accounts for possible future power demands on the rover batteries in order to provide meaningful results and an accurate representation of prediction uncertainty. The framework is demonstrated on a set of lithium-ion batteries powering a rover at NASA.
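A hedged sketch of the end-of-discharge prediction step is shown below: starting from a state estimate, the battery state is rolled forward under sampled future power demands until a voltage cutoff is reached, and the spread of outcomes represents prediction uncertainty. A simple equivalent-circuit, coulomb-counting model stands in for the electrochemistry model and unscented Kalman filter used in the paper; all parameter values are illustrative.

```python
# Hedged sketch of end-of-discharge (EOD) prediction: roll a simple battery model
# forward under sampled future power demands until a voltage cutoff is reached.
import numpy as np

rng = np.random.default_rng(42)
CAPACITY_AS = 2.2 * 3600      # charge capacity (A*s)
V_OC_FULL, V_OC_EMPTY = 4.2, 3.2
R_INTERNAL, V_CUTOFF = 0.05, 3.0
DT = 5.0                      # simulation step (s)

def open_circuit_voltage(soc):
    return V_OC_EMPTY + (V_OC_FULL - V_OC_EMPTY) * soc   # crude linear OCV curve

def simulate_eod(soc0, power_profile):
    """Seconds until terminal voltage falls below cutoff under a power demand trace."""
    soc = soc0
    for k, p in enumerate(power_profile):
        v_oc = open_circuit_voltage(soc)
        # Current from P = V*I with V = Voc - I*R  ->  solve the quadratic for I.
        disc = v_oc ** 2 - 4 * R_INTERNAL * p
        if disc <= 0:
            return k * DT                      # demand not sustainable: treat as EOD
        i = (v_oc - np.sqrt(disc)) / (2 * R_INTERNAL)
        if v_oc - i * R_INTERNAL <= V_CUTOFF:
            return k * DT
        soc -= i * DT / CAPACITY_AS
        if soc <= 0:
            return k * DT
    return len(power_profile) * DT

if __name__ == "__main__":
    horizon = int(3 * 3600 / DT)
    soc_estimate = 0.8                         # would come from the state estimator
    eods = []
    for _ in range(200):                       # Monte Carlo over future power demand
        power = np.clip(rng.normal(15.0, 5.0, horizon), 2.0, 40.0)   # W
        eods.append(simulate_eod(soc_estimate, power))
    eods = np.array(eods) / 60.0
    print("EOD prediction: median %.0f min, 90%% interval [%.0f, %.0f] min"
          % (np.median(eods), np.percentile(eods, 5), np.percentile(eods, 95)))
```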
A report on upgraded seismic monitoring stations in Myanmar: Station performance and site response
Thiam, Hrin Nei; Min Htwe, Yin Myo; Kyaw, Tun Lin; Tun, Pa Pa; Min, Zaw; Htwe, Sun Hninn; Aung, Tin Myo; Lin, Kyaw Kyaw; Aung, Myat Min; De Cristofaro, Jason; Franke, Mathias; Radman, Stefan; Lepiten, Elouie; Wolin, Emily; Hough, Susan E.
2017-01-01
Myanmar is in a tectonically complex region between the eastern edge of the Himalayan collision zone and the northern end of the Sunda megathrust. Until recently, earthquake monitoring and research efforts have been hampered by a lack of modern instrumentation and communication infrastructure. In January 2016, a major upgrade of the Myanmar National Seismic Network (MNSN; network code MM) was undertaken to improve earthquake monitoring capability. We installed five permanent broadband and strong‐motion seismic stations and real‐time data telemetry using newly improved cellular networks. Data are telemetered to the MNSN hub in Nay Pyi Taw and archived at the Incorporated Research Institutions for Seismology Data Management Center. We analyzed station noise characteristics and site response using noise and events recorded over the first six months of station operation. Background noise characteristics vary across the array, but indicate that the new stations are performing well. MM stations recorded more than 20 earthquakes of M≥4.5 within Myanmar and its immediate surroundings, including an M 6.8 earthquake located northwest of Mandalay on 13 April 2016 and the Mw 6.8 Chauk event on 24 August 2016. We use this new dataset to calculate horizontal‐to‐vertical spectral ratios, which provide a preliminary characterization of site response of the upgraded MM stations.
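A horizontal-to-vertical spectral ratio (HVSR) of the kind used for this first-pass site-response characterization can be computed as sketched below; the Welch window length and the geometric-mean convention for combining the horizontal components are assumptions, not necessarily the authors' exact recipe.

```python
# Sketch of a horizontal-to-vertical spectral ratio (HVSR) computation for a
# three-component station record, a common first-pass site-response check.
import numpy as np
from scipy.signal import welch

def hvsr(north, east, vertical, fs, nperseg=4096):
    """Return (frequencies, HVSR) from three equal-length component traces."""
    f, pnn = welch(north, fs=fs, nperseg=nperseg)
    _, pee = welch(east, fs=fs, nperseg=nperseg)
    _, pzz = welch(vertical, fs=fs, nperseg=nperseg)
    # Amplitude spectra; combine the horizontals with a geometric mean.
    horizontal = np.sqrt(np.sqrt(pnn * pee))
    return f, horizontal / np.sqrt(pzz)

if __name__ == "__main__":
    fs, t = 100.0, np.arange(0, 600, 1 / 100.0)           # 10 min at 100 sps
    rng = np.random.default_rng(3)
    # Synthetic noise with an amplified horizontal resonance near 2 Hz.
    n = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 2.0 * t)
    e = rng.normal(size=t.size) + 3 * np.cos(2 * np.pi * 2.0 * t)
    z = rng.normal(size=t.size)
    f, ratio = hvsr(n, e, z, fs)
    print("HVSR peak near %.2f Hz" % f[np.argmax(ratio)])
```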
Measured airtightness of 24 detached houses over periods of up to three years
DOE Office of Scientific and Technical Information (OSTI.GOV)
Proskiw, G.
1995-09-01
Regular airtightness tests were performed on 24 new houses, over periods of up to three years, to evaluate their air barrier systems and to search for evidence of air barrier degradation. Ten of the houses were constructed with polyethylene air barriers while the remaining 14 used an early version of the Airtight Drywall Approach (ADA). The 24 project houses were architecturally similar and of approximately equal size and general layout; stucco was the predominant exterior wall finish. All were exposed to similar terrain shielding. The study found that the airtightness of the polyethylene air barrier houses remained stable over their respective monitoring periods. Although two of the ten houses demonstrated possible, albeit slight, evidence of airtightness degradation, the magnitude of these changes was small and judged not to be of practical significance. With respect to the critical issue of air barrier degradation, no evidence could be found to indicate polyethylene is unsuited for use as an air barrier material in residential construction. For example, all but one of the polyethylene houses met the airtightness requirements of the Canadian R-2000 Standard for energy-efficient housing at the end of their monitoring periods. The study found that the airtightness of the 14 ADA houses also remained stable over their monitoring periods. Although six of the 14 houses displayed possible, but slight, evidence of airtightness degradation, the magnitude of the changes was small and not of practical significance. It was concluded that no evidence could be found to indicate that the ADA system is unsuited for use in residential construction. All 14 ADA houses met the airtightness requirements of the R-2000 Program at the end of their respective monitoring periods.
Survey on Monitoring and Quality Controlling of the Mobile Biosignal Delivery.
Pawar, Pravin A; Edla, Damodar R; Edoh, Thierry; Shinde, Vijay; van Beijnum, Bert-Jan
2017-10-31
A Mobile Patient Monitoring System (MPMS) acquires patient's biosignals and transmits them using wireless network connection to the decision-making module or healthcare professional for the assessment of patient's condition. A variety of wireless network technologies such as wireless personal area networks (e.g., Bluetooth), mobile ad-hoc networks (MANET), and infrastructure-based networks (e.g., WLAN and cellular networks) are in practice for biosignals delivery. The wireless network quality-of-service (QoS) requirements of biosignals delivery are mainly specified in terms of required bandwidth, acceptable delay, and tolerable error rate. An important research challenge in the MPMS is how to satisfy QoS requirements of biosignals delivery in the environment characterized by patient mobility, deployment of multiple wireless network technologies, and variable QoS characteristics of the wireless networks. QoS requirements are mainly application specific, while available QoS is largely dependent on QoS provided by wireless network in use. QoS provisioning refers to providing support for improving QoS experience of networked applications. In resource poor conditions, application adaptation may also be required to make maximum use of available wireless network QoS. This survey paper presents a survey of recent developments in the area of QoS provisioning for MPMS. In particular, our contributions are as follows: (1) overview of wireless networks and network QoS requirements of biosignals delivery; (2) survey of wireless networks' QoS performance evaluation for the transmission of biosignals; and (3) survey of QoS provisioning mechanisms for biosignals delivery in MPMS. We also propose integrating end-to-end QoS monitoring and QoS provisioning strategies in a mobile patient monitoring system infrastructure to support optimal delivery of biosignals to the healthcare professionals.
ePave: A Self-Powered Wireless Sensor for Smart and Autonomous Pavement.
Xiao, Jian; Zou, Xiang; Xu, Wenyao
2017-09-26
"Smart Pavement" is an emerging infrastructure for various on-road applications in transportation and road engineering. However, existing road monitoring solutions demand a certain periodic maintenance effort due to battery life limits in the sensor systems. To this end, we present an end-to-end self-powered wireless sensor-ePave-to facilitate smart and autonomous pavements. The ePave system includes a self-power module, an ultra-low-power sensor system, a wireless transmission module and a built-in power management module. First, we performed an empirical study to characterize the piezoelectric module in order to optimize energy-harvesting efficiency. Second, we developed an integrated sensor system with the optimized energy harvester. An adaptive power knob is designated to adjust the power consumption according to energy budgeting. Finally, we intensively evaluated the ePave system in real-world applications to examine the system's performance and explore the trade-off.
Wood, Martin James; McMillen, Jason
2014-01-01
This study retrospectively assessed the accuracy of placement of lumbar pedicle screws placed by a single surgeon using a minimally-invasive, intra-operative CT-based computer navigated technique in combination with continuous electromyography (EMG) monitoring. The rates of incorrectly positioned screws were reviewed in the context of the surgeon's experience and learning curve. Data was retrospectively reviewed from all consecutive minimally invasive lumbar fusions performed by the primary author over a period of over 4 years from April 2008 until October 2012. All cases that had utilized computer-assisted intra-operative CT-based image guidance and continuous EMG monitoring to guide percutaneous pedicle screw placement were analysed for the rates of malposition of the pedicle screws. Pedicle screw malposition was defined as having occurred if the screw trajectory was adjusted intraoperatively due to positive EMG responses, or due to breach of the pedicle cortex by more than 2mm on intraoperative CT imaging performed at the end of the instrumentation procedure. Further analysis of the data was undertaken to determine if the rates of malposition changed with the surgeon's experience with the technique. Six hundred and twenty-seven pedicle screws were placed in one hundred and fifty patients. The overall rate of intraoperative malposition and subsequent adjustment of pedicle screw placement was 3.8% (24 of 627 screws). Screw malposition was detected by intraoperative CT imaging. Warning of potential screw misplacement was provided by use of the EMG monitoring. With increased experience with the technique, rates of intraoperative pedicle screw malposition were found to decrease from 5.1% of screws in the first fifty patients, to 2.0% in the last 50 patients. Only one screw was suboptimally placed at the end of surgery, which did not result in a neurological deficit. The use of CT-based computer-assisted navigation in combination with continuous EMG monitoring during percutaneous transpedicular screw placement results in very low rates of malposition and neural injury that compare favourably with previously reported rates. Pedicle screw placement accuracy continues to improve as the surgeon becomes more experienced with the technique.
Domestic water uses: characterization of daily cycles in the north region of Portugal.
Matos, Cristina; Teixeira, Carlos A; Duarte, A A L S; Bentes, I
2013-08-01
Nowadays, there is an increasing discussion among specialists about water use efficiency and the best measures to improve it. In Portugal, there have been a few attempts to expand the implementation of in situ water reuse projects. However, there is a lack of information about indoor water uses and how they are influenced by sociodemographic characteristics. There are several studies that investigate per capita global water usage, but the partitioning of this volume per domestic device and daily cycles is yet unknown. Identified as one of the key questions in sustainable building design, the water end-use is of primary importance to the design of hydraulic networks in buildings. In order to overcome this lack, a quantitative characterization of daily water uses for each domestic device was performed, based on a weekly monitoring program in fifty-two different dwellings in the northern region of Portugal (Vila Real, Valpaços and Oporto). For forty of them, each water usage of different domestic devices of each dwelling was recorded. At the same time, the remaining twelve dwellings were also monitored in order to register the volume of water consumed in each utilization of each domestic device. This paper presents the results of this complete monitoring program, using collected data to establish indoor water use patterns for each domestic device, aiming to support a more realistic approach to residential water use. The daily cycles in the different cities, where the monitoring program was performed, are also presented, in order to evaluate possible influences of sociodemographic characteristics. Copyright © 2013 Elsevier B.V. All rights reserved.
Patient perceptions of a remote monitoring intervention for chronic disease management.
Wakefield, Bonnie J; Holman, John E; Ray, Annette; Scherubel, Melody
2011-04-01
Use of telecommunications technology to provide remote monitoring for people with chronic disease is becoming increasingly accepted as a means to improve patient outcomes and reduce resource use. The purpose of this project was to evaluate patient perceptions of a nurse-managed remote monitoring intervention to improve outcomes in veterans with comorbid diabetes and hypertension. Postintervention evaluation data were collected using a 12-item questionnaire and an open-ended question. Participants rated the program as generally positive on the questionnaire, but responses to the open-ended question revealed criticisms and suggestions for improvement not captured on the questionnaire. Interviewing participants in these programs may offer richer data for identifying areas for program improvement. Copyright 2011, SLACK Incorporated.
NASA Astrophysics Data System (ADS)
Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng
2018-03-01
The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw materials with large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Secondly, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, which enables more accurate monitoring and control of process performance and product quality.
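The mid-course quality prediction step can be illustrated with a small PLS sketch: a model trained on mid-course NIR spectra and measured end-product quality predicts the quality of a new batch, and the prediction drives a batch-end-time adjustment. The data, component count and decision rule below are illustrative assumptions, not the paper's calibration.

```python
# Minimal sketch of mid-course quality prediction with PLS and a batch-end-time
# adjustment; the synthetic data, component count and decision rule are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
n_batches, n_wavelengths = 40, 200

# Synthetic training set: mid-course spectra and measured end-product quality.
X_train = rng.normal(size=(n_batches, n_wavelengths))
true_coefs = rng.normal(size=n_wavelengths) / np.sqrt(n_wavelengths)
y_train = X_train @ true_coefs + rng.normal(scale=0.05, size=n_batches)

pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)

def adjust_batch_end_time(mid_course_spectrum, nominal_bet_min, target_quality):
    """Extend the batch-end-time if the predicted quality falls short of target."""
    predicted = float(pls.predict(mid_course_spectrum.reshape(1, -1)).ravel()[0])
    shortfall = max(0.0, target_quality - predicted)
    return predicted, nominal_bet_min + 10.0 * shortfall   # 10 min per unit shortfall

if __name__ == "__main__":
    new_spectrum = rng.normal(size=n_wavelengths)
    pred, bet = adjust_batch_end_time(new_spectrum, nominal_bet_min=90.0, target_quality=0.5)
    print("predicted quality %.3f -> batch end time %.1f min" % (pred, bet))
```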
Monitoring tool usage in surgery videos using boosted convolutional and recurrent neural networks.
Al Hajj, Hassan; Lamard, Mathieu; Conze, Pierre-Henri; Cochener, Béatrice; Quellec, Gwenolé
2018-05-09
This paper investigates the automatic monitoring of tool usage during surgery, with potential applications in report generation, surgical training and real-time decision support. Two surgeries are considered: cataract surgery, the most common surgical procedure, and cholecystectomy, one of the most common digestive surgeries. Tool usage is monitored in videos recorded either through a microscope (cataract surgery) or an endoscope (cholecystectomy). Following state-of-the-art video analysis solutions, each frame of the video is analyzed by convolutional neural networks (CNNs) whose outputs are fed to recurrent neural networks (RNNs) in order to take temporal relationships between events into account. Novelty lies in the way those CNNs and RNNs are trained. Computational complexity prevents the end-to-end training of "CNN+RNN" systems. Therefore, CNNs are usually trained first, independently from the RNNs. This approach is clearly suboptimal for surgical tool analysis: many tools are very similar to one another, but they can generally be differentiated based on past events. CNNs should be trained to extract the most useful visual features in combination with the temporal context. A novel boosting strategy is proposed to achieve this goal: the CNN and RNN parts of the system are simultaneously enriched by progressively adding weak classifiers (either CNNs or RNNs) trained to improve the overall classification accuracy. Experiments were performed on a dataset of 50 cataract surgery videos, where the usage of 21 surgical tools was manually annotated, and a dataset of 80 cholecystectomy videos, where the usage of 7 tools was manually annotated. Very good classification performance is achieved in both datasets: tool usage could be labeled with an average area under the ROC curve of Az = 0.9961 and Az = 0.9939, respectively, in offline mode (using past, present and future information), and Az = 0.9957 and Az = 0.9936, respectively, in online mode (using past and present information only). Copyright © 2018 Elsevier B.V. All rights reserved.
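The boosting procedure itself is involved, but the underlying pipeline (frame-level CNN features fed to a sequence-level RNN that outputs per-frame tool-presence probabilities) can be sketched compactly; the architecture sizes and names below are illustrative, and the boosting of weak classifiers is not reproduced.

```python
# Minimal "CNN features per frame -> RNN over the sequence" pipeline (PyTorch),
# of the kind the boosted CNN+RNN system builds on. Sizes and names are illustrative.
import torch
import torch.nn as nn

class FrameCNN(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, frames):                 # (batch, 3, H, W)
        return self.proj(self.features(frames).flatten(1))

class ToolUsageModel(nn.Module):
    def __init__(self, n_tools=21, feat_dim=64, hidden=128):
        super().__init__()
        self.cnn = FrameCNN(feat_dim)
        self.rnn = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_tools)

    def forward(self, video):                  # (batch, T, 3, H, W)
        b, t = video.shape[:2]
        feats = self.cnn(video.flatten(0, 1)).view(b, t, -1)
        hidden_seq, _ = self.rnn(feats)
        return torch.sigmoid(self.head(hidden_seq))   # per-frame tool-presence probs

if __name__ == "__main__":
    model = ToolUsageModel()
    clip = torch.randn(2, 16, 3, 64, 64)       # 2 clips of 16 frames (toy resolution)
    probs = model(clip)
    print(probs.shape)                         # torch.Size([2, 16, 21])
```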
A mainstream monitoring system for respiratory CO2 concentration and gasflow.
Yang, Jiachen; Chen, Bobo; Burk, Kyle; Wang, Haitao; Zhou, Jianxiong
2016-08-01
Continuous respiratory gas monitoring is an important tool for clinical monitoring. In particular, measurement of respiratory CO2 concentration and gasflow can reflect the status of a patient by providing parameters such as volume of carbon dioxide, end-tidal CO2, respiratory rate and alveolar deadspace. However, in the majority of previous work, CO2 concentration and gasflow have been studied separately. This study focuses on a mainstream system which simultaneously measures respiratory CO2 concentration and gasflow at the same location, allowing for volumetric capnography to be implemented. A non-dispersive infrared monitor is used to measure CO2 concentration and a differential pressure sensor is used to measure gasflow. In developing this new device, we designed a custom airway adapter which can be placed in line with the breathing circuit and accurately monitor relevant respiratory parameters. Because the airway adapter is used both for capnography and gasflow measurement, our system reduces mechanical deadspace. The finite element method was used to design the airway adapter, which can provide a strong differential pressure while reducing airway resistance. Statistical analysis using the coefficient of variation was performed to find the optimal driving voltage of the pressure transducer. Calibration between variations and flows was used to avoid pressure signal drift. We carried out targeted experiments using the proposed device and confirmed that the device can produce stable signals.
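The coefficient-of-variation analysis mentioned above amounts to a small computation: for each candidate driving voltage, collect repeated readings and pick the voltage with the lowest std/mean ratio. The sketch below uses synthetic readings; voltages and values are assumptions.

```python
# Sketch of the coefficient-of-variation selection of a driving voltage: the
# voltage whose repeated differential-pressure readings have the lowest CV wins.
import numpy as np

def coefficient_of_variation(samples):
    samples = np.asarray(samples, dtype=float)
    return samples.std(ddof=1) / samples.mean()

def best_driving_voltage(readings_by_voltage):
    """readings_by_voltage: dict mapping voltage (V) -> repeated sensor readings."""
    cvs = {v: coefficient_of_variation(r) for v, r in readings_by_voltage.items()}
    return min(cvs, key=cvs.get), cvs

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    readings = {2.5: rng.normal(100, 3.0, 50),    # synthetic repeated readings
                3.3: rng.normal(120, 1.5, 50),
                5.0: rng.normal(150, 4.0, 50)}
    best, cvs = best_driving_voltage(readings)
    for v, cv in sorted(cvs.items()):
        print("%.1f V: CV = %.3f" % (v, cv))
    print("selected driving voltage: %.1f V" % best)
```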
Load Disaggregation Technologies: Real World and Laboratory Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayhorn, Ebony T.; Sullivan, Greg P.; Petersen, Joseph M.
Low-cost interval metering and communication technology improvements over the past ten years have enabled the maturity of load disaggregation (or non-intrusive load monitoring) technologies to better estimate and report energy consumption of individual end-use loads. With the appropriate performance characteristics, these technologies have the potential to enable many utility and customer-facing applications such as billing transparency, itemized demand and energy consumption, appliance diagnostics, commissioning, energy efficiency savings verification, load shape research, and demand response measurement. However, there has been much skepticism concerning the ability of load disaggregation products to accurately identify and estimate energy consumption of end-uses, which has hindered widespread market adoption. A contributing factor is that common test methods and metrics are not available to evaluate performance without having to perform large-scale field demonstrations and pilots, which can be costly when developing such products. Without common and cost-effective methods of evaluation, more developed disaggregation technologies will continue to be slow to market and potential users will remain uncertain about their capabilities. This paper reviews recent field studies and laboratory tests of disaggregation technologies. Several factors are identified that are important to consider in test protocols, so that the results reflect real-world performance. Potential metrics are examined to highlight their effectiveness in quantifying disaggregation performance. This analysis is then used to suggest performance metrics that are meaningful and of value to potential users and that will enable researchers/developers to identify beneficial ways to improve their technologies.
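As an example of the kind of quantities a common test protocol might standardize, the sketch below computes two generic disaggregation metrics from the NILM literature, per-appliance energy estimation accuracy and event-detection F1; these are illustrations, not metrics endorsed by the report.

```python
# Sketch of two commonly reported disaggregation metrics: per-appliance energy
# estimation accuracy and on/off event-detection F1 (generic NILM-style metrics).
import numpy as np

def energy_estimation_accuracy(est, truth):
    """1 - |sum(est) - sum(truth)| / sum(truth), clipped at 0."""
    est, truth = np.asarray(est, float), np.asarray(truth, float)
    return max(0.0, 1.0 - abs(est.sum() - truth.sum()) / truth.sum())

def event_f1(est_on, true_on):
    """F1 over per-interval on/off states (boolean arrays of equal length)."""
    est_on, true_on = np.asarray(est_on, bool), np.asarray(true_on, bool)
    tp = np.sum(est_on & true_on)
    fp = np.sum(est_on & ~true_on)
    fn = np.sum(~est_on & true_on)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

if __name__ == "__main__":
    truth = np.array([0, 0, 1.2, 1.2, 1.2, 0, 0, 1.2])     # kWh per interval
    est = np.array([0, 0.1, 1.0, 1.3, 1.1, 0, 0.2, 1.0])   # disaggregated estimate
    print("energy accuracy: %.2f" % energy_estimation_accuracy(est, truth))
    print("event F1: %.2f" % event_f1(est > 0.5, truth > 0.5))
```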
Fu, Sai-Chuen; Chan, Kai-Ming; Chan, Lai-Shan; Fong, Daniel Tik-Pui; Lui, Po-Yee Pauline
2009-05-15
Chronic tendinopathy is characterized by longstanding activity-related pain with degenerative tendon injuries. An objective tool to measure painful responses in animal models is essential for the development of effective treatment for tendinopathy. Gait analysis has been developed to monitor inflammatory pain in small animals. We report the use of motion analysis to monitor gait changes in a rat model of degenerative tendon injury. Intratendinous injection of collagenase into the left patellar tendon of Sprague Dawley rats was used to induce degenerative tendon injury, while an equal volume of saline was injected in the control groups. Motion analyses with a high-speed video camera were performed on all rats at pre-injury and at 2, 4, 8, 12 or 16 weeks post injection. In the end-point study, the rats were sacrificed to obtain tendon samples for histological examination after motion analyses. In the follow-up study, repeated motion analyses were performed on another group of collagenase-treated and saline-treated rats. The results showed that rats with injured patellar tendons exhibited altered walking gait compared to the controls. The change in double stance duration in the collagenase-treated rats was reversible by administration of buprenorphine (p=0.029), suggesting that the detected gait changes were associated with pain. Comparisons of the end-point and follow-up studies revealed the confounding effects of training, which led to higher gait velocities and probably a different adaptive response to tendon pain in the trained rats. The results showed that motion analysis could be used to measure activity-related chronic tendon pain.
Code of Federal Regulations, 2013 CFR
2013-07-01
... thermocouple, ultra-violet beam sensor, or infrared sensor) capable of continuously detecting the presence of a..., as appropriate. (1) Where an incinerator is used, a temperature monitoring device equipped with a... temperature monitoring device shall be installed in the firebox or in the ductwork immediately downstream of...
Code of Federal Regulations, 2014 CFR
2014-07-01
... thermocouple, ultra-violet beam sensor, or infrared sensor) capable of continuously detecting the presence of a..., as appropriate. (1) Where an incinerator is used, a temperature monitoring device equipped with a... temperature monitoring device shall be installed in the firebox or in the ductwork immediately downstream of...
Code of Federal Regulations, 2012 CFR
2012-07-01
... thermocouple, ultra-violet beam sensor, or infrared sensor) capable of continuously detecting the presence of a..., as appropriate. (1) Where an incinerator is used, a temperature monitoring device equipped with a... temperature monitoring device shall be installed in the firebox or in the ductwork immediately downstream of...
Kuzay, T.M.; Shu, D.
1995-02-07
A photon beam position monitor is disclosed for use in the front end of a beamline of a high-heat-flux, high-energy photon source such as a synchrotron radiation storage ring. The monitor detects and measures the position and, when a pair of such monitors is used in tandem, the slope of a photon beam emanating from an insertion device, such as a wiggler or an undulator, inserted in the straight sections of the ring. The photon beam position monitor includes a plurality of spaced blades for precisely locating the photon beam, with each blade composed of chemical vapor deposition (CVD) diamond with an outer coating of a photon-sensitive metal such as tungsten or molybdenum, a combination that emits electrons when a high-energy photon beam is incident upon the blade. Two such monitors are contemplated for use in the front end of the beamline, with the two monitors having vertically and horizontally offset detector blades to avoid blade "shadowing". Provision is made for aligning the detector blades with the photon beam and limiting detector blade temperature during operation. 18 figs.
Cost accounting, management control, and planning in health care.
Siegrist, R B; Blish, C S
1988-02-01
Advantages and pharmacy applications of computerized hospital management-control and planning systems are described. Hospitals must define their product lines; patient cases, not tests or procedures, are the end product. Management involves operational control, management control, and strategic planning. Operational control deals with day-to-day management on the task level. Management control involves ensuring that managers use resources effectively and efficiently to accomplish the organization's objectives. Management control includes both control of unit costs of intermediate products, which are procedures and services used to treat patients and are managed by hospital department heads, and control of intermediate product use per case (managed by the clinician). Information from the operation and management levels feeds into the strategic plan; conversely, the management level controls the plan and the operational level carries it out. In the system developed at New England Medical Center, Boston, Massachusetts, the intermediate product-management system enables managers to identify intermediate products, develop standard costs, simulate changes in departmental costs, and perform variance analysis. The end-product management system creates a patient-level data-base, identifies end products (patient-care groupings), develops standard resource protocols, models alternative assumptions, performs variance analysis, and provides concurrent reporting. Examples are given of pharmacy managers' use of such systems to answer questions in the areas of product costing, product pricing, variance analysis, productivity monitoring, flexible budgeting, modeling and planning, and comparative analysis.(ABSTRACT TRUNCATED AT 250 WORDS)
Occurrence and Control of Genotoxins in Drinking Water: A Monitoring Proposal.
Ceretti, Elisabetta; Moretti, Massimo; Zerbini, Ilaria; Villarini, Milena; Zani, Claudia; Monarca, Silvano; Feretti, Donatella
2016-12-09
Many studies have shown the presence of numerous organic genotoxins and carcinogens in drinking water. These toxic substances derive not only from pollution, but also from the disinfection treatments, particularly when water is obtained from surface sources and then chlorinated. Most of the chlorinated compounds in drinking water are nonvolatile and are difficult to characterize. Thus, it has been proposed to study such complex mixtures using short-term genotoxicity tests predictive of carcinogenic activity. Mutagenicity of water before and after disinfection has mainly been studied by the Salmonella/microsome (Ames test); in vitro genotoxicity tests have also been performed in yeasts and mammalian cells; in situ monitoring of genotoxins has also been performed using complete organisms such as aquatic animals or plants (in vivo). The combination of bioassay data together with results of chemical analyses would give us a more firm basis for the assessment of human health risks related to the consumption of drinking water. Tests with different genetic end-points complement each other with regard to sensitivity toward environmental genotoxins and are useful in detecting low genotoxicity levels which are expected in drinking water samples.
A collaborative network middleware project by Lambda Station, TeraPaths, and Phoebus
NASA Astrophysics Data System (ADS)
Bobyshev, A.; Bradley, S.; Crawford, M.; DeMar, P.; Katramatos, D.; Shroff, K.; Swany, M.; Yu, D.
2010-04-01
The TeraPaths, Lambda Station, and Phoebus projects, funded by the US Department of Energy, have successfully developed network middleware services that establish on demand and manage true end-to-end, Quality-of-Service (QoS) aware virtual network paths across multiple administrative network domains, select network paths and gracefully reroute traffic over these dynamic paths, and streamline traffic between packet and circuit networks using transparent gateways. These services improve network QoS and performance for applications, playing a critical role in the effective use of emerging dynamic circuit network services. They provide interfaces to applications, such as dCache SRM, translate network service requests into network device configurations, and coordinate with each other to set up end-to-end network paths. The End Site Control Plane Subsystem (ESCPS) builds upon the success of the three projects by combining their individual capabilities into the next generation of network middleware. ESCPS addresses challenges such as cross-domain control plane signalling and interoperability, authentication and authorization in a Grid environment, topology discovery, and dynamic status tracking. The new network middleware will take full advantage of the perfSONAR monitoring infrastructure and the Inter-Domain Control plane efforts and will be deployed and fully vetted in the Large Hadron Collider data movement environment.
Range Safety for an Autonomous Flight Safety System
NASA Technical Reports Server (NTRS)
Lanzi, Raymond J.; Simpson, James C.
2010-01-01
The Range Safety Algorithm software encapsulates the various constructs and algorithms required to accomplish Time Space Position Information (TSPI) data management from multiple tracking sources, autonomous mission mode detection and management, and flight-termination mission rule evaluation. The software evaluates various user-configurable rule sets that govern the qualification of TSPI data sources, provides a prelaunch autonomous hold-launch function, performs the flight-monitoring-and-termination functions, and performs end-of-mission safing.
Silvestri, Michele; Simi, Massimiliano; Cavallotti, Carmela; Vatteroni, Monica; Ferrari, Vincenzo; Freschi, Cinzia; Valdastri, Pietro; Menciassi, Arianna; Dario, Paolo
2011-09-01
In the near future, it is likely that 3-dimensional (3D) surgical endoscopes will replace current 2D imaging systems given the rapid spreading of stereoscopy in the consumer market. In this evaluation study, an emerging technology, the autostereoscopic monitor, is compared with the visualization systems mainly used in laparoscopic surgery: a binocular visor, technically equivalent from the viewer's point of view to the da Vinci 3D console, and a standard 2D monitor. A total of 16 physicians with no experience in 3D interfaces performed 5 different tasks, and the execution time and accuracy of the tasks were evaluated. Moreover, subjective preferences were recorded to qualitatively evaluate the different technologies at the end of each trial. This study demonstrated that the autostereoscopic display is equally effective as the binocular visor for both low- and high-complexity tasks and that it guarantees better performance in terms of execution time than the standard 2D monitor. Moreover, an unconventional task, included to provide the same conditions to the surgeons regardless of their experience, was performed 22% faster when using the autostereoscopic monitor than the binocular visor. However, the final questionnaires demonstrated that 60% of participants preferred the user-friendliness of the binocular visor. These results are greatly heartening because autostereoscopic technology is still in its early stages and offers potential improvement. As a consequence, the authors expect that the increasing interest in autostereoscopy could improve its friendliness in the future and allow the technology to be widely accepted in surgery.
Code of Federal Regulations, 2012 CFR
2012-07-01
... humidity, solar radiation, ultraviolet radiation, and/or precipitation. Metropolitan Statistical Area (MSA... receiver at opposite ends of the monitoring path; (2) Equal to twice the monitoring path length for a (monostatic) system having a transmitter and receiver at one end of the monitoring path and a mirror or...
Code of Federal Regulations, 2013 CFR
2013-07-01
... humidity, solar radiation, ultraviolet radiation, and/or precipitation. Metropolitan Statistical Area (MSA... receiver at opposite ends of the monitoring path; (2) Equal to twice the monitoring path length for a (monostatic) system having a transmitter and receiver at one end of the monitoring path and a mirror or...
Code of Federal Regulations, 2014 CFR
2014-07-01
... humidity, solar radiation, ultraviolet radiation, and/or precipitation. Metropolitan Statistical Area (MSA... receiver at opposite ends of the monitoring path; (2) Equal to twice the monitoring path length for a (monostatic) system having a transmitter and receiver at one end of the monitoring path and a mirror or...
40 CFR 63.489 - Batch front-end process vents-monitoring equipment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... device (including, but not limited to, a thermocouple, ultra-violet beam sensor, or infrared sensor... temperature monitoring device equipped with a continuous recorder is required. (i) Where an incinerator other than a catalytic incinerator is used, the temperature monitoring device shall be installed in the...
40 CFR 63.489 - Batch front-end process vents-monitoring equipment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... device (including, but not limited to, a thermocouple, ultra-violet beam sensor, or infrared sensor... temperature monitoring device equipped with a continuous recorder is required. (i) Where an incinerator other than a catalytic incinerator is used, the temperature monitoring device shall be installed in the...
40 CFR 63.489 - Batch front-end process vents-monitoring equipment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... device (including, but not limited to, a thermocouple, ultra-violet beam sensor, or infrared sensor... temperature monitoring device equipped with a continuous recorder is required. (i) Where an incinerator other than a catalytic incinerator is used, the temperature monitoring device shall be installed in the...
Wu, Yongjiang; Jin, Ye; Ding, Haiying; Luan, Lianjun; Chen, Yong; Liu, Xuesong
2011-09-01
The application of near-infrared (NIR) spectroscopy for in-line monitoring of the extraction process of scutellarein from Erigeron breviscapus (Vant.) Hand.-Mazz. was investigated. For NIR measurements, two fiber optic probes designed to transmit NIR radiation through a 2 mm pathlength flow cell were utilized to collect spectra in real time. High performance liquid chromatography (HPLC) was used as a reference method to determine scutellarein in the extract solution. A partial least squares regression (PLSR) calibration model of Savitzky-Golay smoothed NIR spectra in the 5450-10,000 cm(-1) region gave satisfactory predictive results for scutellarein. The correlation coefficients of calibration and cross validation were 0.9967 and 0.9811, respectively, and the root mean square errors of calibration and cross validation were 0.044 and 0.105, respectively. Furthermore, both the moving block standard deviation (MBSD) method and a conformity test were used to identify the end point of the extraction process, providing real-time data and instant feedback about the extraction course. The results indicate that NIR spectroscopy provides an efficient and environmentally friendly approach for fast determination of scutellarein and end-point control of the extraction process.
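A minimal sketch of the MBSD end-point test described above: the extraction is declared finished once the standard deviation of the most recent block of NIR-predicted concentrations falls below a tolerance. This is an illustrative Python example, not the authors' implementation; the window size, threshold, and simulated concentration trace are assumptions.

import numpy as np

def mbsd_endpoint(predictions, block_size=10, threshold=0.01):
    """Return the index at which the moving-block standard deviation of
    successive NIR-predicted concentrations drops below a threshold,
    signalling that the extraction has reached its end point."""
    for i in range(block_size, len(predictions) + 1):
        block = predictions[i - block_size:i]
        if np.std(block, ddof=1) < threshold:
            return i - 1          # end point reached at this sample
    return None                   # end point not reached

# Simulated in-line predictions: concentration rises, then plateaus.
rng = np.random.default_rng(0)
t = np.arange(120)
conc = 1.0 - np.exp(-t / 25.0) + rng.normal(0, 0.003, t.size)
print("End point at sample:", mbsd_endpoint(conc))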
Levy, J C; Davies, M J; Holman, R R
2017-09-01
Hypoglycaemia is a significant risk in insulin-treated type 2 diabetes and has been associated with future risk of cardiovascular events. We compared the frequency of low-glucose events using continuous glucose monitoring (CGM) with that of self-reported hypoglycaemic events at the end of the first and third years of the Treating to Target in Type 2 Diabetes Trial (4-T), which compared biphasic, prandial and basal insulin regimens added to sulfonylurea and metformin. CGM using a Medtronic Gold system was performed in a subgroup of 4-T participants. CGM-detected low-glucose events were defined at thresholds of ≤3.0 (CGM3.0) and ≤2.2 (CGM2.2) mmol/l. Of the 110 participants, 106 and 70 had CGM analysable data at the end of years 1 and 3 respectively. In both years, the frequency of CGM-detected low-glucose events was several fold higher than that of self-reported hypoglycaemia (symptoms with blood glucose less than 3.1 mmol/l [<56 mg/dl]). At the end of the first year, CGM3.0 and CGM2.2 mean (95% CI) event frequencies, expressed as events per participant per year, were 120 (85, 155) and 41 (21, 61) compared with 17 (8, 29) self-reported events during CGM, each p=0.001. The disparity at the end of the third year was similar. These data demonstrate the likely under-reporting of hypoglycaemia and of potential hypoglycaemia unawareness in clinical trials. The clinical implications of these findings need to be explored further (ISRCTN No ISRCTN51125379).
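Counting CGM-detected events at a given threshold reduces to finding excursions below that threshold in the glucose trace. The sketch below is an assumption-laden illustration (one event per contiguous run of readings at or below threshold, a synthetic 5-minute trace), not the 4-T analysis plan.

import numpy as np

def count_low_glucose_events(glucose_mmol_l, threshold):
    """Count excursions below a threshold: each contiguous run of
    readings at or below the threshold is one event."""
    below = np.asarray(glucose_mmol_l) <= threshold
    # An event starts where 'below' switches from False to True.
    starts = below & ~np.concatenate(([False], below[:-1]))
    return int(starts.sum())

# Simulated 24 h of 5-minute CGM readings around 6 mmol/l with random dips.
rng = np.random.default_rng(1)
trace = 6 + rng.normal(0, 1.5, 288)
for thr in (3.0, 2.2):
    print(f"Events <= {thr} mmol/l:", count_low_glucose_events(trace, thr))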
High resolution in situ ultrasonic corrosion monitor
Grossman, R.J.
1984-01-10
An ultrasonic corrosion monitor is provided which produces an in situ measurement of the amount of corrosion of a monitoring zone or zones of an elongate probe placed in the corrosive environment. A monitoring zone is preferably formed between the end of the probe and the junction of the zone with a lead-in portion of the probe. Ultrasonic pulses are applied to the probe and a determination made of the time interval between pulses reflected from the end of the probe and the junction referred to, both when the probe is uncorroded and while it is corroding. Corresponding electrical signals are produced and a value for the normalized transit time delay derived from these time interval measurements is used to calculate the amount of corrosion.
High resolution in situ ultrasonic corrosion monitor
Grossman, Robert J.
1985-01-01
An ultrasonic corrosion monitor is provided which produces an in situ measurement of the amount of corrosion of a monitoring zone or zones of an elongate probe placed in the corrosive environment. A monitoring zone is preferably formed between the end of the probe and the junction of the zone with a lead-in portion of the probe. Ultrasonic pulses are applied to the probe and a determination made of the time interval between pulses reflected from the end of the probe and the junction referred to, both when the probe is uncorroded and while it is corroding. Corresponding electrical signals are produced and a value for the normalized transit time delay derived from these time interval measurements is used to calculate the amount of corrosion.
Remote health monitoring using mobile phones and Web services.
Agarwal, Sparsh; Lau, Chiew Tong
2010-06-01
Diabetes and hypertension have become very common perhaps because of increasingly busy lifestyles, unhealthy eating habits, and a highly competitive workplace. The rapid advancement of mobile communication technologies offers innumerable opportunities for the development of software and hardware applications for remote monitoring of such chronic diseases. This study describes a remote health-monitoring service that provides an end-to-end solution, that is, (1) it collects blood pressure readings from the patient through a mobile phone; (2) it provides these data to doctors through a Web interface; and (3) it enables doctors to manage the chronic condition by providing feedback to the patients remotely. This article also aims at understanding the requirements and expectations of doctors and hospitals from such a remote health-monitoring service.
An Ontology-based Context-aware System for Smart Homes: E-care@home.
Alirezaie, Marjan; Renoux, Jennifer; Köckemann, Uwe; Kristoffersson, Annica; Karlsson, Lars; Blomqvist, Eva; Tsiftes, Nicolas; Voigt, Thiemo; Loutfi, Amy
2017-07-06
Smart home environments have a significant potential to provide for long-term monitoring of users with special needs in order to promote the possibility to age at home. Such environments are typically equipped with a number of heterogeneous sensors that monitor both health and environmental parameters. This paper presents a framework called E-care@home, consisting of an IoT infrastructure, which provides information with an unambiguous, shared meaning across IoT devices, end-users, relatives, health and care professionals and organizations. We focus on integrating measurements gathered from heterogeneous sources by using ontologies in order to enable semantic interpretation of events and context awareness. Activities are deduced using an incremental answer set solver for stream reasoning. The paper demonstrates the proposed framework using an instantiation of a smart environment that is able to perform context recognition based on the activities and the events occurring in the home.
An Ontology-based Context-aware System for Smart Homes: E-care@home
Alirezaie, Marjan; Köckemann, Uwe; Kristoffersson, Annica; Karlsson, Lars; Blomqvist, Eva; Voigt, Thiemo; Loutfi, Amy
2017-01-01
Smart home environments have a significant potential to provide for long-term monitoring of users with special needs in order to promote the possibility to age at home. Such environments are typically equipped with a number of heterogeneous sensors that monitor both health and environmental parameters. This paper presents a framework called E-care@home, consisting of an IoT infrastructure, which provides information with an unambiguous, shared meaning across IoT devices, end-users, relatives, health and care professionals and organizations. We focus on integrating measurements gathered from heterogeneous sources by using ontologies in order to enable semantic interpretation of events and context awareness. Activities are deduced using an incremental answer set solver for stream reasoning. The paper demonstrates the proposed framework using an instantiation of a smart environment that is able to perform context recognition based on the activities and the events occurring in the home. PMID:28684686
Source-Constrained Recall: Front-End and Back-End Control of Retrieval Quality
ERIC Educational Resources Information Center
Halamish, Vered; Goldsmith, Morris; Jacoby, Larry L.
2012-01-01
Research on the strategic regulation of memory accuracy has focused primarily on monitoring and control processes used to edit out incorrect information after it is retrieved (back-end control). Recent studies, however, suggest that rememberers also enhance accuracy by preventing the retrieval of incorrect information in the first place (front-end…
Business Case for Nonintrusive Load Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baechler, Michael C.; Hao, He
This report explores how utilities, researchers, and consumers could benefit from a lower cost approach to submetering using non-intrusive load monitoring (NILM). NILM is a process of using data from a single point of monitoring, such as a utility smart meter, to provide an itemized accounting of end use energy consumption in residential and small commercial buildings. Pacific Northwest National Laboratory (PNNL) prepared this report for the Bonneville Power Administration (BPA). PNNL participated in an advisory group as part of a research project sponsored by the U.S. Department of Energy (DOE), the Bonneville Power Administration, and the State of Washington. The Electric Power Research Institute (EPRI) convened the advisory committee for two workshops held to identify ways in which NILM may be used. PNNL, on behalf of DOE, helped to cosponsor the first of these workshops. Results of an end-use monitoring study of a bank branch conducted by PNNL are also presented for purposes of illustrating the need for better data in energy savings modeling (DOE 2013a).
Actionable Science in the Gulf of Mexico: Connecting Researchers and Resource Managers
NASA Astrophysics Data System (ADS)
Lartigue, J.; Parker, F.; Allee, R.; Young, C.
2017-12-01
The National Oceanic and Atmospheric Administration (NOAA) RESTORE Science Program was established in the wake of the Deepwater Horizon oil spill to carry out research, observation, and monitoring to support the long-term sustainability of the Gulf of Mexico ecosystem, including its fisheries. Administered in partnership with the US Fish and Wildlife Service, the Science Program emphasizes a connection between science and decision-making. This emphasis translated into an engagement process that allowed resource managers and other users of information about the ecosystem to provide direct input into the science plan for the program. In developing funding opportunities, the Science Program uses structured conversations with resource managers and other decision makers to focus competitions on specific end user needs. When evaluating proposals for funding, the Science Program uses criteria that focus on applicability of a project's findings and products, end user involvement in project planning, and the approach for transferring findings and products to the end user. By including resource managers alongside scientific experts on its review panels, the Science Program ensures that these criteria are assessed from both the researcher and end user perspectives. Once funding decisions are made, the Science Program assigns a technical monitor to each award to assist with identifying and engaging end users. Sharing of best practices among the technical monitors has provided the Science Program insight on how best to bridge the gap between research and resource management and how to build successful scientist-decision maker partnerships. During the presentation, we will share two case studies: 1) design of a cooperative (fisheries scientists, fisheries managers, and fishers), Gulf-wide conservation and monitoring program for fish spawning aggregations and 2) development of habitat-specific ecosystem indicators for use by federal and state resource managers.
High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas
2017-04-01
Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise mapping application is composed of four principal modules: (1) pre-processing of raw data, (2) massive cross-correlation, (3) post-processing of correlation data based on computation of the logarithmic energy ratio and (4) generation of source maps from post-processed data. Implementation of the solution posed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
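The computational core of such a pipeline (cross-correlating pairs of station records and summarizing the asymmetry of each correlation with a logarithmic energy ratio) can be sketched briefly. This is a schematic Python illustration under simplified assumptions (synthetic one-hour traces, no pre-processing, no particular lag-sign convention), not the project's actual code.

import numpy as np

def cross_correlate(a, b):
    """Full cross-correlation of two equally long, demeaned noise records."""
    a = a - a.mean()
    b = b - b.mean()
    return np.correlate(a, b, mode="full")

def log_energy_ratio(ccf):
    """Logarithmic ratio of energy in the positive-lag vs. negative-lag
    branch, a simple asymmetry measure related to noise-source direction."""
    n = len(ccf) // 2
    negative, positive = ccf[:n], ccf[n + 1:]
    return np.log((positive ** 2).sum() / (negative ** 2).sum())

rng = np.random.default_rng(2)
sta1 = rng.normal(size=3600)   # one hour of noise at 1 Hz, station 1
sta2 = rng.normal(size=3600)   # one hour of noise at 1 Hz, station 2
print("Energy ratio:", log_energy_ratio(cross_correlate(sta1, sta2)))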
A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth
2005-03-15
The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.
Monitoring the Low-Energy Gamma-Ray Sky Using Earth Occultation with GLAST GBM
NASA Technical Reports Server (NTRS)
Case, G.; Wilson-Hodge, C.; Cherry, M.; Kippen, M.; Ling, J.; Radocinski, R.; Wheaton, W.
2007-01-01
Long term all-sky monitoring of the 20 keV - 2 MeV gamma-ray sky using the Earth occultation technique was demonstrated by the BATSE instrument on the Compton Gamma Ray Observatory. The principles and techniques used for the development of an end-to-end earth occultation data analysis system for BATSE can be extended to the GLAST Gamma-ray Burst Monitor (GBM), resulting in multiband light curves and time-resolved spectra in the energy range 8 keV to above 1 MeV for known gamma-ray sources and transient outbursts, as well as the discovery of new sources of gamma-ray emission. In this paper we describe the application of the technique to the GBM. We also present the expected sensitivity for the GBM.
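The essence of the Earth-occultation technique is to estimate a source's contribution from the step in detector count rate when the source sets behind (or rises above) the Earth's limb. The Python sketch below is a deliberately simplified illustration (sharp step, constant background, synthetic rates, known occultation time), not the BATSE/GBM analysis system.

import numpy as np

def occultation_step(rates, times, t_occult, window=50.0):
    """Estimate the source rate as the difference between the mean detector
    rate just before and just after an occultation step at time t_occult."""
    times = np.asarray(times)
    rates = np.asarray(rates)
    pre = rates[(times > t_occult - window) & (times < t_occult)]
    post = rates[(times > t_occult) & (times < t_occult + window)]
    return pre.mean() - post.mean()

# Synthetic light curve: 300 counts/s background plus a 40 counts/s source
# that is occulted at t = 500 s.
rng = np.random.default_rng(3)
t = np.arange(0.0, 1000.0, 1.0)
rate = 300 + 40 * (t < 500) + rng.normal(0, 5, t.size)
print("Estimated source rate (counts/s):", occultation_step(rate, t, t_occult=500.0))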
NASA Technical Reports Server (NTRS)
Banks, Bruce A. (Inventor)
2008-01-01
Disclosed is a method of producing cones and pillars on polymethylmethacrylate (PMMA) optical fibers for glucose monitoring. The method, in one embodiment, consists of using electron beam evaporation to deposit a non-contiguous thin film of aluminum on the distal ends of the PMMA fibers. The partial coverage of aluminum on the fibers is randomly, but rather uniformly, distributed across the end of the optical fibers. After the aluminum deposition, the ends of the fibers are then exposed to hyperthermal atomic oxygen, which oxidizes the areas that are not protected by aluminum. The resulting PMMA fibers have a greatly increased surface area, and the cones or pillars are sufficiently close together that the cellular components in blood are excluded from passing into the valleys between the cones and pillars. The optical fibers are then coated with appropriate surface chemistry so that they can optically sense the glucose level in the blood sample more effectively than conventional glucose monitoring.
Calibration and validation of wearable monitors.
Bassett, David R; Rowlands, Alex; Trost, Stewart G
2012-01-01
Wearable monitors are increasingly being used to objectively monitor physical activity in research studies within the field of exercise science. Calibration and validation of these devices are vital to obtaining accurate data. This article is aimed primarily at the physical activity measurement specialist, although the end user who is conducting studies with these devices also may benefit from knowing about this topic. Initially, wearable physical activity monitors should undergo unit calibration to ensure interinstrument reliability. The next step is to simultaneously collect both raw signal data (e.g., acceleration) from the wearable monitors and rates of energy expenditure, so that algorithms can be developed to convert the direct signals into energy expenditure. This process should use multiple wearable monitors and a large and diverse subject group and should include a wide range of physical activities commonly performed in daily life (from sedentary to vigorous). New methods of calibration now use "pattern recognition" approaches to train the algorithms on various activities, and they provide estimates of energy expenditure that are much better than those previously available with the single-regression approach. Once a method of predicting energy expenditure has been established, the next step is to examine its predictive accuracy by cross-validating it in other populations. In this article, we attempt to summarize the best practices for calibration and validation of wearable physical activity monitors. Finally, we conclude with some ideas for future research that will move the field of physical activity measurement forward.
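The single-regression calibration step described above (map accelerometer output to measured energy expenditure, then cross-validate the model) might look like the sketch below. The linear model, synthetic counts, and MET values are illustrative assumptions; pattern-recognition calibrations would replace the regressor with a richer, activity-aware model.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
counts = rng.uniform(0, 8000, 200)                       # accelerometer counts/min
mets = 1.0 + 0.0009 * counts + rng.normal(0, 0.4, 200)   # criterion energy expenditure (METs)

# Cross-validated accuracy of the calibration model.
model = LinearRegression()
scores = cross_val_score(model, counts.reshape(-1, 1), mets, cv=5,
                         scoring="neg_root_mean_squared_error")
print("Cross-validated RMSE (METs):", -scores.mean())

# Fit on the full calibration sample and predict a new wear period.
model.fit(counts.reshape(-1, 1), mets)
print("Predicted METs at 3000 counts/min:", model.predict([[3000.0]])[0])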
Active Sites Environmental Monitoring Program: Mid-FY 1991 report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashwood, T.L.; Wickliff, D.S.; Morrissey, C.M.
1991-10-01
This report summarizes the activities of the Active Sites Environmental Monitoring Program (ASEMP) from October 1990 through March 1991. The ASEMP was established in 1989 by Solid Waste Operations and the Environmental Sciences Division to provide early detection and performance monitoring at active low-level radioactive waste (LLW) disposal sites in Solid Waste Storage Area (SWSA) 6 and transuranic (TRU) waste storage sites in SWSA 5 as required by chapters II and III of US Department of Energy Order 5820.2A. Monitoring results continue to demonstrate that no LLW is being leached from the storage vaults on the tumulus pads. Loading of vaults on Tumulus II began during this reporting period and 115 vaults had been loaded by the end of March 1991.
Long-term microfluidic glucose and lactate monitoring in hepatic cell culture
Prill, Sebastian; Jaeger, Magnus S.; Duschl, Claus
2014-01-01
Monitoring cellular bioenergetic pathways provides the basis for a detailed understanding of the physiological state of a cell culture. Therefore, it is widely used as a tool amongst others in the field of in vitro toxicology. The resulting metabolic information allows for performing in vitro toxicology assays for assessing drug-induced toxicity. In this study, we demonstrate the value of a microsystem for the fully automated detection of drug-induced changes in cellular viability by continuous monitoring of the metabolic activity over several days. To this end, glucose consumption and lactate secretion of a hepatic tumor cell line were continuously measured using microfluidically addressed electrochemical sensors. Adapting enzyme-based electrochemical flat-plate sensors, originally designed for human whole-blood samples, to their use with cell culture medium supersedes the common manual and laborious colorimetric assays and off-line operated external measurement systems. The cells were exposed to different concentrations of the mitochondrial inhibitor rotenone and the cellular response was analyzed by detecting changes in the rates of the glucose and lactate metabolism. Thus, the system provides real-time information on drug-induced liver injury in vitro. PMID:24926387
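The read-out in such a system amounts to converting the measured glucose and lactate concentration traces into consumption and secretion rates. The sketch below is a minimal, assumption-based illustration (known medium volume and cell number, linear fit over the monitoring window, made-up values), not the published analysis.

import numpy as np

def metabolic_rate(concentration_mM, times_h, volume_mL, n_cells):
    """Least-squares slope of a concentration trace converted to
    nmol per hour per 10^6 cells (negative values indicate consumption)."""
    slope_mM_per_h = np.polyfit(times_h, concentration_mM, 1)[0]
    nmol_per_h = slope_mM_per_h * volume_mL * 1e3   # mM * mL = umol; * 1e3 -> nmol
    return nmol_per_h / (n_cells / 1e6)

t = np.linspace(0, 48, 97)                 # 48 h of readings every 30 min
glucose = 10.0 - 0.05 * t                  # mM, consumed
lactate = 1.0 + 0.08 * t                   # mM, secreted
print("Glucose rate:", metabolic_rate(glucose, t, volume_mL=2.0, n_cells=5e5), "nmol/h/1e6 cells")
print("Lactate rate:", metabolic_rate(lactate, t, volume_mL=2.0, n_cells=5e5), "nmol/h/1e6 cells")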
Performance Characteristic Mems-Based IMUs for UAVs Navigation
NASA Astrophysics Data System (ADS)
Mohamed, H. A.; Hansen, J. M.; Elhabiby, M. M.; El-Sheimy, N.; Sesay, A. B.
2015-08-01
Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software development have made it possible to conduct accurate 3D mapping without using costly and high-end data acquisition systems. Low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that can provide additional economical and practical advantages. However, such economical and practical requirements need navigation systems that can provide uninterrupted navigation solution. Hence, testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or low cost navigation sensors for various UAV applications is important research. This work focuses on studying the performance characteristics under different manoeuvres using inertial measurements integrated with single point positioning, Real-Time-Kinematic (RTK), and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outage.
Singh, Ravendra; Román-Ospino, Andrés D; Romañach, Rodolfo J; Ierapetritou, Marianthi; Ramachandran, Rohit
2015-11-10
The pharmaceutical industry is strictly regulated, where precise and accurate control of the end product quality is necessary to ensure the effectiveness of the drug products. For such control, the process and raw materials variability ideally need to be fed forward in real time into an automatic control system so that a proactive action can be taken before it can affect the end product quality. Variations in raw material properties (e.g., particle size), feeder hopper level, amount of lubrication, milling and blending action, and applied shear in different processing stages can affect the blend density significantly and thereby tablet weight, hardness and dissolution. Therefore, real-time monitoring of powder bulk density variability and its incorporation into the automatic control system, so that its effect can be mitigated proactively and efficiently, is highly desired. However, real-time monitoring of powder bulk density is still a challenging task because of several levels of complexity. In this work, powder bulk density, which has a significant effect on the critical quality attributes (CQAs), has been monitored in real time in a pilot-plant facility using a NIR sensor. The sensitivity of the powder bulk density to critical process parameters (CPPs) and CQAs has been analyzed, and a feed-forward controller has been designed accordingly. The measured signal can be used for feed-forward control so that corrective actions on the density variations can be taken before they can influence the product quality. The coupled feed-forward/feed-back control system demonstrates improved control performance and improvements in the final product quality in the presence of process and raw material variations.
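The feed-forward idea (adjust a downstream set point from the measured bulk density before the disturbance reaches the product) can be sketched in a few lines. The linear compensation law, gain, and nominal values below are illustrative assumptions for a tablet-press fill depth, not the authors' controller design.

def feed_forward_fill_depth(measured_density_g_ml,
                            nominal_density_g_ml=0.55,
                            nominal_fill_depth_mm=10.0):
    """Adjust the tablet-press fill depth so that tablet weight stays roughly
    constant when the NIR-measured powder bulk density drifts.
    weight ~ density * fill_depth, so depth is scaled inversely with density."""
    return nominal_fill_depth_mm * nominal_density_g_ml / measured_density_g_ml

for rho in (0.50, 0.55, 0.60):             # simulated NIR bulk density readings
    print(f"density {rho:.2f} g/mL -> fill depth {feed_forward_fill_depth(rho):.2f} mm")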
Data Quality Monitoring System for New GEM Muon Detectors for the CMS Experiment Upgrade
NASA Astrophysics Data System (ADS)
King, Robert; CMS Muon Group Team
2017-01-01
The Gas Electron Multiplier (GEM) detectors are novel detectors designed to improve the muon trigger and tracking performance in CMS experiment for the high luminosity upgrade of the LHC. Partial installation of GEM detectors is planned during the 2016-2017 technical stop. Before the GEM system is installed underground, its data acquisition (DAQ) electronics must be thoroughly tested. The DAQ system includes several commercial and custom-built electronic boards running custom firmware. The front-end electronics are radiation-hard and communicate via optical fibers. The data quality monitoring (DQM) software framework has been designed to provide online verification of the integrity of the data produced by the detector electronics, and to promptly identify potential hardware or firmware malfunctions in the system. Local hits reconstruction and clustering algorithms allow quality control of the data produced by each GEM chamber. Once the new detectors are installed, the DQM will monitor the stability and performance of the system during normal data-taking operations. We discuss the design of the DQM system, the software being developed to read out and process the detector data, and the methods used to identify and report hardware and firmware malfunctions of the system.
USDA-ARS?s Scientific Manuscript database
The Soil Moisture and Ocean Salinity satellite (SMOS) was launched in November 2009 and started delivering data in January 2010. The commissioning phase ended in May 2010. Subsequently, the satellite has been in operation for over 5 years while the retrieval algorithms from Level 1 to Level 2 underw...
Williams, Jeni
2007-10-01
Strategies for improving the consumer service skills of finance staff include: Hire employees who have a customer service background. Work with your human resources department to provide customer service training. Monitor new hires extensively. Offer front-end employees scripted language for situations they may face on the job. Measure the quality of customer service provided. Provide incentives for performance.
Using Trust to Establish a Secure Routing Model in Cognitive Radio Network.
Zhang, Guanghua; Chen, Zhenguo; Tian, Liqin; Zhang, Dongwen
2015-01-01
Specific to the selective forwarding attack on routing in cognitive radio networks, this paper proposes a trust-based secure routing model. By monitoring nodes' forwarding behaviors, trusts of nodes are constructed to identify malicious nodes. Considering that routing selection must be closely coordinated with spectrum allocation, a route request piggybacking available spectrum opportunities is sent to non-malicious nodes. In the routing decision phase, nodes' trusts are used to construct available path trusts, and delay measurement is combined for making routing decisions. At the same time, according to the trust classification, different responses are made to nodes' service requests. By adopting stricter punishment of malicious behaviors from non-trusted nodes, the cooperation of nodes in routing can be stimulated. Simulation results and analysis indicate that this model has good performance in network throughput and end-to-end delay under the selective forwarding attack.
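One common way to turn observed forwarding behaviour into node and path trusts is a beta-reputation score combined with a multiplicative path trust; the sketch below uses that assumed form and made-up observation counts, not necessarily the paper's exact formulas.

def node_trust(forwarded, dropped):
    """Beta-reputation style trust from observed forwarding behaviour."""
    return (forwarded + 1) / (forwarded + dropped + 2)

def path_trust(path, observations):
    """Path trust as the product of the trusts of its intermediate nodes."""
    trust = 1.0
    for node in path:
        forwarded, dropped = observations.get(node, (0, 0))
        trust *= node_trust(forwarded, dropped)
    return trust

observations = {"B": (95, 5), "C": (40, 60), "D": (88, 2)}   # (forwarded, dropped) packets
candidates = [["B", "D"], ["C", "D"]]
best = max(candidates, key=lambda p: path_trust(p, observations))
print("Selected route via:", best)    # avoids the selectively forwarding node C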
Neonatal euthanasia: lessons from the Groningen Protocol.
Eduard Verhagen, A A
2014-10-01
Decisions about neonatal end-of-life care have been studied intensely over the last 20 years in The Netherlands. Nationwide surveys were done to quantify these decisions, provide details and monitor the effect of guidelines, new regulations and other interventions. One of those interventions was the Groningen Protocol for newborn euthanasia in severely ill newborns, published in 2005. Before publication, an estimated 20 cases of euthanasia per year were performed. After publication, only two cases in five years were reported. Studies suggested that this might be partly caused by the lack of consensus about the dividing line between euthanasia and palliative care. New recommendations about paralytic medication use in dying newborns were issued to increase transparency and to improve reporting of euthanasia. New surveys will be needed to measure the effects of these interventions. This cycle of interventions and measurements seems useful for continuous improvement of end-of-life care in newborns.
Curtright, J W; Stolp-Smith, S C; Edell, E S
2000-01-01
Managing and measuring performance become exceedingly complex as healthcare institutions evolve into integrated health systems comprised of hospitals, outpatient clinics and surgery centers, nursing homes, and home health services. Leaders of integrated health systems need to develop a methodology and system that align organizational strategies with performance measurement and management. To this end, multiple healthcare organizations embrace the performance-indicators reporting system known as a "balanced scorecard" or a "dashboard report." This discrete set of macrolevel indicators gives senior management a fast but comprehensive glimpse of the organization's performance in meeting its quality, operational, and financial goals. The leadership of outpatient operations for Mayo Clinic in Rochester, Minnesota, built on this concept by creating a performance management and measurement system that monitors and reports how well the organization achieves its performance goals. Internal stakeholders identified metrics to measure performance in each key category. Through these metrics, the organization links Mayo Clinic's vision, primary value, core principles, and day-to-day operations by monitoring key performance indicators on a weekly, monthly, or quarterly basis.
Giddings, R P; Hugues-Salas, E; Tang, J M
2012-08-27
Record high 19.125 Gb/s real-time end-to-end dual-band optical OFDM (OOFDM) transmission is experimentally demonstrated, for the first time, in a simple electro-absorption modulated laser (EML)-based 25 km standard SMF system using intensity modulation and direct detection (IMDD). Adaptively modulated baseband (0-2 GHz) and passband (6.125 ± 2 GHz) OFDM RF sub-bands, supporting line rates of 10 Gb/s and 9.125 Gb/s respectively, are independently generated and detected with FPGA-based DSP clocked at only 100 MHz and DACs/ADCs operating at sampling speeds as low as 4 GS/s. The two OFDM sub-bands are electrically frequency-division-multiplexed (FDM) for intensity modulation of a single optical carrier by an EML. To maximize and balance the signal transmission performance of each sub-band, on-line adaptive features and on-line performance monitoring are fully exploited to optimize key OOFDM transceiver and system parameters, which include subcarrier characteristics within each individual OFDM sub-band, total and relative sub-band power, as well as EML operating conditions. The achieved 19.125 Gb/s over 25 km SMF OOFDM transmission system has an optical power budget of 13.5 dB, and shows almost identical bit error rate (BER) performances for both the baseband and passband signals. In addition, experimental investigations also indicate that the maximum achievable transmission capacity of the present system is mainly determined by the EML frequency chirp-enhanced chromatic dispersion effect, and the passband BER performance is not affected by the two sub-band-induced intermixing effect, which, however, gives a 1.2 dB optical power penalty to the baseband signal transmission.
The GÉANT network: addressing current and future needs of the HEP community
NASA Astrophysics Data System (ADS)
Capone, Vincenzo; Usman, Mian
2015-12-01
The GÉANT infrastructure is the backbone that serves the scientific communities in Europe for their data movement needs and their access to international research and education networks. Using its extensive fibre footprint and infrastructure in Europe, the GÉANT network delivers a portfolio of services aimed to best fit the specific needs of the users, including Authentication and Authorization Infrastructure, end-to-end performance monitoring, and advanced network services (dynamic circuits, L2-L3VPN, MD-VPN). This talk will outline the factors that help the GÉANT network respond to the needs of the High Energy Physics community, both in Europe and worldwide. The Pan-European network provides the connectivity between 40 European national research and education networks. In addition, GÉANT also connects the European NRENs to the R&E networks in other world regions and reaches over 110 NRENs worldwide, making GÉANT the best connected research and education network, with multiple intercontinental links to different continents, e.g. North and South America, Africa and Asia-Pacific. The High Energy Physics computational needs have always had (and will keep having) a leading role among the scientific user groups of the GÉANT network: the LHCONE overlay network has been built, in collaboration with the other major world RENs, specifically to address the peculiar needs of LHC data movement. Recently, as a result of a series of coordinated efforts, the LHCONE network has been expanded to the Asia-Pacific area, and is going to include some of the main regional R&E networks in the area. The LHC community is not the only one that is actively using a distributed computing model (hence the need for a high-performance network); new communities are arising, such as BELLE II. GÉANT is also deeply involved with the BELLE II experiment, providing full support for its distributed computing model, along with a perfSONAR-based network monitoring system. GÉANT has also coordinated the setup of the network infrastructure to perform the BELLE II Trans-Atlantic Data Challenge, and has been active in helping the BELLE II community sort out their end-to-end performance issues. In this talk we will provide information about the current GÉANT network architecture and international connectivity, along with the upcoming upgrades and the planned and foreseeable improvements. We will also describe the implementation of the solutions provided to support the LHC and BELLE II experiments.
Contamination monitoring approaches for EUV space optics
NASA Technical Reports Server (NTRS)
Ray, David C.; Malina, Roger F.; Welsh, Barry J.; Battel, Steven J.
1989-01-01
Data from contaminant-induced UV optics degradation studies and particulate models are used here to develop end-of-service-life instrument contamination requirements which are very stringent but achievable. The budget is divided into allocations for each phase of hardware processing. Optical and nonoptical hardware are monitored for particulate and molecular contamination during initial cleaning and baking, assembly, test, and calibration phases. The measured contamination levels are compared to the requirements developed for each phase to provide confidence that the required end-of-life levels will be met.
Multisource, Phase-controlled Radiofrequency for Treatment of Skin Laxity
Moreno-Moraga, Javier; Muñoz, Estefania; Cornejo Navarro, Paloma
2011-01-01
Objective: The objective of this study was to analyze the correlation between degrees of clinical improvement and microscopic changes detected using confocal microscopy at the temperature gradients reached in patients treated for skin laxity with a phase-controlled, multisource radiofrequency system. Design and setting: Patients with skin laxity in the abdominal area were treated in six sessions with radiofrequency (the first 4 sessions were held at 2-week intervals and the 2 remaining sessions at 3-week intervals). Patients attended monitoring at 6, 9, and 12 months. Participants: 33 patients (all women). Measurements: The authors recorded the following: variations in weight, measurements of the contour of the treated area and control area, evaluation of clinical improvement by the clinician and by the patient, images taken using an infrared camera, temperature (before, immediately after, and 20 minutes after the procedure), and confocal microscopy images (before treatment and at 6, 9, and 12 months). The degree of clinical improvement was contrasted by two external observers (clinicians). The procedure was performed using a new phase-controlled, multipolar radiofrequency system. Results: The results reveal a greater degree of clinical improvement in patients with surface temperature increases greater than 11.5°C at the end of the procedure and remaining greater than 4.5°C 20 minutes later. These changes induced by radiofrequency were contrasted with the structural improvements observed at the dermal-epidermal junction using confocal microscopy. Changes are more intense and are statistically correlated with patients who show a greater degree of improvement and have higher temperature gradients at the end of the procedure and 20 minutes later. Conclusion: Monitoring and the use of parameters to evaluate end-point values in skin quality treatment by multisource, phase-controlled radiofrequency can help optimize aesthetic outcome. PMID:21278896
Graphene Nanoprobes for Real-Time Monitoring of Isothermal Nucleic Acid Amplification.
Li, Fan; Liu, Xiaoguo; Zhao, Bin; Yan, Juan; Li, Qian; Aldalbahi, Ali; Shi, Jiye; Song, Shiping; Fan, Chunhai; Wang, Lihua
2017-05-10
Isothermal amplification is an efficient way to amplify DNA with high accuracy; however, real-time monitoring for quantification has mostly relied on expensive and precisely designed probes. In the present study, a graphene oxide (GO)-based nanoprobe was used to monitor the isothermal amplification process in real time. The interaction between GO and different DNA structures was systematically investigated, including single-stranded DNA (ssDNA), double-stranded DNA (dsDNA), DNA 3-helix, and long rolling circle amplification (RCA) and hybridization chain reaction (HCR) products, which exist as one-, two-, and three-dimensional structures. It was found that the highly rigid structures exhibited much lower affinity for GO than soft ssDNA, and generally the rigidity depended on the length of the targets and the hybridization position with the probe DNA. On the basis of these results, we successfully monitored the HCR amplification process, the RCA process, and the enzyme restriction of RCA products with the GO nanoprobe; other applications, including detection of the assembly/disassembly of DNA 3-helix structures, were also performed. Compared to the widely used end-point detection methods, the GO-based sensing platform is simple, sensitive, cost-effective, and, notably, operates in a real-time monitoring mode. We believe such studies can provide comprehensive understanding of, and guidance for, the design of GO-based biosensors for broad application in various fields.
Motor recovery monitoring using acceleration measurements in post acute stroke patients.
Gubbi, Jayavardhana; Rao, Aravinda S; Fang, Kun; Yan, Bernard; Palaniswami, Marimuthu
2013-04-16
Stroke is one of the major causes of morbidity and mortality. Its recovery and treatment depends on close clinical monitoring by a clinician especially during the first few hours after the onset of stroke. Patients who do not exhibit early motor recovery post thrombolysis may benefit from more aggressive treatment. A novel approach for monitoring stroke during the first few hours after the onset of stroke using a wireless accelerometer based motor activity monitoring system is developed. It monitors the motor activity by measuring the acceleration of the arms in three axes. In the presented proof of concept study, the measured acceleration data is transferred wirelessly using iMote2 platform to the base station that is equipped with an online algorithm capable of calculating an index equivalent to the National Institute of Health Stroke Score (NIHSS) motor index. The system is developed by collecting data from 15 patients. We have successfully demonstrated an end-to-end stroke monitoring system reporting an accuracy of calculating stroke index of more than 80%, highest Cohen's overall agreement of 0.91 (with excellent κ coefficient of 0.76). A wireless accelerometer based 'hot stroke' monitoring system is developed to monitor the motor recovery in acute-stroke patients. It has been shown to monitor stroke patients continuously, which has not been possible so far with high reliability.
Motor recovery monitoring using acceleration measurements in post acute stroke patients
2013-01-01
Background Stroke is one of the major causes of morbidity and mortality. Its recovery and treatment depends on close clinical monitoring by a clinician especially during the first few hours after the onset of stroke. Patients who do not exhibit early motor recovery post thrombolysis may benefit from more aggressive treatment. Method A novel approach for monitoring stroke during the first few hours after the onset of stroke using a wireless accelerometer based motor activity monitoring system is developed. It monitors the motor activity by measuring the acceleration of the arms in three axes. In the presented proof of concept study, the measured acceleration data is transferred wirelessly using iMote2 platform to the base station that is equipped with an online algorithm capable of calculating an index equivalent to the National Institute of Health Stroke Score (NIHSS) motor index. The system is developed by collecting data from 15 patients. Results We have successfully demonstrated an end-to-end stroke monitoring system reporting an accuracy of calculating stroke index of more than 80%, highest Cohen’s overall agreement of 0.91 (with excellent κ coefficient of 0.76). Conclusion A wireless accelerometer based ‘hot stroke’ monitoring system is developed to monitor the motor recovery in acute-stroke patients. It has been shown to monitor stroke patients continuously, which has not been possible so far with high reliability. PMID:23590690
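A simple way to picture the acceleration-based motor index described above is to reduce the tri-axial arm acceleration to a per-epoch activity magnitude that could then be trended against an NIHSS-like motor score. The Python sketch below is only an illustration under assumed parameters (50 Hz sampling, gravity subtraction, one-minute epochs, synthetic data); it is not the published algorithm.

import numpy as np

def activity_index(ax, ay, az, fs=50.0, epoch_s=60.0):
    """Mean gravity-subtracted acceleration magnitude per epoch (in g)."""
    mag = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
    mag = np.abs(mag - 1.0)                       # remove the static 1 g component
    samples = int(fs * epoch_s)
    n_epochs = len(mag) // samples
    return mag[:n_epochs * samples].reshape(n_epochs, samples).mean(axis=1)

rng = np.random.default_rng(5)
n = int(50 * 60 * 10)                             # 10 minutes of data at 50 Hz
ax = rng.normal(0, 0.05, n)
ay = rng.normal(0, 0.05, n)
az = 1.0 + rng.normal(0, 0.05, n)                 # z-axis carries gravity at rest
print("Per-minute activity index (g):", np.round(activity_index(ax, ay, az), 3))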
Kress, B C; Mizrahi, I A; Armour, K W; Marcus, R; Emkey, R D; Santora, A C
1999-07-01
Biochemical bone markers are sensitive to the changes in bone turnover that result from treatment of postmenopausal osteoporotic women with antiresorptive therapies. Although information is available on the use of bone markers in monitoring therapy in groups of subjects, less is known regarding how these markers perform in individual patients. Serum bone alkaline phosphatase (bone ALP) concentrations, measured with the Tandem(R) Ostase(R) assay, were used to monitor the biochemical response of bone in postmenopausal women with osteoporosis receiving either 10 mg/day alendronate therapy (n = 74) or calcium supplementation (n = 148) for 24 months. Bone ALP decreased significantly from baseline at 3 months (P =0.0001), reaching a nadir between 3 and 6 months of alendronate therapy. The magnitude of the bone ALP decrease in the treated osteoporotic population was consistent with normalization to premenopausal concentrations. Of the 74 alendronate-treated subjects, 63 (85.1%) demonstrated a decrease from baseline in bone ALP by 6 months that exceeded the least significant change of 25%. The bone ALP decrease from baseline exceeded 25% in 72 (97%) by the end of the study. The bone ALP assay is a sensitive and reliable tool that may be used to monitor the reduction in bone turnover after alendronate therapy in individual postmenopausal osteoporotic women.
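Classifying an individual patient as a responder reduces to comparing her percent change in bone ALP from baseline with the least significant change (25% in this study). A minimal sketch with illustrative concentration values:

def bone_alp_response(baseline, follow_up, least_significant_change=25.0):
    """Return (percent change, responder flag) for an individual patient:
    a responder shows a decrease from baseline exceeding the LSC."""
    pct_change = 100.0 * (follow_up - baseline) / baseline
    responder = pct_change <= -least_significant_change
    return pct_change, responder

for baseline, month6 in [(22.0, 12.5), (18.0, 15.5)]:      # illustrative values (ug/L)
    change, responder = bone_alp_response(baseline, month6)
    print(f"baseline {baseline} -> {month6}: {change:.1f}% change, responder={responder}")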
Dai, Ming; Xiao, Xueliang; Chen, Xin; Lin, Haoming; Wu, Wanqing; Chen, Siping
2016-12-01
With the increasing aging population as well as health concerns, chronic heart disease has become the focus of public attention. A comfortable, low-powered, and wearable electrocardiogram (ECG) system for continuously monitoring the elderly's ECG signals over several hours is important for preventing cardiovascular diseases. Traditional ECG monitoring apparatus is often inconvenient to carry, has many electrodes to attach to the chest, and has a high-power consumption. There is also a challenge to design an electrocardiograph that satisfies requirements such as comfort, confinement, and compactness. Based on these considerations, this study presents a biosensor acquisition system for wearable, ubiquitous healthcare applications using three textile electrodes and a recording circuit specialized for ECG monitoring. In addition, several methods were adopted to reduce the power consumption of the device. The proposed system is composed of three parts: (1) an ECG analog front end (AFE), (2) digital signal processing and micro-control circuits, and (3) system software. Digital filter methods were used to eliminate the baseline wander, skin contact noise, and other interfering signals. A comparative study was conducted using this system to observe its performance with two commercial Holter monitors. The experimental results demonstrated that the total power consumption of this proposed system in a full round of ECG acquisition was only 29.74 mW. In addition, this low-power system performed well and stably measured the heart rate with an accuracy of 98.55 %. It can also contain a real-time dynamic display with organic light-emitting diodes (OLED) and wirelessly transmit information via a Bluetooth 4.0 module.
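Among the digital filter methods mentioned, baseline-wander removal is commonly implemented as a low-cutoff, zero-phase high-pass filter. The SciPy-based sketch below uses an assumed, generic design (0.5 Hz cutoff, second-order Butterworth, synthetic trace) and is not the authors' exact filter chain.

import numpy as np
from scipy.signal import butter, filtfilt

def remove_baseline_wander(ecg, fs=250.0, cutoff_hz=0.5, order=2):
    """Zero-phase Butterworth high-pass filter to suppress baseline wander."""
    b, a = butter(order, cutoff_hz, btype="highpass", fs=fs)
    return filtfilt(b, a, ecg)

fs = 250.0
t = np.arange(0, 10, 1 / fs)
wander = 0.3 * np.sin(2 * np.pi * 0.2 * t)          # respiration-like drift
ecg = wander + 0.05 * np.sin(2 * np.pi * 1.2 * t)   # crude stand-in for an ECG trace
clean = remove_baseline_wander(ecg, fs)
print("Peak-to-peak ratio after/before filtering:", round(np.ptp(clean) / np.ptp(ecg), 2))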
Surface-enhanced Raman as a water monitor for warfare agents
NASA Astrophysics Data System (ADS)
Spencer, Kevin M.; Sylvia, James M.; Clauson, Susan L.; Janni, James A.
2002-02-01
The threat of chemical warfare agents being released upon civilian and military personnel continues to escalate. One aspect of chemical preparedness is to analyze and protect the potable water supply for the military. Chemical nerve, blister, and choking agents, as well as biological threats, must all be analyzed and low limits of detection must be verified. For chemical agents, this generally means detection down to the low ppb levels. Surface-Enhanced Raman Spectroscopy (SERS) is a spectroscopic technique that can detect trace levels of contaminants directly in the aqueous environment. In this paper, results are presented on the use of SERS to detect chemical and biological agent simulants with an end goal of creating a Joint Service Agent Water Monitor. Detection of cyanide, 2-chloroethyl ethyl sulfide, phosphonates, and Gram-positive and Gram-negative bacteria using SERS has been performed and is discussed herein. Aspects of transferring laboratory results to an unattended field instrument are also discussed.
The development of W-PBPM at diagnostic beamline
NASA Astrophysics Data System (ADS)
Kim, Seungnam; Kim, Myeongjin; Kim, Seonghan; Shin, Hocheol; Kim, Jiwha; Lee, Chaesun
2017-12-01
The photon beam position monitor (PBPM) plays a critically important role in accurate monitoring of the beam position. W (wire)-PBPMs are installed at the front end and in the photon transfer line (PTL) of the diagnostic beamline and detect changes in the position and angle of the beam orbit delivered to the beamline. They provide beam stability and position data in real time, which can be used in a feedback system together with the BPMs in the storage ring. They also provide the beam profile, which makes it possible to characterize the beam. With two W-PBPMs, the beam angle can be acquired, and these results, coupled with the beam profile, are used for orbit correction. The W-PBPM has been designed and installed in the diagnostic beamline at the Pohang Light Source. Herein, the design, analysis, and performance of the W-PBPM are reported.
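As background to how such a monitor yields position and angle: wire or blade PBPMs typically estimate position from the asymmetry of photoemission currents on opposing elements, and two monitors a known distance apart give the beam angle. The sketch below uses a generic difference-over-sum estimate with an assumed calibration factor; it is illustrative only and does not describe the PLS device's actual signal processing.

def beam_position_mm(i_outboard, i_inboard, k_mm=1.5):
    """Difference-over-sum position estimate from two opposing wire currents."""
    return k_mm * (i_outboard - i_inboard) / (i_outboard + i_inboard)

def beam_angle_mrad(pos1_mm, pos2_mm, separation_m):
    """Beam angle from positions measured by two PBPMs a known distance apart."""
    return (pos2_mm - pos1_mm) / separation_m   # mm/m is numerically mrad

x1 = beam_position_mm(1.02e-6, 0.98e-6)         # photocurrents in amperes (illustrative)
x2 = beam_position_mm(1.05e-6, 0.95e-6)
print(f"positions: {x1:.3f} mm, {x2:.3f} mm; angle: {beam_angle_mrad(x1, x2, 3.0):.3f} mrad")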
Stock, W; Westbrook, C A; Peterson, B; Arthur, D C; Szatrowski, T P; Silver, R T; Sher, D A; Wu, D; Le Beau, M M; Schiffer, C A; Bloomfield, C D
1997-01-01
Disappearance of the Philadelphia chromosome during treatment for chronic myeloid leukemia (CML) has become an important therapeutic end point. To determine the additional value of molecular monitoring during treatment for CML, we performed a prospective, sequential analysis using quantitative Southern blot monitoring of BCR gene rearrangements of blood and marrow samples from Cancer and Leukemia Group B (CALGB) study 8761. Sixty-four previously untreated adults with chronic-phase CML were enrolled onto CALGB 8761, a molecular-monitoring companion study to a treatment study for adults with chronic-phase CML (CALGB 9013). Treatment consisted of repetitive cycles of interferon alfa and low-dose subcutaneous cytarabine. Blood and marrow Southern blot quantitation of BCR gene rearrangements was compared with marrow cytogenetic analysis before the initiation of treatment and at specified points during therapy. Reverse-transcriptase polymerase chain reaction (RT-PCR) analysis was performed to detect residual disease in patients who achieved a complete response by Southern blot or cytogenetic analysis. Quantitative molecular monitoring by Southern blot analysis of blood samples was found to be equivalent to marrow monitoring at all time points. Twelve of 62 (19%) follow-up samples studied by Southern blot analysis had a complete loss of BCR gene rearrangement in matched marrow and blood specimens. Southern blot monitoring of blood samples was also found to be highly correlated to marrow cytogenetic evaluation at all points, although there were four discordant cases in which Southern blot analysis of blood showed no BCR gene rearrangement, yet cytogenetic analysis demonstrated from 12% to 20% Philadelphia chromosome-positive metaphase cells in the marrow. RT-PCR analysis detected residual disease in five of six patients in whom no malignant cells were detected using Southern blot or cytogenetic analyses. Quantitative Southern blot analysis of blood samples may be substituted for bone marrow to monitor the response to therapy in CML and results in the need for fewer bone marrow examinations. To avoid overestimating the degree of response, marrow cytogenetic analysis should be performed when patients achieve a complete response by Southern blot monitoring. This approach provides a rational, cost-effective strategy to monitor the effect of treatment of individual patients, as well as to analyze large clinical trials in CML.
Real-time local experimental monitoring of the bleaching process.
Rakic, Mario; Klaric, Eva; Sever, Ivan; Rakic, Iva Srut; Pichler, Goran; Tarle, Zrinka
2015-04-01
The purpose of this article was to investigate a new setup for tooth bleaching and monitoring of the same process in real time, so as to prevent overbleaching and related side effects of the bleaching procedure. So far, known bleaching procedures cannot simultaneously monitor and perform the bleaching process or provide any local control over bleaching. The experimental setup was developed at the Institute of Physics, Zagreb. The setup consists of a camera, a controller, and optical fibers. The bleaching was performed with 25% hydrogen peroxide activated by ultraviolet light diodes, and the light for monitoring was emitted by white light diodes. The collected light was analyzed using a red-green-blue (RGB) index. A K-type thermocouple was used for temperature measurements. Pastilles made from hydroxylapatite powder as well as human teeth served as experimental objects. Optimal bleaching time substantially varied among differently stained specimens. To reach the reference color (A1, Chromascop shade guide), measured as an RGB index, bleaching time for pastilles ranged from 8 to >20 min, whereas for teeth it ranged from 3.5 to >20 min. The reflected light intensity of each R, G, and B component at the end of the bleaching process (after 20 min) had increased by up to 56% of the baseline intensity. The presented experimental setup provides essential information about when to stop the bleaching process to achieve the desired optical results, so that the bleaching process can be completely responsive to the characteristics of every individual, leading to more satisfying results.
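The stop decision in such a setup reduces to comparing a measured RGB index against the reference-shade value in real time. The sketch below assumes a simple scalar index (mean of the R, G, B intensities) and illustrative readings; it is not the group's actual analysis.

def rgb_index(r, g, b):
    """Simple scalar colour index: mean of the reflected R, G, B intensities."""
    return (r + g + b) / 3.0

def should_stop(current_rgb, reference_rgb):
    """Stop bleaching once the measured index reaches the reference shade."""
    return rgb_index(*current_rgb) >= rgb_index(*reference_rgb)

reference_a1 = (205, 195, 180)                      # illustrative A1-shade reading
readings = [(150, 140, 128), (178, 168, 155), (207, 198, 182)]
for minute, rgb in zip((0, 4, 8), readings):
    print(f"t={minute} min, index={rgb_index(*rgb):.1f}, stop={should_stop(rgb, reference_a1)}")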
Innovative approach for in-vivo ablation validation on multimodal images
NASA Astrophysics Data System (ADS)
Shahin, O.; Karagkounis, G.; Carnegie, D.; Schlaefer, A.; Boctor, E.
2014-03-01
Radiofrequency ablation (RFA) is an important therapeutic procedure for small hepatic tumors. To make sure that the target tumor is effectively treated, RFA monitoring is essential. While several imaging modalities can observe the ablation procedure, it is not clear how ablated lesions on the images correspond to actual necroses. This uncertainty contributes to the high local recurrence rates (up to 55%) after radiofrequency ablative therapy. This study investigates a novel approach to correlate images of ablated lesions with actual necroses. We mapped both intraoperative images of the lesion and a slice through the actual necrosis in a common reference frame. An electromagnetic tracking system was used to accurately match lesion slices from different imaging modalities. To minimize the liver deformation effect, the tracking reference frame was defined inside the tissue by anchoring an electromagnetic sensor adjacent to the lesion. A validation test was performed using a phantom and proved that the end-to-end accuracy of the approach was within 2 mm. In an in-vivo experiment, intraoperative magnetic resonance imaging (MRI) and ultrasound (US) ablation images were correlated to gross and histopathology. The results indicate that the proposed method can accurately correlate in-vivo ablations on different modalities. Ultimately, this will improve the interpretation of the ablation monitoring and reduce the recurrence rates associated with RFA.
A smart end-effector for assembly of space truss structures
NASA Technical Reports Server (NTRS)
Doggett, William R.; Rhodes, Marvin D.; Wise, Marion A.; Armistead, Maurice F.
1992-01-01
A unique facility, the Automated Structures Research Laboratory, is being used to investigate robotic assembly of truss structures. A special-purpose end-effector is used to assemble structural elements into an eight-meter-diameter structure. To expand the capabilities of the facility to include construction of structures with curved surfaces from straight structural elements of different lengths, a new end-effector has been designed and fabricated. This end-effector contains an integrated microprocessor to monitor actuator operations through sensor feedback. This paper provides an overview of the automated assembly tasks required of this end-effector and a description of the new end-effector's hardware and control software.
Mooney, Karen; McElnay, James C; Donnelly, Ryan F
2015-08-01
Microneedle (MN) arrays could offer an alternative method to traditional drug delivery and blood sampling methods. However, acceptance among key end-users is critical for new technologies to succeed. MNs have been advocated for use in children, and so paediatricians are key potential end-users. However, the opinions of paediatricians on MN use have been previously unexplored. The aim of this study was to investigate the views of UK paediatricians on the use of MN technology within neonatal and paediatric care. An online survey was developed and distributed among UK paediatricians to gain their opinions of MN technology and its use in the neonatal and paediatric care settings, particularly for MN-mediated monitoring. A total of 145 responses were obtained, with a completion response rate of 13.7%. Respondents believed an alternative monitoring technique to blood sampling in children was required. Furthermore, 83% of paediatricians believed there was a particular need in premature neonates. Overall, this potential end-user group approved of the MN technology and an MN-mediated monitoring approach. Minimal pain and the perceived ease of use were important elements in gaining favour. Concerns included the need for confirmation of correct application and the potential for skin irritation. The findings of this study provide an initial indication of MN acceptability among a key potential end-user group. Furthermore, the concerns identified present a challenge to those working within the MN field to provide solutions to further improve this technology. The work strengthens the rationale behind MN technology and facilitates the translation of MN technology from the lab bench into the clinical setting.
Disaggregating Hot Water Use and Predicting Hot Water Waste in Five Test Homes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henderson, Hugh; Wade, Jeremy
2014-04-01
While it is important to make the equipment (or "plant") in a residential hot water system more efficient, the hot water distribution system also affects overall system performance and energy use. Energy wasted in heating water that is not used is estimated to be on the order of 10%-30% of total domestic hot water (DHW) energy use. This field monitoring project installed temperature sensors on the distribution piping (on trunks and near fixtures) in five houses near Syracuse, NY, and programmed a data logger to collect data at 5-second intervals whenever there was a hot water draw. These data were used to assign hot water draws to specific end uses in the home as well as to determine the portion of each hot water draw that was deemed useful (i.e., above a temperature threshold at the fixture). Overall, the procedures to assign water draws to each end use were able to successfully assign about 50% of the water draws, but these assigned draws accounted for about 95% of the total hot water use in each home. The amount of hot water deemed useful ranged from a low of 75% at one house to a high of 91% in another. At three of the houses, new water heaters and distribution improvements were implemented during the monitoring period, and the impact of these improvements on hot water use and delivery efficiency was evaluated.
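A minimal sketch of how interval data of this kind can be split into useful and wasted hot water, assuming hypothetical 5-second samples with per-draw flow and fixture temperature and a 105 F usefulness threshold; the column names and threshold are invented for illustration, not the project's processing.

```python
import pandas as pd

# Hypothetical 5-second samples recorded during hot water draws; column names,
# the 105 F "useful" threshold and the draw grouping are illustrative assumptions.
samples = pd.DataFrame({
    "draw_id":   [1, 1, 1, 2, 2, 2, 2],
    "flow_gpm":  [1.5, 1.5, 1.4, 0.8, 0.8, 0.8, 0.7],
    "fixture_F": [78, 102, 118, 80, 95, 112, 116],   # temperature measured near the fixture
})

USEFUL_F = 105.0          # water delivered above this temperature is counted as useful
INTERVAL_MIN = 5 / 60.0   # 5-second sampling interval expressed in minutes

samples["gallons"] = samples["flow_gpm"] * INTERVAL_MIN
samples["useful_gallons"] = samples["gallons"].where(samples["fixture_F"] >= USEFUL_F, 0.0)

per_draw = samples.groupby("draw_id")[["gallons", "useful_gallons"]].sum()
per_draw["useful_fraction"] = per_draw["useful_gallons"] / per_draw["gallons"]
print(per_draw)

overall = per_draw["useful_gallons"].sum() / per_draw["gallons"].sum()
print(f"share of hot water delivered above {USEFUL_F:.0f} F: {overall:.0%}")
```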
Room air monitor for radioactive aerosols
Balmer, D.K.; Tyree, W.H.
1987-03-23
A housing assembly for use with a room air monitor for simultaneous collection and counting of suspended particles includes a casing containing a combination detector-preamplifier system at one end, a filter system at the other end, and an air flow system consisting of an air inlet formed in the casing between the detector-preamplifier system and the filter system and an air passageway extending from the air inlet through the casing and out the end opposite the detector-preamplifier combination. The filter system collects suspended particles transported directly through the housing by means of the air flow system, and these particles are detected and examined for radioactivity by the detector-preamplifier combination. 2 figs.
Electron launching voltage monitor
Mendel, Clifford W.; Savage, Mark E.
1992-01-01
An electron launching voltage monitor measures MITL voltage using a relationship between anode electric field and electron current launched from a cathode-mounted perturbation. An electron launching probe extends through and is spaced from the edge of an opening in a first MITL conductor, one end of the launching probe being in the gap between the MITL conductors, the other end being adjacent a first side of the first conductor away from the second conductor. A housing surrounds the launching probe and electrically connects the first side of the first conductor to the other end of the launching probe. A detector detects the current passing through the housing to the launching probe, the detected current being representative of the voltage between the conductors.
NASA Astrophysics Data System (ADS)
Zhang, Y.; Li, S.
2014-12-01
Geologic carbon sequestration (GCS) is proposed for the Nugget Sandstone in Moxa Arch, a regional saline aquifer with a large storage potential. For a proposed storage site, this study builds a suite of increasingly complex conceptual "geologic" model families, using subsets of the site characterization data: a homogeneous model family, a stationary petrophysical model family, a stationary facies model family with sub-facies petrophysical variability, and a non-stationary facies model family (with sub-facies variability) conditioned to soft data. These families, representing alternative conceptual site models built with increasing data, were simulated with the same CO2 injection test (50 years at 1/10 Mt per year), followed by 2950 years of monitoring. Using the Design of Experiment, an efficient sensitivity analysis (SA) is conducted for all families, systematically varying uncertain input parameters. Results are compared among the families to identify parameters that have a first-order impact on predicting the CO2 storage ratio (SR) at both the end of injection and the end of monitoring. At this site, geologic modeling factors do not significantly influence the short-term prediction of the storage ratio; they become important over the monitoring period, but only for those families in which such factors are accounted for. Based on the SA, a response surface analysis is conducted to generate prediction envelopes of the storage ratio, which are compared among the families at both times. Results suggest a large uncertainty in the predicted storage ratio given the uncertainties in model parameters and modeling choices: SR varies from 5-60% (end of injection) to 18-100% (end of monitoring), although its variation among the model families is relatively minor. Moreover, long-term leakage risk is considered small at the proposed site. In the lowest-SR scenarios, all families predict gravity-stable supercritical CO2 migrating toward the bottom of the aquifer. In the highest-SR scenarios, supercritical CO2 footprints are relatively insignificant by the end of monitoring.
A Novel Optical Fiber Sensor for Steel Corrosion in Concrete Structures.
Leung, Christopher K Y; Wan, Kai Tai; Chen, Liquan
2008-03-20
Steel corrosion resulting from the penetration of chloride ions or carbon dioxide is a major cause of degradation for reinforced concrete structures. The objective of the present investigation was to develop a low-cost sensor for steel corrosion, which is based on a very simple physical principle. The flat end of a cut optical fiber is coated with an iron thin film using the ion sputtering technique. Light is then sent into a fiber embedded in concrete and the reflected signal is monitored. Initially, most of the light is reflected by the iron layer. When corrosion removes the iron layer, a significant portion of the light power will leave the fiber at its exposed end, and the reflected power is greatly reduced. Monitoring of the reflected signal is hence an effective way to assess whether the concrete environment at the location of the fiber tip is likely to induce steel corrosion. In this paper, first the principle of the corrosion sensor and its fabrication are described. The sensing principle is then verified by experimental results. Sensor packaging for practical installation is then presented, and the performance of the packaged sensors is assessed by additional experiments.
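A minimal sketch of the monitoring logic the principle suggests: compare the reflected power against the as-installed baseline and flag a sustained drop. The drop_fraction threshold and the function interface are illustrative assumptions, not the authors' signal processing.

```python
def corrosion_alarm(baseline_mw, readings_mw, drop_fraction=0.5):
    """Flag the first reading whose reflected power has fallen below a set fraction
    of the as-installed baseline, indicating loss of the iron film at the fiber tip.

    baseline_mw: reflected power measured right after installation.
    readings_mw: iterable of subsequent reflected-power measurements.
    drop_fraction: fraction of baseline below which corrosion is assumed (illustrative).
    """
    threshold = drop_fraction * baseline_mw
    for i, p in enumerate(readings_mw):
        if p < threshold:
            return i          # index of the first measurement suggesting corrosion
    return None               # iron film still intact as far as the optics can tell

print(corrosion_alarm(0.80, [0.79, 0.74, 0.61, 0.35, 0.30]))   # -> 3
```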
Fiber optic spectroscopic digital imaging sensor and method for flame properties monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zelepouga, Serguei A; Rue, David M; Saveliev, Alexei V
2011-03-15
A system for real-time monitoring of flame properties in combustors and gasifiers includes an imaging fiber optic bundle having a light receiving end and a light output end, and a spectroscopic imaging system operably connected with the light output end of the imaging fiber optic bundle. Light received by the light receiving end of the imaging fiber optic bundle is focused by a wall disposed between the light receiving end of the fiber optic bundle and a light source; the wall forms a pinhole opening aligned with the light receiving end.
Measuring and Understanding the Energy Use Signatures of a Bank Building
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, YuLong; Liu, Bing; Athalye, Rahul A.
The Pacific Northwest National Laboratory measured and analyzed the energy end-use patterns in a bank building located in the north-eastern United States. This work was performed in collaboration with PNC Financial Service Group under the US DOE’s Commercial Building Partnerships Program. This paper presents the metering study and the results of the metered data analysis. It provides a benchmark for the energy use of different bank-related equipment. The paper also reveals the importance of metering in fully understanding building loads and identifying opportunities for energy efficiency improvements that will have impacts across PNC’s portfolio of buildings and were crucial to reducing receptacle loads in the design of net-zero bank branches. PNNL worked with PNC to meter a 4,000 ft2 bank branch in the state of Pennsylvania. 71 electrical circuits were monitored and 25 stand-alone watt-hour meters were installed at the bank. These meters monitored the consumption of all interior and exterior lighting, receptacle loads, service water heating, and the HVAC rooftop unit at a 5-minute sampling interval from November 2009 to November 2010. A total of over 8 million data records were generated, which were then analyzed to produce the end-use patterns, daily usage profiles, rooftop unit usage cycles, and inputs for calibrating the energy model of the building.
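A small sketch of the kind of post-processing such interval metering enables, using pandas on synthetic 5-minute watt-hour records; the channel names and values are invented stand-ins, not the PNC/PNNL data set.

```python
import numpy as np
import pandas as pd

# Hypothetical 5-minute watt-hour records for a few monitored end uses (one week).
rng = pd.date_range("2010-01-04", periods=7 * 288, freq="5min")
rs = np.random.default_rng(0)
data = pd.DataFrame({
    "interior_lighting_Wh": rs.uniform(20, 120, len(rng)),
    "receptacle_Wh":        rs.uniform(30, 90, len(rng)),
    "rtu_hvac_Wh":          rs.uniform(0, 400, len(rng)),
}, index=rng)

# End-use shares over the monitoring period.
totals_kwh = data.sum() / 1000.0
print((totals_kwh / totals_kwh.sum()).round(3))

# Average weekday load profile (kW) by hour of day for each end use.
weekday = data[data.index.dayofweek < 5]
profile_kw = weekday.groupby(weekday.index.hour).mean() * 12 / 1000.0   # Wh per 5 min -> average kW
print(profile_kw.round(2).head())
```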
Carling, Christopher J; Lacome, Mathieu; Flanagan, Eamon; O'Doherty, Pearse; Piscione, Julien
2017-01-01
This study investigated exposure time, running and skill-related performance in two international u20 rugby union teams during an intensified tournament: the 2015 Junior World Rugby Championship. Both teams played 5 matches in 19 days. Analyses were conducted using global positioning system (GPS) tracking (Viper 2™, Statsports Technologies Ltd) and event coding (Opta Pro®). Of the 62 players monitored, 36 (57.1%) participated in 4 matches and 23 (36.5%) in all 5 matches while player availability for selection was 88%. Analyses of team running output (all players completing >60-min play) showed that the total and peak 5-minute high metabolic load distances covered were likely-to-very likely moderately higher in the final match compared to matches 1 and 2 in back and forward players. In individual players with the highest match-play exposure (participation in >75% of total competition playing time and >75-min in each of the final 3 matches), comparisons of performance in matches 4 and 5 versus match 3 (three most important matches) reported moderate-to-large decreases in total and high metabolic load distance in backs while similar magnitude reductions occurred in high-speed distance in forwards. In contrast, skill-related performance was unchanged, albeit with trivial and unclear changes, while there were no alterations in either total or high-speed running distance covered at the end of matches. These findings suggest that despite high availability for selection, players were not over-exposed to match-play during an intensified u20 international tournament. They also imply that the teams coped with the running and skill-related demands. Similarly, individual players with the highest exposure to match-play were also able to maintain skill-related performance and end-match running output (despite an overall reduction in the latter). These results support the need for player rotation and monitoring of performance, recovery and intervention strategies during intensified tournaments.
NASA Astrophysics Data System (ADS)
Kernicky, Timothy; Whelan, Matthew; Al-Shaer, Ehab
2018-06-01
A methodology is developed for the estimation of internal axial force and boundary restraints within in-service, prismatic axial force members of structural systems using interval arithmetic and contractor programming. The determination of the internal axial force and end restraints in tie rods and cables using vibration-based methods has been a long-standing problem in the area of structural health monitoring and performance assessment. However, for structural members with low slenderness, where the dynamics are significantly affected by the boundary conditions, few existing approaches allow for simultaneous identification of internal axial force and end restraints, and none permits quantification of the uncertainties in the parameter estimates due to measurement uncertainties. This paper proposes a new technique for approaching this challenging inverse problem that leverages the Set Inversion Via Interval Analysis algorithm to solve for the unknown axial forces and end restraints using natural frequency measurements. The framework developed offers the ability to completely enclose the feasible solutions to the parameter identification problem, given specified measurement uncertainties for the natural frequencies. This ability to propagate measurement uncertainty into the parameter space is critical for quantifying the confidence in the individual parameter estimates to inform decision-making within structural health diagnosis and prognostication applications. The methodology is first verified with simulated data for a case with unknown rotational end restraints and then extended to a case with unknown translational and rotational end restraints. A laboratory experiment is then presented to demonstrate the application of the methodology to an axially loaded rod with progressively increased end restraint at one end.
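A deliberately simplified, one-parameter sketch of the set-inversion idea: bisect axial-force boxes and keep those whose predicted fundamental-frequency interval lies inside the measured interval. It assumes a pinned-pinned Euler-Bernoulli beam under tension, a single measured frequency, and invented section properties, so it omits the unknown end restraints and the contractors treated in the paper.

```python
import math

# Simplified pinned-pinned Euler-Bernoulli beam under axial tension N:
#   f_n = (n^2*pi / (2*L^2)) * sqrt(E*I/m) * sqrt(1 + N*L^2 / (n^2*pi^2*E*I))
# All properties are illustrative; the paper treats unknown end restraints as well.
E, I, m, L = 200e9, 8.3e-9, 0.62, 2.0      # Pa, m^4, kg/m, m (hypothetical steel rod)

def f1(N):
    """Fundamental frequency (Hz) as a function of axial force N (monotone increasing)."""
    return (math.pi / (2 * L**2)) * math.sqrt(E * I / m) * math.sqrt(1 + N * L**2 / (math.pi**2 * E * I))

def f1_interval(box):
    lo, hi = box
    return f1(lo), f1(hi)            # monotonicity gives the exact interval image

def sivia(measured, box=(0.0, 50e3), eps=50.0):
    """Bisect axial-force boxes: keep those whose frequency image lies inside the
    measurement interval, discard those that miss it, split the rest until width < eps (N)."""
    inside, boundary, stack = [], [], [box]
    while stack:
        lo, hi = stack.pop()
        flo, fhi = f1_interval((lo, hi))
        if fhi < measured[0] or flo > measured[1]:
            continue                                  # no overlap: infeasible forces
        if measured[0] <= flo and fhi <= measured[1]:
            inside.append((lo, hi))                   # image fully inside: feasible forces
        elif hi - lo < eps:
            boundary.append((lo, hi))                 # undecided but small enough to stop
        else:
            mid = 0.5 * (lo + hi)
            stack += [(lo, mid), (mid, hi)]
    return inside, boundary

truth = 12e3                                          # "true" axial force used to fake a measurement
meas = (f1(truth) - 0.05, f1(truth) + 0.05)           # measured f1 with +/- 0.05 Hz uncertainty
inside, boundary = sivia(meas)
print(f"inner approximation of feasible N: "
      f"{min(b[0] for b in inside)/1e3:.2f}-{max(b[1] for b in inside)/1e3:.2f} kN "
      f"(true value {truth/1e3:.0f} kN)")
```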
Yegian, Courtney C; Volz, Lana M; Galgon, Richard E
2018-05-11
Tracheal extubation in children with known difficult airways is associated with an increased risk of adverse events. Currently, there is no reliable measure to predict the need for emergent reintubation due to airway inadequacy. Airway exchange catheter-assisted extubation has been shown to be a useful adjunct in decreasing the risk of adverse events due to failed extubation. We report a case of using an airway exchange catheter-assisted extubation with continuous end-tidal carbon dioxide monitoring for a pediatric patient with a known difficult airway.
NASA Astrophysics Data System (ADS)
Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.
1991-03-01
To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (real-time expert system) and ULTRAMAX (a tool for sequential process optimization).
Code of Federal Regulations, 2010 CFR
2010-07-01
... the excepted sorbent trap monitoring methodology. For an affected coal-fired unit under a State or...; (c) A certified flow monitoring system is required; (d) Correction for stack gas moisture content is... proportional to the stack gas volumetric flow rate. (f) At the beginning and end of each sample collection...
Innovative monitoring of 3D warp interlock fabric during forming process
NASA Astrophysics Data System (ADS)
Dufour, C.; Jerkovic, I.; Wang, P.; Boussu, F.; Koncar, V.; Soulat, D.; Grancaric, A. M.; Pineau, P.
2017-10-01
The final geometry of 3D warp interlock fabric needs to be checked during the 3D forming step to ensure the right locations of warp and weft yarns inside the final structure. Thus, a new monitoring approach has been proposed based on sensor yarns located in the fabric thickness. To ensure the accuracy of measurements, the observation of the surface deformation of the 3D warp interlock fabric has been combined with the sensor yarn measurements. In the end, a good correlation was revealed between the strain measured globally by the camera and locally by the sensor yarns.
Some Solved Problems with the SLAC PEP-II B-Factory Beam-Position Monitor System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Ronald G.
2000-05-05
The Beam-Position Monitor (BPM) system for the SLAC PEP-II B-Factory has been in operation for over two years. Although the BPM system has met all of its specifications, several problems with the system have been identified and solved. The problems include errors and limitations in both the hardware and software. Solutions of such problems have led to improved performance and reliability. In this paper the authors report on this experience. The process of identifying problems is not at an end and they expect continued improvement of the BPM system.
Deep Space Network Antenna Monitoring Using Adaptive Time Series Methods and Hidden Markov Models
NASA Technical Reports Server (NTRS)
Smyth, Padhraic; Mellstrom, Jeff
1993-01-01
The Deep Space Network (DSN), designed and operated by the Jet Propulsion Laboratory for the National Aeronautics and Space Administration (NASA), provides end-to-end telecommunication capabilities between Earth and various interplanetary spacecraft throughout the solar system.
Condition Assessment and End-of-Life Prediction System for Electric Machines and Their Loads
NASA Technical Reports Server (NTRS)
Parlos, Alexander G.; Toliyat, Hamid A.
2005-01-01
An end-of-life prediction system developed for electric machines and their loads could be used in integrated vehicle health monitoring at NASA and in other government agencies. This system will provide on-line, real-time condition assessment and end-of-life prediction of electric machines (e.g., motors, generators) and/or their loads of mechanically coupled machinery (e.g., pumps, fans, compressors, turbines, conveyor belts, magnetic levitation trains, and others). In long-duration space flight, the ability to predict the lifetime of machinery could spell the difference between mission success and failure. Therefore, the system described here may be of inestimable value to the U.S. space program. The system will provide continuous monitoring for on-line condition assessment and end-of-life prediction as opposed to the current off-line diagnoses.
Audible acoustics in high-shear wet granulation: application of frequency filtering.
Hansuld, Erin M; Briens, Lauren; McCann, Joe A B; Sayani, Amyn
2009-08-13
Previous work has shown analysis of audible acoustic emissions from high-shear wet granulation has potential as a technique for end-point detection. In this research, audible acoustic emissions (AEs) from three different formulations were studied to further develop this technique as a process analytical technology. Condenser microphones were attached to three different locations on a PMA-10 high-shear granulator (air exhaust, bowl and motor) to target different sound sources. Size, flowability and tablet break load data was collected to support formulator end-point ranges and interpretation of AE analysis. Each formulation had a unique total power spectral density (PSD) profile that was sensitive to granule formation and end-point. Analyzing total PSD in 10 Hz segments identified profiles with reduced run variability and distinct maxima and minima suitable for routine granulation monitoring and end-point control. A partial least squares discriminant analysis method was developed to automate selection of key 10 Hz frequency groups using variable importance to projection. The results support use of frequency refinement as a way forward in the development of acoustic emission analysis for granulation monitoring and end-point control.
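A brief sketch of the spectral bookkeeping described above: estimate the power spectral density of a microphone signal and sum it into contiguous 10 Hz groups. The sampling rate, synthetic signal, and Welch settings are illustrative assumptions, not the study's acquisition parameters.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical microphone signal from the granulator exhaust; the 44.1 kHz sampling
# rate and synthetic content stand in for the recorded acoustic emissions.
fs = 44100
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 850 * t) + 0.5 * rng.standard_normal(t.size)

# Power spectral density, then summed into contiguous 10 Hz groups as in the study.
freqs, psd = welch(signal, fs=fs, nperseg=8192)
band_edges = np.arange(0, freqs.max() + 10, 10)
band_idx = np.digitize(freqs, band_edges) - 1
band_power = np.bincount(band_idx, weights=psd, minlength=len(band_edges) - 1)

total_psd = psd.sum()                       # one point of the total-PSD profile for this instant
top = np.argsort(band_power)[-5:][::-1]     # 10 Hz groups carrying the most acoustic power
print(f"total PSD: {total_psd:.3e}")
for b in top:
    print(f"{band_edges[b]:7.0f}-{band_edges[b] + 10:5.0f} Hz : {band_power[b]:.3e}")
```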
An ultra-low power (ULP) bandage-type ECG sensor for efficient cardiac disease management.
Shin, Kunsoo; Park, G G; Kim, J P; Lee, T H; Ko, B H; Kim, Y H
2013-01-01
This paper proposed an ultra-low power bandage-type ECG sensor (size: 76 × 34 × 3 mm³; power consumption: 1 mW) which allows for continuous and real-time monitoring of a user's ECG signals over 24 h during daily activities. For its compact size and low power consumption, we designed the analog front-end, the SRP (Samsung Reconfigurable Processor) based DSP of 30 uW/MHz, and the ULP wireless RF of 1 nJ/bit. Also, to tackle motion artifacts (MA), an MA monitoring technique based on the half-cell potential (HCP) is proposed, which showed a high correlation between the MA and the HCP (correlation coefficient of 0.75 ± 0.18). To assess its feasibility and validity as a wearable health monitor, we performed a comparison of two ECG signals recorded from it and from a conventional Holter device. As a result, the performance of the former is slightly lower than that of the latter, although showing no statistically significant difference (quality of the signal: 94.3% vs 99.4%; accuracy of arrhythmia detection: 93.7% vs 98.7%). With those results, it has been confirmed that it can be used as a wearable health monitor due to its comfort, its long operation lifetime and the good quality of the measured ECG signal.
Noninvasive Hemodynamic Measurements During Neurosurgical Procedures in Sitting Position.
Schramm, Patrick; Tzanova, Irene; Gööck, Tilman; Hagen, Frank; Schmidtmann, Irene; Engelhard, Kristin; Pestel, Gunther
2017-07-01
Neurosurgical procedures in the sitting position require advanced cardiovascular monitoring. Transesophageal echocardiography (TEE) to measure cardiac output (CO)/cardiac index (CI) and stroke volume (SV), and invasive arterial blood pressure measurements for systolic (ABPsys), diastolic (ABPdiast) and mean arterial pressure (MAP), are established monitoring technologies for these kinds of procedures. A noninvasive device for continuous monitoring of blood pressure and CO based on a modified Penaz technique (volume-clamp method) was introduced recently. In the present study the noninvasive blood pressure measurements were compared with invasive arterial blood pressure monitoring, and the noninvasive CO monitoring with TEE measurements. Measurements of blood pressure and CO were performed in 35 patients before and after a fluid bolus, a change from supine to sitting position, the start of surgery, and repositioning from sitting to supine at the end of surgery. Data pairs from the noninvasive device (Nexfin HD) versus arterial line measurements (ABPsys, ABPdiast, MAP) and versus TEE (CO, CI, SV) were compared using Bland-Altman analysis and percentage error. All parameters compared (CO, CI, SV, ABPsys, ABPdiast, MAP) showed a large bias and wide limits of agreement. Percentage error was above 30% for all parameters except ABPsys. The noninvasive device based on a modified Penaz technique cannot replace arterial blood pressure monitoring or TEE in anesthetized patients undergoing neurosurgery in the sitting position.
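A short sketch of the agreement statistics used above, assuming hypothetical paired cardiac output readings; the percentage error here follows the common 1.96·SD-of-differences over mean-of-reference convention, which may differ in detail from the authors' calculation.

```python
import numpy as np

def bland_altman(reference, test):
    """Bias, 95% limits of agreement, and percentage error
    (1.96 * SD of the differences divided by the mean of the reference method)."""
    reference, test = np.asarray(reference, float), np.asarray(test, float)
    diff = test - reference
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    pct_error = 100.0 * 1.96 * sd / reference.mean()
    return bias, loa, pct_error

# Hypothetical paired cardiac output values (L/min): TEE reference vs noninvasive device.
tee    = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.8]
nexfin = [4.8, 4.6, 4.5, 5.1, 5.6, 4.9, 5.0, 5.2]
bias, loa, pe = bland_altman(tee, nexfin)
print(f"bias {bias:+.2f} L/min, LoA {loa[0]:+.2f} to {loa[1]:+.2f} L/min, percentage error {pe:.0f}%")
print("within the 30% interchangeability criterion" if pe < 30 else "outside the 30% interchangeability criterion")
```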
Portable system for temperature monitoring in all phases of wine production.
Boquete, Luciano; Cambralla, Rafael; Rodríguez-Ascariz, J M; Miguel-Jiménez, J M; Cantos-Frontela, J J; Dongil, J
2010-07-01
This paper presents a low-cost and highly versatile temperature-monitoring system applicable to all phases of wine production, from grape cultivation through to delivery of bottled wine to the end customer. Monitoring is performed by a purpose-built electronic system comprising a digital memory that stores temperature data and a ZigBee communication system that transmits it to a Control Centre for processing and display. The system has been tested under laboratory conditions and in real-world operational applications. One of the system's advantages is that it can be applied to every phase of wine production. Moreover, with minimum modification, other variables of interest (pH, humidity, etc.) could also be monitored and the system could be applied to other similar sectors, such as olive-oil production. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
Energy Monitoring and Control Systems--Performance Verification and Endurance Test Procedures.
1982-12-01
... contractor correction of all outstanding deficiencies. TEST NO: END-1, Page 1 of 1. OBJECTIVE: To demonstrate EMCS normal mode operation. TITLE: Endurance.
Cardiovascular studies using the chimpanzee (Pan troglodytes)
NASA Technical Reports Server (NTRS)
Hinds, J. E.; Cothran, L. N.; Hawthorne, E. W.
1977-01-01
Despite the phylogenetic similarities between chimpanzees and man, there exists a paucity of reliable data on normal cardiovascular function and the physiological responses of the system to standard interventions. Totally implanted biotelemetry systems or hardwire analog techniques were used to examine the maximum number of cardiovascular variables which could be simultaneously monitored without significantly altering the system's performance. This was performed in order to acquire baseline data not previously obtained in this species and to determine cardiovascular responses to specific forcing functions such as ventricular pacing, drug infusions, and lower body negative pressure. A cardiovascular function profile protocol was developed in order to independently adjust the three major factors which modify ventricular performance, namely, left ventricular preload, afterload, and contractility. Cardiac pacing at three levels above the ambient rate was used to adjust end diastolic volume (preload). Three concentrations of angiotensin were infused continuously to evaluate afterload in a stepwise fashion. A continuous infusion of dobutamine was administered to raise the manifest contractile state of the heart.
NASA Technical Reports Server (NTRS)
Regan, Timothy F.
2004-01-01
The free-piston Stirling convertor end-to-end modeling effort at the NASA Glenn Research Center has produced a software-based test bed in which free-piston Stirling convertors can be simulated and evaluated. The simulation model includes all the components of the convertor: the Stirling cycle engine, heat source, linear alternator, controller, and load. So far, it has been used in evaluating the performance of electronic controller designs. Three different controller design concepts were simulated using the model: 1) Controllers with parasitic direct current loading. 2) Controllers with parasitic alternating current loading. 3) Controllers that maintain a reference current. The free-piston Stirling convertor is an electromechanical device that operates at resonance. It is the function of the electronic load controller to ensure that the electrical load seen by the machine is always great enough to keep the amplitude of the piston and alternator oscillation at the rated value. This is done by regulating the load on the output bus. The controller monitors the instantaneous voltage, regulating it by switching loads called parasitic loads onto the bus whenever the bus voltage is too high and removing them whenever the voltage is too low. In the first type of controller, the monitoring and switching are done on the direct-current (dc) bus. In the second type, the alternating current bus is used. The model allows designers to test a controller concept before investing time in hardware. The simulation code used to develop the model also offers detailed models of digital and analog electronic components so that the resulting designs are realistic enough to translate directly into hardware circuits.
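A toy discrete-time sketch of the parasitic-load idea described above (switching a dump load onto the bus when the voltage drifts above a band and off when it drifts below); the bus model, band limits, and load values are invented for illustration and bear no relation to the Glenn simulation.

```python
# Hysteresis (bang-bang) regulation of a dc bus by a parasitic dump load.
V_SET, BAND = 48.0, 0.5          # regulated bus voltage and hysteresis band (V)
R_USER, R_PARASITIC = 12.0, 6.0  # user load and dump load resistances (ohm)
C_BUS, DT = 0.05, 1e-3           # bus capacitance (F) and simulation time step (s)

def source_current(t):
    """Rectified alternator output; a slow swell exercises the controller."""
    return 4.5 + 1.5 * (t > 0.25)

v, parasitic_on, log = V_SET, False, []
for k in range(1000):
    t = k * DT
    i_load = v / R_USER + (v / R_PARASITIC if parasitic_on else 0.0)
    v += DT * (source_current(t) - i_load) / C_BUS        # capacitor bus voltage update
    if v > V_SET + BAND:
        parasitic_on = True                               # too high: absorb excess power
    elif v < V_SET - BAND:
        parasitic_on = False                              # too low: release the dump load
    log.append((t, v, parasitic_on))

print(f"final bus voltage {log[-1][1]:.2f} V, parasitic load engaged: {log[-1][2]}")
```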
NASA Astrophysics Data System (ADS)
Jiang, Ying; Zeng, Jie; Liang, Dakai; Ni, Xiaoyu; Luo, Wenyong
2013-06-01
Fiber alignment is very important in the fusion splicing process. The core of polarization-maintaining photonic crystal fiber (PM-PCF) cannot be seen in the splicer due to the microhole structure of its cross-section, so it is difficult to precisely align PM-PCF and conventional single-mode fiber (SMF). We demonstrate a novel method for precisely aligning PM-PCF and conventional SMF by online spectrum monitoring. First, a halogen lamp light source is connected to one end face of the conventional SMF. Then one end face of the PM-PCF and the other end face of the conventional SMF are roughly aligned by observing visible light at the far end face of the PM-PCF. If visible light is present, the fibers are considered roughly aligned. The other end face of the PM-PCF and one end face of the other conventional SMF are then precisely aligned in the other splicer by online spectrum monitoring. For this step the halogen lamp is replaced with a broadband light source with a 52 nm wavelength range, and the other end face of the other conventional SMF is connected to an optical spectrum analyzer. The fibers are translationally and rotationally adjusted in the splicer while the spectrum is monitored. When the transmitted spectral power is maximal, the alignment is precise.
Effects of Regulation and Technology on End Uses of Nonfuel Mineral Commodities in the United States
Matos, Grecia R.
2007-01-01
The regulatory system and advancement of technologies have shaped the end-use patterns of nonfuel minerals used in the United States. These factors affected the quantities and types of materials used by society. Environmental concerns and awareness of possible negative effects on public health prompted numerous regulations that have dramatically altered the use of commodities like arsenic, asbestos, lead, and mercury. While the selected commodities represent only a small portion of overall U.S. materials use, they have the potential for harmful effects on human health or the environment, which other commodities, like construction aggregates, do not normally have. The advancement of technology allowed for new uses of mineral materials in products like high-performance computers, telecommunications equipment, plasma and liquid-crystal display televisions and computer monitors, mobile telephones, and electronic devices, which have become mainstream products. These technologies altered the end-use pattern of mineral commodities like gallium, germanium, indium, and strontium. Human ingenuity and people's demand for different and creative services increase the demand for new materials and industries while shifting the pattern of use of mineral commodities. The mineral commodities' end-use data are critical for the understanding of the magnitude and character of these flows, assessing their impact on the environment, and providing an early warning of potential problems in waste management of products containing these commodities. The knowledge of final disposition of the mineral commodity allows better decisions as to how regulation should be tailored.
DOE Office of Scientific and Technical Information (OSTI.GOV)
There are many existing buildings with load-bearing mass masonry walls, whose energy performance could be improved with the retrofit of insulation. However, adding insulation to the interior side of walls of such masonry buildings in cold (and wet) climates may cause performance and durability problems. Some concerns, such as condensation and freeze-thaw, have known solutions. But wood members embedded in the masonry structure will be colder (and potentially wetter) after an interior insulation retrofit. Moisture content and relative humidity were monitored at joist ends in historic mass brick masonry walls retrofitted with interior insulation in a cold climate (Zone 5A); data were collected from 2012-2015. Eleven joist ends were monitored in all four orientations. One limitation of these results is that the renovation is still ongoing, with limited wintertime construction heating and no permanent occupancy to date. Measurements show that many joist ends remain at high moisture contents, especially at north- and east-facing orientations, with constant 100 percent RH conditions in the worst cases. These high moisture levels are not conducive to wood durability, but no evidence of actual structural damage has been observed. Insulated vs. non-insulated joist pockets do not show large differences. South-facing joists have safe (10-15 percent) moisture contents. Given the uncertainty pointed out by research, definitive guidance on the vulnerability of embedded wood members is difficult to formulate. In high-risk situations, or when a very conservative approach is warranted, the embedded wood member condition can be eliminated entirely by supporting the joist ends outside of the masonry pocket.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerrigan, P.; Norton, P.
This report, Evaluation of the Performance of Houses with and without Supplemental Dehumidification in a Hot-Humid Climate, describes a research study that was conducted by the Building Science Corporation (BSC) Building America Research Team. BSC seeks to research and report on the field monitoring of the performance of in-situ supplemental dehumidification systems in low-energy, high-performance homes in a Hot-Humid climate. The purpose of this research project was to observe and compare the humidity control performance of new, single-family, low-energy, high-performance homes. Specifically, the study sought to compare the interior conditions and mechanical systems operation between two distinct groups of houses: homes with a supplemental dehumidifier installed in addition to the HVAC system, and homes without any supplemental dehumidification. The subjects of the study were ten single-family new construction homes in New Orleans, LA. Data logging equipment was installed at each home in 2012. Interior conditions and various end-use loads were monitored for one year. In terms of averages, the homes with dehumidifiers are limiting elevated levels of humidity in the living space. However, there was significant variation in humidity control between individual houses. An analysis of the equipment operation did not show a clear correlation between energy use and humidity levels. In general, no single explanatory variable appears to provide a consistent understanding of the humidity control in each house. Indoor humidity is likely due to all of the factors examined, and the specifics of how each house is used by its occupants.
Real-Time Payload Control and Monitoring on the World Wide Web
NASA Technical Reports Server (NTRS)
Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)
1998-01-01
World Wide Web (W3) technologies such as the Hypertext Transfer Protocol (HTTP) and the Java object-oriented programming environment offer a powerful, yet relatively inexpensive, framework for distributed application software development. This paper describes the design of a real-time payload control and monitoring system that was developed with W3 technologies at NASA Ames Research Center. Based on the Java Development Kit (JDK) 1.1, the system uses an event-driven "publish and subscribe" approach to inter-process communication and graphical user-interface construction. A C Language Integrated Production System (CLIPS) compatible inference engine provides the back-end intelligent data processing capability, while the Oracle Relational Database Management System (RDBMS) provides the data management function. Preliminary evaluation shows acceptable performance for some classes of payloads, with Java's portability and multimedia support identified as the most significant benefit.
Monitoring Programs Using Rewriting
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Lan, Sonie (Technical Monitor)
2001-01-01
We present a rewriting algorithm for efficiently testing future time Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications, corresponding to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property and then suggest an optimized algorithm based on transforming LTL formulae. We use the Maude rewriting logic, which turns out to be a good notation and is supported by an efficient rewriting engine for performing these experiments. The work constitutes part of the Java PathExplorer (JPAX) project, the purpose of which is to develop a flexible tool for monitoring Java program executions.
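A small Python sketch of the formula-rewriting idea (in place of Maude): each trace event rewrites, or "progresses", the formula, and whatever remains at the end of the trace is resolved with finite-trace conventions. The operator set and the end-of-trace choices here are a common simplification, not the JPAX implementation.

```python
# Formulas: "p" (atom), ("not", f), ("and", f, g), ("or", f, g),
# ("X", f), ("U", f, g), ("F", f), ("G", f); True/False are literals.

def neg(f):  return (not f) if isinstance(f, bool) else ("not", f)
def conj(a, b):
    if a is False or b is False: return False
    if a is True: return b
    if b is True: return a
    return ("and", a, b)
def disj(a, b):
    if a is True or b is True: return True
    if a is False: return b
    if b is False: return a
    return ("or", a, b)

def step(f, event):
    """Rewrite formula f against one event (a set of atomic propositions)."""
    if isinstance(f, bool):
        return f
    if isinstance(f, str):
        return f in event
    op = f[0]
    if op == "not": return neg(step(f[1], event))
    if op == "and": return conj(step(f[1], event), step(f[2], event))
    if op == "or":  return disj(step(f[1], event), step(f[2], event))
    if op == "X":   return f[1]
    if op == "U":   # f1 U f2  ->  step(f2) or (step(f1) and f1 U f2)
        return disj(step(f[2], event), conj(step(f[1], event), f))
    if op == "F":   return disj(step(f[1], event), f)
    if op == "G":   return conj(step(f[1], event), f)

def at_end(f):
    """Resolve what is left when the trace ends: X/F/U default to false, G to true."""
    if isinstance(f, bool): return f
    if isinstance(f, str):  return False
    op = f[0]
    if op == "not": return not at_end(f[1])
    if op == "and": return at_end(f[1]) and at_end(f[2])
    if op == "or":  return at_end(f[1]) or at_end(f[2])
    if op in ("X", "F", "U"): return False
    return True  # "G"

def monitor(formula, trace):
    for event in trace:
        formula = step(formula, event)
        if isinstance(formula, bool):
            return formula            # verdict reached before the trace ended
    return at_end(formula)

# "every request is eventually acknowledged": G(req -> F ack)
spec = ("G", ("or", ("not", "req"), ("F", "ack")))
print(monitor(spec, [{"req"}, set(), {"ack"}, set()]))   # True
print(monitor(spec, [{"req"}, set(), set()]))            # False (ack never seen)
```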
Igne, Benoît; de Juan, Anna; Jaumot, Joaquim; Lallemand, Jordane; Preys, Sébastien; Drennen, James K; Anderson, Carl A
2014-10-01
The implementation of a blend monitoring and control method based on a process analytical technology such as near infrared spectroscopy requires the selection and optimization of numerous criteria that will affect the monitoring outputs and expected blend end-point. Using a five component formulation, the present article contrasts the modeling strategies and end-point determination of a traditional quantitative method based on the prediction of the blend parameters employing partial least-squares regression with a qualitative strategy based on principal component analysis and Hotelling's T² and residual distance to the model, called Prototype. The possibility to monitor and control blend homogeneity with multivariate curve resolution was also assessed. The implementation of the above methods in the presence of designed experiments (with variation of the amount of active ingredient and excipients) and with normal operating condition samples (nominal concentrations of the active ingredient and excipients) was tested. The impact of criteria used to stop the blends (related to precision and/or accuracy) was assessed. Results demonstrated that while all methods showed similarities in their outputs, some approaches were preferred for decision making. The selectivity of regression based methods was also contrasted with the capacity of qualitative methods to determine the homogeneity of the entire formulation. Copyright © 2014. Published by Elsevier B.V.
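A sketch of the qualitative monitoring route described above, assuming synthetic stand-ins for preprocessed NIR spectra: fit a PCA model on spectra of well-mixed blends, then screen new spectra with Hotelling's T² and the residual Q statistic against empirical limits. The component count, limits, and data are illustrative assumptions, not the Prototype software.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Synthetic stand-ins for preprocessed NIR spectra (rows) of well-mixed reference blends.
wavelengths = 200
reference = rng.normal(0.0, 1.0, (60, wavelengths)).cumsum(axis=1) * 0.01
pca = PCA(n_components=3).fit(reference)

def t2_and_q(spectra):
    """Hotelling's T2 (distance within the PCA model) and Q (residual distance to it)."""
    scores = pca.transform(spectra)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
    residual = spectra - pca.inverse_transform(scores)
    q = np.sum(residual**2, axis=1)
    return t2, q

# Empirical control limits from the reference (homogeneous) blends, e.g. 95th percentile.
t2_ref, q_ref = t2_and_q(reference)
t2_lim, q_lim = np.percentile(t2_ref, 95), np.percentile(q_ref, 95)

# A new in-process spectrum is consistent with the homogeneous-blend model
# (a possible end-point criterion) when both statistics fall below their limits.
new_spectrum = reference.mean(axis=0, keepdims=True) + rng.normal(0, 0.002, (1, wavelengths))
t2_new, q_new = t2_and_q(new_spectrum)
print(f"T2={t2_new[0]:.2f} (limit {t2_lim:.2f}), Q={q_new[0]:.4f} (limit {q_lim:.4f})")
print("blend consistent with homogeneity model:", bool(t2_new[0] < t2_lim and q_new[0] < q_lim))
```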
NASA Astrophysics Data System (ADS)
Ugolini, Giovanni Stefano; Occhetta, Paola; Saccani, Alessandra; Re, Francesca; Krol, Silke; Rasponi, Marco; Redaelli, Alberto
2018-04-01
In vitro blood-brain barrier models are highly relevant for drug screening and drug development studies, due to the challenging task of understanding the transport mechanism of drug molecules through the blood-brain barrier towards the brain tissue. In this respect, microfluidics holds potential for providing microsystems that require low amounts of cells and reagents and can be potentially multiplexed for increasing the ease and throughput of the drug screening process. We here describe the design, development and validation of a microfluidic device for endothelial blood-brain barrier cell transport studies. The device comprises two microstructured layers (top culture chamber and bottom collection chamber) sandwiching a porous membrane for the cell culture. Microstructured layers include two pairs of physical electrodes, embedded into the device layers by geometrically defined guiding channels with computationally optimized positions. These electrodes allow the use of commercial electrical measurement systems for monitoring trans-endothelial electrical resistance (TEER). We employed the designed device for performing preliminary assessment of endothelial barrier formation with murine brain endothelial cells (Br-bEnd5). Results demonstrate that cellular junctional complexes effectively form in the cultures (expression of VE-Cadherin and ZO-1) and that the TEER monitoring system effectively detects an increase of resistance of the cultured cell layers indicative of tight junction formation. Finally, we validate the use of the described microsystem for drug transport studies, demonstrating that Br-bEnd5 cells significantly hinder the transport of molecules (40 kDa and 4 kDa dextran) from the top culture chamber to the bottom collection chamber.
ZHANG, JUN-JING; NIU, JIAN-XIANG; YUE, GEN-QUAN; ZHONG, HAI-YAN; MENG, XING-KAI
2012-01-01
This study aimed to develop a new auxiliary heterotopic partial liver transplantation (AHPLT) technique in minipigs using a model of liver cirrhosis. Based on our previous study, 14 minipigs were induced to cirrhosis by administration of carbon tetrachloride (CCl4) through intraperitoneal injection. All of the cirrhotic animals were utilized as recipients. The donor’s liver was placed on the recipient’s splenic bed, and the anastomosis was performed as follows: end-to-end anastomosis between the donor’s portal vein and the recipient’s splenic vein, end-to-side anastomosis between the donor’s suprahepatic vena cava and the recipient’s suprahepatic vena cava, and end-to-end anastomosis between the donor’s hepatic artery and the recipient’s splenic artery. The common bile duct of the donor was intubated and bile was collected with an extracorporeal bag. Vital signs, portal vein pressure (PVP), hepatic venous pressure (HVP) and portal vein pressure gradient (PVPG) were monitored throughout the transplantation. All 8 minipigs that developed liver cirrhosis were utilized to establish the new AHPLT; 7 cases survived. Following the surgical intervention, the PVP and PVPG of the recipients were lower than those prior to the operation (P<0.05), whereas the PVP and PVPG of the donors increased significantly compared to those of the normal animals (P<0.05). A new operative technique for AHPLT has been successfully described herein using a model of liver cirrhosis. PMID:22969983
Riphaus, Andrea; Slottje, Mark; Bulla, Jan; Keil, Carolin; Mentzel, Christian; Limbach, Vera; Schultz, Barbara; Unzicker, Christian
2017-10-01
Sedation for colonoscopy using intravenous propofol has become standard in many Western countries. Gender-specific differences have been shown for general anaesthesia in dentistry, but no such data existed for gastrointestinal endoscopy. A prospective observational study. An academic teaching hospital of Hannover Medical School. A total of 219 patients (108 women and 111 men) scheduled for colonoscopy. Propofol sedation using electroencephalogram monitoring during a constant level of sedation depth (D0 to D2) performed by trained nurses or physicians after a body-weight-adjusted loading dose. The primary end-point was the presence of gender-specific differences in awakening time (time from end of sedation to eye-opening and complete orientation); secondary outcome parameters analysed were total dose of propofol, sedation-associated complications (bradycardia, hypotension, hypoxaemia and apnoea), patient cooperation and patient satisfaction. Multivariate analysis was performed to correct confounding factors such as age and BMI. Women awakened significantly faster than men, with a time to eye-opening of 7.3 ± 3.7 versus 8.4 ± 3.4 min (P = 0.005) and time until complete orientation of 9.1 ± 3.9 versus 10.4 ± 13.7 min (P = 0.008). The propofol dosage was not significantly different, with some trend towards more propofol per kg body weight in women (3.98 ± 1.81 mg versus 3.72 ± 1.75 mg, P = 0.232). The effect of gender aspects should be considered when propofol is used as sedation for gastrointestinal endoscopy. That includes adequate dosing for women as well as caution regarding potential overdosing of male patients. ClinicalTrials.gov (Identifier: NCT02687568).
NASA Astrophysics Data System (ADS)
Scheyer, Austin G.; Anton, Steven R.
2017-04-01
Embedding sensors within additive manufactured (AM) structures gives the ability to develop smart structures that are capable of monitoring the mechanical health of a system. AM provides an opportunity to embed sensors within a structure during the manufacturing process. One major limitation of AM technology is the ability to verify the geometric and material properties of fabricated structures. Over the past several years, the electromechanical impedance (EMI) method for structural health monitoring (SHM) has been proven to be an effective method for sensing damage in structures. The EMI method utilizes the coupling between the electrical and mechanical properties of a piezoelectric transducer to detect a change in the dynamic response of a structure. A piezoelectric device, usually a lead zirconate titanate (PZT) ceramic wafer, is bonded to a structure and the electrical impedance is measured across a range of frequencies. A change in the electrical impedance is directly correlated to changes made to the mechanical condition of the structure. In this work, the EMI method is employed on piezoelectric transducers embedded inside AM parts to evaluate the feasibility of performing SHM on parts fabricated using additive manufacturing. The fused deposition modeling (FDM) method is used to print specimens for this feasibility study. The specimens are printed from polylactic acid (PLA) in the shape of a beam with an embedded monolithic piezoelectric ceramic disc. The specimen is mounted as a cantilever while impedance measurements are taken using an HP 4194A impedance analyzer. Both nondestructive and destructive damage are simulated in the specimens by adding an end mass and drilling a hole near the free end of the cantilever, respectively. The Root Mean Square Deviation (RMSD) method is utilized as a metric for quantifying damage to the system. In an effort to determine a threshold for RMSD, the values are calculated for the variation associated with taking multiple measurements and with re-clamping the cantilever, and are determined to be 0.154 and 3.125, respectively. The RMSD value of the cantilever with a 400 g end mass is 11.39, and the RMSD value of the cantilever with a 4 mm hole near the end is 12.15. From these results, it can be determined that the damaged cases have much higher RMSD values than those associated with the measurement and setup variability of the healthy structure.
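For reference, a common form of the RMSD damage metric used above is sketched below on hypothetical impedance signatures; the exact normalization used by the authors may differ.

```python
import numpy as np

def rmsd_percent(z_baseline, z_current):
    """Common EMI damage metric: root-mean-square deviation (in %) between the real
    parts of a baseline and a current impedance spectrum taken over the same frequencies."""
    zb, zc = np.real(np.asarray(z_baseline)), np.real(np.asarray(z_current))
    return 100.0 * np.sqrt(np.sum((zc - zb) ** 2) / np.sum(zb ** 2))

# Hypothetical impedance signatures (ohms) over a swept frequency band.
freq = np.linspace(10e3, 100e3, 400)
baseline = 50 + 10 * np.sin(2 * np.pi * freq / 30e3)
remeasured = baseline + np.random.default_rng(3).normal(0, 0.1, freq.size)   # re-measurement scatter
shifted = 50 + 10 * np.sin(2 * np.pi * (freq - 1.5e3) / 30e3)                # resonance shift from damage

print(f"measurement-to-measurement RMSD: {rmsd_percent(baseline, remeasured):.2f}%")
print(f"RMSD after a simulated resonance shift: {rmsd_percent(baseline, shifted):.2f}%")
```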
Zwanenburg, Alex; Andriessen, Peter; Jellema, Reint K; Niemarkt, Hendrik J; Wolfs, Tim G A M; Kramer, Boris W; Delhaas, Tammo
2015-03-01
Seizures below one minute in duration are difficult to assess correctly using seizure detection algorithms. We aimed to improve neonatal seizure detection algorithm performance for short seizures through the use of trend templates for seizure onset and end. Bipolar EEG was recorded within a transiently asphyxiated ovine model at 0.7 gestational age, a common experimental model for studying brain development in humans of 30-34 weeks of gestation. Transient asphyxia led to electrographic seizures within 6-8 h. A total of 3159 seizures, 2386 shorter than one minute, were annotated in 1976 h-long EEG recordings from 17 foetal lambs. To capture EEG characteristics, five features, sensitive to seizures, were calculated and used to derive trend information. Feature values and trend information were used as input for support vector machine classification and subsequently post-processed. Performance metrics, calculated after post-processing, were compared between analyses with and without employing trend information. Detector performance was assessed after five-fold cross-validation conducted ten times with random splits. The use of trend templates for seizure onset and end in a neonatal seizure detection algorithm significantly improves the correct detection of short seizures using two-channel EEG recordings, from 54.3% (52.6-56.1) to 59.5% (58.5-59.9) at an FDR of 2.0 (median (range); p < 0.001, Wilcoxon signed rank test). Using trend templates might therefore aid in the detection of short seizures by EEG monitoring at the NICU.
Electron launching voltage monitor
Mendel, C.W.; Savage, M.E.
1992-03-17
An electron launching voltage monitor measures MITL voltage using a relationship between the anode electric field and the electron current launched from a cathode-mounted perturbation. An electron launching probe extends through and is spaced from the edge of an opening in a first MITL conductor, one end of the launching probe being in the gap between the MITL conductors, the other end being adjacent to a first side of the first conductor away from the second conductor. A housing surrounds the launching probe and electrically connects the first side of the first conductor to the other end of the launching probe. A detector detects the current passing through the housing to the launching probe, the detected current being representative of the voltage between the conductors. 5 figs.
NASA Astrophysics Data System (ADS)
Hann, Swook; Kim, Dong-Hwan; Park, Chang-Soo
2006-04-01
A monitoring technique for multiple power splitter-passive optical networks (PS-PON) is presented. The technique is based on remote sensing of a fiber Bragg grating (FBG) using a tunable OTDR. To monitor the multiple PS-PON, an FBG can be used as a wavelength-dependent reflective reference on each branch end of the PS. The FBG helps discern individual events in the multiple PS-PON during monitoring, in combination with the Rayleigh backscattered power information. The multiple PS-PON can be analyzed by this monitoring method at the central office under 10-Gbit/s in-service conditions.
Developing an operational rangeland water requirement satisfaction index
Senay, Gabriel B.; Verdin, James P.; Rowland, James
2011-01-01
Developing an operational water requirement satisfaction index (WRSI) for rangeland monitoring is an important goal of the famine early warning systems network. An operational WRSI has been developed for crop monitoring, but until recently a comparable WRSI for rangeland was not successful because of the extremely poor performance of the index when based on published crop coefficients (Kc) for rangelands. To improve the rangeland WRSI, we developed a simple calibration technique that adjusts the Kc values for rangeland monitoring using long-term rainfall distribution and reference evapotranspiration data. The premise for adjusting the Kc values is based on the assumption that a viable rangeland should exhibit above-average WRSI (values >80%) during a normal year. The normal year was represented by a median dekadal rainfall distribution (satellite rainfall estimate from 1996 to 2006). Similarly, a long-term average for potential evapotranspiration was used as input to the famine early warning systems network WRSI model in combination with soil-water-holding capacity data. A dekadal rangeland WRSI has been operational for east and west Africa since 2005. User feedback has been encouraging, especially with regard to the end-of-season WRSI anomaly products that compare the index's performance to 'normal' years. Currently, rangeland WRSI products are generated on a dekadal basis and posted for free distribution on the US Geological Survey early warning website at http://earlywarning.usgs.gov/adds/
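The calibration premise, that a median-rainfall ('normal') year should yield a WRSI above 80%, can be illustrated with a toy water-balance model. The Python sketch below uses a single-bucket soil store and a simple downward scaling of the Kc curve; the operational FEWS NET WRSI model is considerably more detailed, so this is only a schematic of the calibration idea, with all parameter values assumed.

import numpy as np

def wrsi(rain_dekads, pet_dekads, kc_dekads, whc=100.0):
    """Very simplified WRSI: 100 * (seasonal water demand met) / (seasonal demand),
    using a single soil bucket with water holding capacity `whc` (mm)."""
    soil, supplied, required = whc, 0.0, 0.0
    for rain, pet, kc in zip(rain_dekads, pet_dekads, kc_dekads):
        demand = kc * pet
        soil = min(soil + rain, whc)
        met = min(soil, demand)
        soil -= met
        supplied += met
        required += demand
    return 100.0 * supplied / required

def calibrate_kc(kc, median_rain, mean_pet, target=80.0):
    """Scale the published Kc curve down until the median ('normal') year
    reaches the target WRSI, per the calibration premise in the abstract."""
    kc = np.asarray(kc, dtype=float)
    scale = 1.0
    while wrsi(median_rain, mean_pet, scale * kc) < target and scale > 0.1:
        scale -= 0.01
    return scale * kc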
Zhou, Yu; Du, Juan; Hou, Hong-Yan; Lu, Yan-Fang; Yu, Jing; Mao, Li-Yan; Wang, Feng; Sun, Zi-Yong
2017-01-01
Tuberculosis (TB) is a leading global public health problem. To achieve the End TB Strategy, non-invasive markers for diagnosis and treatment monitoring of TB disease are urgently needed, especially in high-endemic countries such as China. Interferon-gamma release assays (IGRAs) and the tuberculin skin test (TST), frequently used immunological methods for TB detection, are intrinsically unable to discriminate active tuberculosis (ATB) from latent tuberculosis infection (LTBI). Thus, the specificity of these methods in the diagnosis of ATB depends on the local prevalence of LTBI. Pathogen-detection methods such as acid-fast staining and culture all have limitations in clinical application. ImmunoScore (IS) is a promising prognostic tool commonly used in tumor prognosis. However, the importance of host immunity has also been demonstrated in TB pathogenesis, which implies the possibility of using an IS model for ATB diagnosis and therapy monitoring. In the present study, we focused on the performance of the IS model in differentiating between ATB and LTBI and in treatment monitoring of TB disease. We screened a total of five immunological markers (four non-specific markers and one TB-specific marker) and successfully established the IS model using Lasso logistic regression analysis. As expected, the IS model can effectively distinguish ATB from LTBI (with a sensitivity of 95.7% and a specificity of 92.1%) and also has potential value in the treatment monitoring of TB disease.
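Lasso (L1-penalized) logistic regression fits the diagnostic model while driving the weights of uninformative markers toward zero, which is how a sparse marker panel is selected. A minimal scikit-learn sketch on placeholder data is shown below; the actual marker panel, regularization strength, and the reported sensitivity and specificity are taken from the study and not reproduced here.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

# X: five immunological markers per patient (placeholder values),
# y: 1 = active TB (ATB), 0 = latent TB infection (LTBI).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = rng.integers(0, 2, size=200)

# The L1 (lasso) penalty shrinks uninformative marker coefficients to zero.
model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, y)
pred = model.predict(X)
sensitivity = recall_score(y, pred)               # true-positive rate for ATB
specificity = recall_score(y, pred, pos_label=0)  # true-negative rate for LTBI
print(model.coef_, sensitivity, specificity)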
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ueno, Kohta
Adding insulation to the interior side of walls of masonry buildings in cold (and wet) climates may cause performance and durability problems. Some concerns, such as condensation and freeze-thaw, have known solutions, but wood members embedded in the masonry structure will be colder (and potentially wetter) after an interior insulation retrofit. Moisture content and relative humidity were monitored at joist ends in historic mass brick masonry walls retrofitted with interior insulation in a cold climate (Zone 5A); data were collected from 2012-2015. Eleven joist ends were monitored in all four orientations. One limitation of these results is that the renovation is still ongoing, with limited wintertime construction heating and no permanent occupancy to date. Measurements show that many joist ends remain at high moisture contents, especially at north- and east-facing orientations, with constant 100% RH conditions in the worst cases. These high moisture levels are not conducive to wood durability, but no evidence of actual structural damage has been observed. Insulated versus non-insulated joist pockets do not show large differences. South-facing joists have safe (10-15%) moisture contents. Given the uncertainty pointed out by research, definitive guidance on the vulnerability of embedded wood members is difficult to formulate. In high-risk situations, or when a very conservative approach is warranted, the embedded wood member condition can be eliminated entirely by supporting the joist ends outside of the masonry pocket.
Fetal Heart Rate Monitoring during Intrauterine Open Surgery for Myelomeningocele Repair.
Santana, Eduardo Félix Martins; Moron, Antônio Fernandes; Barbosa, Maurício Mendes; Milani, Herbene Jose Figuinha; Sarmento, Stephanno Gomes Pereira; Araujo Júnior, Edward; Rolo, Liliam Cristine; Cavalheiro, Sérgio
2016-01-01
The aim of this study was to assess fetal hemodynamics during intrauterine open surgery for myelomeningocele (MMC) repair by describing fetal heart rate (FHR) monitoring in detail for each part of the procedure. A study was performed with 57 fetuses submitted to intrauterine MMC repair between the 24th and 27th week of gestation. Evaluations of FHR were made at specific periods: before anesthesia, after anesthesia, at the beginning of laparotomy, during withdrawal of the uterus from the abdomen, hysterotomy, neurosurgery (before incision, during early skin manipulation, spinal cord release, and at the end of neurosurgery), reintroduction of the uterus into the abdominal cavity, abdominal closure, and at the end of surgery. Means ± standard deviations of FHR were established for each period, and analysis of variance with repeated measures was used to assess differences between these periods. The mean differences were assessed with 95% confidence intervals and were analyzed by Tukey's multiple comparison test. The mean FHR during the specific periods mentioned above was 140.2, 140, 139.2, 138.8, 135.1, 133.9, 123.1, 134.0, 134.5, 137.9, and 139.9 bpm, respectively (p < 0.0001). Comparing the different periods, the highest frequencies were observed at the initial and final moments. The neurosurgery stage presented lower frequencies, especially during release of the spinal cord. FHR monitoring revealed interesting findings on physiological fetal changes during MMC repair, especially during neurosurgery, which was the most critical period. © 2015 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Rucinski, Marek; Coates, Adam; Montano, Giuseppe; Allouis, Elie; Jameux, David
2015-09-01
The Lightweight Advanced Robotic Arm Demonstrator (LARAD) is a state-of-the-art, two-meter long robotic arm for planetary surface exploration currently being developed by a UK consortium led by Airbus Defence and Space Ltd under contract to the UK Space Agency (CREST-2 programme). LARAD has a modular design, which allows for experimentation with different electronics and control software. The control system architecture includes the on-board computer, control software and firmware, and the communication infrastructure (e.g. data links, switches) connecting on-board computer(s), sensors, actuators and the end-effector. The purpose of the control system is to operate the arm according to pre-defined performance requirements, monitoring its behaviour in real-time and performing safing/recovery actions in case of faults. This paper reports on the results of a recent study about the feasibility of the development and integration of a novel control system architecture for LARAD fully based on the SpaceWire protocol. The current control system architecture is based on the combination of two communication protocols, Ethernet and CAN. The new SpaceWire-based control system will allow for improved monitoring and telecommanding performance thanks to higher communication data rate, allowing for the adoption of advanced control schemes, potentially based on multiple vision sensors, and for the handling of sophisticated end-effectors that require fine control, such as science payloads or robotic hands.
Attribution of movement: Potential links between subjective reports of agency and output monitoring.
Sugimori, Eriko; Asai, Tomohisa
2015-01-01
According to agency memory theory, individuals decide whether "I did it" based on a memory trace of "I am doing it". The purpose of this study was to validate the agency memory theory. To this end, several hand actions were individually presented as samples, and participants were asked to perform the sample action, observe the performance of that action by another person, or imagine performing the action. Online feedback received by the participants during the action was manipulated among the different conditions, and output monitoring, in which participants were asked whether they had performed each hand action, was conducted. The rate at which respondents thought that they themselves had performed the action was higher when visual feedback was unaltered than when it was altered (Experiment 1A), and this tendency was observed across all types of altered feedback (Experiment 1B). The observation of an action performed by the hand of another person did not increase the rate at which respondents thought that they themselves had performed the action unless the participants actually did perform the action (Experiments 2A and 2B). In Experiment 3, a relationship was observed between the subjective feeling that "I am the one who is causing an action" and the memory that "I did perform the action". These experiments support the hypothesis that qualitative information and sense of "self" are tagged in a memory trace and that such tags can be used as cues for judgements when the memory is related to the "self".
Kobayashi, Leo; Gosbee, John W; Merck, Derek L
2017-07-01
(1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xue, Junfeng; Isern, Nancy G.; Ewing, R James
An in-situ nuclear magnetic resonance (NMR) bioreactor was developed and employed to monitor microbial metabolism under batch-growth conditions in real time. We selected Moorella thermoacetica ATCC 49707 as a test case. M. thermoacetica (formerly Clostridium thermoaceticum) is a strictly anaerobic, thermophilic, acetogenic, gram-positive bacterium with potential for industrial production of chemicals. The metabolic profiles of M. thermoacetica were characterized during growth in batch mode on xylose (a component of lignocellulosic biomass) using the new generation NMR bioreactor in combination with high-resolution, high sensitivity NMR (HR-NMR) spectroscopy. In-situ NMR measurements were performed using water-suppressed H-1 NMR spectroscopy at an NMR frequency of 500 MHz, and aliquots of the bioreactor contents were taken for 600 MHz HR-NMR spectroscopy at specific intervals to confirm metabolite identifications and expand metabolite coverage. M. thermoacetica demonstrated the metabolic potential to produce formate, ethanol and methanol from xylose, in addition to its known capability of producing acetic acid. Real-time monitoring of bioreactor conditions showed a temporary pH decrease, with a concomitant increase in formic acid during exponential growth. Fermentation experiments performed outside of the magnet showed that the strong magnetic field employed for NMR detection did not significantly affect cell metabolism. Use of the in-situ NMR bioreactor facilitated monitoring of the fermentation process in real time, enabling identification of intermediate and end-point metabolites and their correlation with pH and biomass produced during culture growth. Real-time monitoring of culture metabolism using the NMR bioreactor in combination with the HR-NMR spectroscopy will allow optimization of the metabolism of microorganisms producing valuable bioproducts.
Mondésert, Blandine; Andrade, Jason G; Khairy, Paul; Guerra, Peter G; Dyrda, Katia; Macle, Laurent; Rivard, Léna; Thibault, Bernard; Talajic, Mario; Roy, Denis; Dubuc, Marc; Shohoudi, Azadeh
2014-08-01
Phrenic nerve palsy remains the most frequent complication associated with cryoballoon-based pulmonary vein (PV) isolation. We sought to characterize our experience using a novel monitoring technique for the prevention of phrenic nerve palsy. Two hundred consecutive cryoballoon-based PV isolation procedures between October 2010 and October 2013 were studied. In addition to standard abdominal palpation during right phrenic nerve pacing from the superior vena cava, all patients underwent diaphragmatic electromyographic monitoring using surface electrodes. Cryoablation was terminated on any perceived reduction in diaphragmatic motion or a 30% decrease in the compound motor action potential (CMAP). During right-sided ablation, a ≥30% reduction in CMAP amplitude occurred in 49 patients (24.5%). Diaphragmatic motion decreased in 30 of the 49 patients and was preceded by a 30% reduction in CMAP amplitude in all. In 82% of cases, this reduction in CMAP amplitude occurred during right superior PV isolation. The baseline CMAP amplitude was 946.5±609.2 mV and decreased by 13.8±13.8% at the end of the application. This decrease was more marked in the 33 PVs with a reduction in diaphragmatic motion than in those without (40.9±15.3% versus 11.3±10.5%; P<0.001). In 3 cases, phrenic nerve palsy persisted beyond the end of the procedure, with all cases recovering within 6 months. Despite the shortened applications, all veins were isolated. At repeat procedures, the right-sided PVs reconnected less frequently than the left-sided PVs in those with phrenic nerve palsy. Electromyographic phrenic nerve monitoring using the surface CMAP is reliable, easy to perform, and offers an early warning of impending phrenic nerve injury. © 2014 American Heart Association, Inc.
Whole blood coagulation analyzers.
1997-08-01
Whole blood coagulation analyzers (WBCAs) are widely used point-of-care (POC) testing devices found primarily in cardiothoracic surgical suites and cardiac catheterization laboratories. Most of these devices can perform a number of coagulation tests that provide information about a patient's blood clotting status. Clinicians use the results of the WBCA tests, which are available minutes after applying a blood sample, primarily to monitor the effectiveness of heparin therapy--an anticoagulation therapy used during cardiopulmonary bypass (CPB) surgery, angioplasty, hemodialysis, and other clinical procedures. In this study we evaluated five WBCAs from four suppliers. Our testing focused on the applications for which WBCAs are primarily used: Monitoring moderate to high heparin levels, as would be required, for example, during CPB or angioplasty. For this function, WBCAs are typically used to perform an activated clotting time (ACT) test or, as one supplier refers to its test, a heparin management test (HMT). All models included in this study offered an ACT test or an HMT. Monitoring low heparin levels, as would be required, for example, during hemodialysis. For this function, WBCAs would normally be used to perform either a low-range ACT (LACT) test or a whole blood activated partial thromboplastin time (WBAPTT) test. Most of the evaluated units could perform at least one of these tests; one unit did not offer either test and was therefore not rated for this application. We rated and ranked each evaluated model separately for each of these two applications. In addition, we provided a combined rating and ranking that considers the units' appropriateness for performing both applications. We based our conclusions on a unit's performance and human factors design, as determined by our testing, and on its five-year life-cycle cost, as determined by our net present value (NPV) analysis. While we rated all evaluated units acceptable for each appropriate category, we did identify some significant differences that enabled us to rank the units in order of preference. We have included a Selection, Purchasing, and Use Guide at the end of this study to help facilities identify the unit that will best meet their needs.
Real-time people counting system using a single video camera
NASA Astrophysics Data System (ADS)
Lefloch, Damien; Cheikh, Faouzi A.; Hardeberg, Jon Y.; Gouton, Pierre; Picot-Clemente, Romain
2008-02-01
There is growing interest in video-based solutions for people monitoring and counting in business and security applications. Compared to classic sensor-based solutions, video-based ones allow more versatile functionality and improved performance at lower cost. In this paper, we propose a real-time system for people counting based on a single low-end, non-calibrated video camera. The two main challenges addressed in this paper are robust estimation of the scene background and of the number of real persons in merge-split scenarios. The latter is likely to occur whenever multiple persons move closely together, e.g. in shopping centers: several persons may be treated as a single person by automatic segmentation algorithms, due to occlusions or shadows, leading to under-counting. Therefore, to account for noise, illumination changes and changes in static objects, background subtraction is performed using an adaptive background model (updated over time based on motion information) and automatic thresholding. Furthermore, post-processing of the segmentation results is performed in the HSV color space to remove shadows. Moving objects are tracked using an adaptive Kalman filter, allowing robust estimation of the objects' future positions even under heavy occlusion. The system is implemented in Matlab and gives encouraging results even at high frame rates. Experimental results obtained on the PETS2006 datasets are presented at the end of the paper.
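The adaptive background model described above blends each new frame into the running background estimate except where motion is detected, and foreground pixels are then found by thresholding the difference image. A minimal NumPy sketch of that idea follows; the paper's Matlab implementation, exact blending rate, and thresholding rule are not specified, so the parameters and function names here are illustrative.

import numpy as np

def update_background(background, frame, motion_mask, alpha=0.05):
    """Running-average background model: pixels flagged as moving are not
    blended in, so foreground objects do not pollute the background estimate."""
    blended = (1.0 - alpha) * background + alpha * frame
    return np.where(motion_mask, background, blended)

def segment(frame, background, k=2.5):
    """Foreground mask via simple automatic thresholding: a pixel is foreground
    when its deviation from the background exceeds k times the global noise level."""
    diff = np.abs(frame.astype(float) - background)
    return diff > k * diff.std()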
On the reliable use of satellite-derived surface water products for global flood monitoring
NASA Astrophysics Data System (ADS)
Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.
2015-12-01
Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications has not been assessed in large parts of the globe. In this study, we performed comparative evaluations of commonly used satellite-based global flood detection systems and an operational flood forecasting system using 10 major flood cases reported over three years (2012-2014). Specifically, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS) and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS), for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the existing limitations of each system. Finally, we propose possible ways forward to improve the reliability of large-scale flood monitoring tools.
Space Communications Artificial Intelligence for Link Evaluation Terminal (SCAILET)
NASA Technical Reports Server (NTRS)
Shahidi, Anoosh
1991-01-01
A software application to assist end-users of the Link Evaluation Terminal (LET) for satellite communication is being developed. This software application incorporates artificial intelligence (AI) techniques and will be deployed as an interface to LET. The high burst rate (HBR) LET provides 30 GHz transmitting/20 GHz receiving, 220/110 Mbps capability for wideband communications technology experiments with the Advanced Communications Technology Satellite (ACTS). The HBR LET and ACTS are being developed at the NASA Lewis Research Center. The HBR LET can monitor and evaluate the integrity of the HBR communications uplink and downlink to the ACTS satellite. The uplink HBR transmission is performed by bursting the bit pattern as a modulated signal to the satellite. By comparing the transmitted bit pattern with the received bit pattern, HBR LET can determine the bit error rate (BER) under various atmospheric conditions. An algorithm for power augmentation is applied to enhance the system's BER performance at reduced signal strength caused by adverse conditions. Programming scripts, defined by the design engineer, set up the HBR LET terminal by programming subsystem devices through IEEE 488 interfaces. However, the scripts are difficult to use, require a steep learning curve, are cryptic, and are hard to maintain. The combination of the learning curve and the complexities involved in editing the script files may discourage end-users from utilizing the full capabilities of the HBR LET system. An intelligent assistant component of SCAILET that addresses critical end-user needs in the programming of the HBR LET system, as anticipated by its developers, is described. A close look is taken at the various steps involved in writing ECM software for a C&P computer and at how the intelligent assistant improves the HBR LET system and enhances the end-user's ability to perform the experiments.
NASA Astrophysics Data System (ADS)
Klaessens, John H.; van der Veen, Albert; Verdaasdonk, Rudolf M.
2017-03-01
Recently, low-cost smartphone-based thermal cameras are being considered for use in a clinical setting for monitoring physiological temperature responses such as body temperature change, local inflammation, perfusion changes or (burn) wound healing. These thermal cameras contain uncooled micro-bolometers with an internal calibration check and have a temperature resolution of 0.1 degree. For clinical applications a fast quality measurement before use is required (absolute temperature check) and quality control (stability, repeatability, absolute temperature, absolute temperature differences) should be performed regularly. Therefore, a calibrated temperature phantom has been developed based on thermistor heating at both ends of a black-coated metal strip to create a controllable temperature gradient from room temperature (26 °C) up to 100 °C. The absolute temperatures on the strip are determined with five software-controlled PT-1000 sensors using lookup tables. In this study three FLIR ONE cameras and one high-end camera were checked with this temperature phantom. The results show relatively good agreement between both the low-cost and high-end cameras and the phantom temperature gradient, with temperature differences of 1 degree up to 6 degrees between the cameras and the phantom. The measurements were repeated to assess absolute temperature and temperature stability over the sensor area. Both the low-cost and high-end thermal cameras measured relative temperature changes with high accuracy and absolute temperatures with constant deviations. Low-cost smartphone-based thermal cameras can be a good alternative to high-end thermal cameras for routine clinical measurements, appropriate to the research question, provided regular calibration checks for quality control are performed.
Ogasawara, Kuniaki; Inoue, Takashi; Kobayashi, Masakazu; Endo, Hidehoko; Yoshida, Kenji; Fukuda, Takeshi; Terasaki, Kazunori; Ogawa, Akira
2005-02-01
Cerebral hyperperfusion syndrome is a rare but serious complication of carotid endarterectomy (CEA). The aim of the present study was to determine whether intraoperative blood flow velocity (BFV) monitoring in the middle cerebral artery (MCA) using transcranial Doppler ultrasonography (TCD) could be used as a reliable technique to detect cerebral hyperperfusion following CEA by comparing findings with those of brain single photon emission CT (SPECT). Intraoperative BFV monitoring was attempted in 67 patients undergoing CEA for treatment of ipsilateral internal carotid artery (ICA) stenosis (> or =70%). Cerebral blood flow (CBF) was also assessed using SPECT, which was performed before and immediately after CEA. Intraoperative BFV monitoring was achieved in 60 patients. Of the 60 patients, post-CEA hyperperfusion (CBF increase > or =100%, compared with preoperative values) was observed in six patients. The sensitivity, specificity, and positive predictive value of the BFV increases immediately after declamping of the ICA for detecting post-CEA hyperperfusion were 100%, 94%, and 67%, respectively, with a cut-off point 2.0-fold the preclamping BFV. The sensitivity and specificity of the BFV increases at the end of the procedure for detecting post-CEA hyperperfusion were 100% for both parameters, with cut-off points of 2.0- to 2.2-fold the preclamping BFV. Hyperperfusion syndrome developed in two patients with post-CEA hyperperfusion, but intracerebral hemorrhage did not occur. In one of these two patients, BFV monitoring was not possible because of failure to obtain an adequate bone window. Intraoperative MCA BFV monitoring using TCD is a less reliable method to detect cerebral hyperperfusion following CEA than postoperative MCA BFV monitoring, provided adequate monitoring can be achieved.
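The detection rule reported above reduces to a simple ratio test on MCA blood flow velocity. The Python sketch below encodes that rule with the 2.0-fold cut-off quoted in the abstract; the function and variable names are hypothetical and the example values are illustrative only.

def hyperperfusion_risk(bfv_preclamp_cm_s, bfv_end_cm_s, cutoff=2.0):
    """Flag potential post-CEA hyperperfusion when the end-of-procedure MCA
    blood flow velocity is at least `cutoff` times the pre-clamping value."""
    return bfv_end_cm_s >= cutoff * bfv_preclamp_cm_s

# e.g. 45 cm/s before clamping and 95 cm/s at the end of the procedure
print(hyperperfusion_risk(45.0, 95.0))  # True: 95 >= 2.0 * 45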
Tool Condition Monitoring in Micro-End Milling using wavelets
NASA Astrophysics Data System (ADS)
Dubey, N. K.; Roushan, A.; Rao, U. S.; Sandeep, K.; Patra, K.
2018-04-01
In this work, a Tool Condition Monitoring (TCM) strategy is developed for the micro-end milling of titanium alloy and mild steel work-pieces. Full-immersion slot milling experiments are conducted using a solid tungsten carbide end mill for more than 1900 s to produce a reasonable amount of tool wear. During the micro-end milling process, cutting force and vibration signals are acquired using a Kistler piezoelectric 3-component force dynamometer (9256C2) and an accelerometer (NI cDAQ-9188), respectively. The force components and the vibration signals are processed using the Discrete Wavelet Transformation (DWT) in both the time and frequency domains. A 5-level wavelet packet decomposition using the Db-8 wavelet is carried out, and the detail coefficients D1 to D5 for each of the signals are obtained. The results of the wavelet transformation are correlated with tool wear. For the vibration signals, de-noising is performed on the higher-frequency components (D1), while the force signals are de-noised for the lower-frequency components (D5). An increasing MAD (Mean Absolute Deviation) of the detail coefficients for successive channels indicates tool wear. The predictions of tool wear are confirmed against the actual wear observed in SEM images of the worn tool.
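The wear indicator described above boils down to computing the MAD of each wavelet detail band of a force or vibration channel and watching it grow as the tool wears. A minimal Python sketch using PyWavelets is shown below; it assumes a mean-based MAD and a plain multi-level DWT, since the abstract does not give the exact decomposition settings or MAD definition.

import numpy as np
import pywt  # PyWavelets

def detail_mads(signal, wavelet="db8", level=5):
    """5-level DWT of a force or vibration channel; returns the mean absolute
    deviation (MAD) of each detail band D1..D5, which is tracked against wear."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)  # [A5, D5, D4, D3, D2, D1]
    details = coeffs[1:]                                 # D5 .. D1
    return {f"D{level - i}": float(np.mean(np.abs(d - np.mean(d))))
            for i, d in enumerate(details)}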
Data federation strategies for ATLAS using XRootD
NASA Astrophysics Data System (ADS)
Gardner, Robert; Campana, Simone; Duckeck, Guenter; Elmsheuser, Johannes; Hanushevsky, Andrew; Hönig, Friedrich G.; Iven, Jan; Legger, Federica; Vukotic, Ilija; Yang, Wei; Atlas Collaboration
2014-06-01
In the past year the ATLAS Collaboration accelerated its program to federate data storage resources using an architecture based on XRootD with its attendant redirection and storage integration services. The main goal of the federation is an improvement in the data access experience for the end user while allowing more efficient and intelligent use of computing resources. Along with these advances come integration with existing ATLAS production services (PanDA and its pilot services) and data management services (DQ2, and in the next generation, Rucio). Functional testing of the federation has been integrated into the standard ATLAS and WLCG monitoring frameworks and a dedicated set of tools provides high granularity information on its current and historical usage. We use a federation topology designed to search from the site's local storage outward to its region and to globally distributed storage resources. We describe programmatic testing of various federation access modes including direct access over the wide area network and staging of remote data files to local disk. To support job-brokering decisions, a time-dependent cost-of-data-access matrix is made taking into account network performance and key site performance factors. The system's response to production-scale physics analysis workloads, either from individual end-users or ATLAS analysis services, is discussed.
Optimized Autonomous Space In-situ Sensor-Web for volcano monitoring
Song, W.-Z.; Shirazi, B.; Kedar, S.; Chien, S.; Webb, F.; Tran, D.; Davis, A.; Pieri, D.; LaHusen, R.; Pallister, J.; Dzurisin, D.; Moran, S.; Lisowski, M.
2008-01-01
In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, a multidisciplinary team involving sensor-network experts (Washington State University), space scientists (JPL), and Earth scientists (USGS Cascade Volcano Observatory (CVO)) is developing a prototype dynamic and scaleable hazard monitoring sensor-web and applying it to volcano monitoring. The combined Optimized Autonomous Space In-situ Sensor-web (OASIS) will have two-way communication capability between ground and space assets, use both space and ground data for optimal allocation of limited power and bandwidth resources on the ground, and use smart management of competing demands for limited space assets. It will also enable scalability and seamless infusion of future space and in-situ assets into the sensor-web. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been active since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) an efficient self-organization algorithm for the sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real-time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of autonomously tasking the other. Sensor-web data acquisition and dissemination will be accomplished through the use of the Open Geospatial Consortium Sensorweb Enablement protocols. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform. ©2008 IEEE.
NASA Astrophysics Data System (ADS)
Hartmann, H. C.; Pagano, T. C.; Sorooshian, S.; Bales, R.
2002-12-01
Expectations for hydroclimatic research are evolving as changes in the contract between science and society require researchers to provide "usable science" that can improve resource management policies and practices. However, decision makers have a broad range of abilities to access, interpret, and apply scientific research. "High-end users" have technical capabilities and operational flexibility capable of readily exploiting new information and products. "Low-end users" have fewer resources and are less likely to change their decision making processes without clear demonstration of benefits by influential early adopters (i.e., high-end users). Should research programs aim for efficiency, targeting high-end users? Should they aim for impact, targeting decisions with high economic value or great influence (e.g., state or national agencies)? Or should they focus on equity, whereby outcomes benefit groups across a range of capabilities? In this case study, we focus on hydroclimatic variability and forecasts. Agencies and individuals responsible for resource management decisions have varying perspectives about hydroclimatic variability and opportunities for using forecasts to improve decision outcomes. Improper interpretation of forecasts is widespread and many individuals find it difficult to place forecasts in an appropriate regional historical context. In addressing these issues, we attempted to mitigate traditional inequities in the scope, communication, and accessibility of hydroclimatic research results. High-end users were important in prioritizing information needs, while low-end users were important in determining how information should be communicated. For example, high-end users expressed hesitancy to use seasonal forecasts in the absence of quantitative performance evaluations. Our subsequently developed forecast evaluation framework and research products, however, were guided by the need for a continuum of evaluation measures and interpretive materials to enable low-end users to increase their understanding of probabilistic forecasts, credibility concepts, and implications for decision making. We also developed an interactive forecast assessment tool accessible over the Internet, to support resource decisions by individuals as well as agencies. The tool provides tutorials for guiding forecast interpretation, including quizzes that allow users to test their forecast interpretation skills. Users can monitor recent and historical observations for selected regions, communicated using terminology consistent with available forecast products. The tool also allows users to evaluate forecast performance for the regions, seasons, forecast lead times, and performance criteria relevant to their specific decision making situations. Using consistent product formats, the evaluation component allows individuals to use results at the level they are capable of understanding, while offering opportunity to shift to more sophisticated criteria. Recognizing that many individuals lack Internet access, the forecast assessment webtool design also includes capabilities for customized report generation so extension agents or other trusted information intermediaries can provide material to decision makers at meetings or site visits.
Development of autonomous gamma dose logger for environmental monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jisha, N. V.; Krishnakumar, D. N.; Surya Prakash, G.
2012-03-15
Continuous monitoring and archiving of background radiation levels in and around the nuclear installation is essential and the data would be of immense use during analysis of any untoward incidents. A portable Geiger Muller detector based autonomous gamma dose logger (AGDL) for environmental monitoring is indigenously designed and developed. The system operations are controlled by microcontroller (AT89S52) and the main features of the system are software data acquisition, real time LCD display of radiation level, data archiving at removable compact flash card. The complete system operates on 12 V battery backed up by solar panel and hence the system is totally portable and ideal for field use. The system has been calibrated with Co-60 source (8.1 MBq) at various source-detector distances. The system is field tested and performance evaluation is carried out. This paper covers the design considerations of the hardware, software architecture of the system along with details of the front-end operation of the autonomous gamma dose logger and the data file formats. The data gathered during field testing and inter comparison with GammaTRACER are also presented in the paper. AGDL has shown excellent correlation with energy fluence monitor tuned to identify 41Ar, proving its utility for real-time plume tracking and source term estimation.
High-throughput monitoring of major cell functions by means of lensfree video microscopy
Kesavan, S. Vinjimore; Momey, F.; Cioni, O.; David-Watine, B.; Dubrulle, N.; Shorte, S.; Sulpice, E.; Freida, D.; Chalmond, B.; Dinten, J. M.; Gidrol, X.; Allier, C.
2014-01-01
Quantification of basic cell functions is a preliminary step to understanding complex cellular mechanisms, e.g., to test the compatibility of biomaterials, to assess the effectiveness of drugs and siRNAs, and to control cell behavior. However, commonly used quantification methods are label-dependent end-point assays. As an alternative, using our lensfree video microscopy platform to perform high-throughput real-time monitoring of cell culture, we introduce specifically devised metrics that are capable of non-invasive quantification of cell functions such as cell-substrate adhesion, cell spreading, cell division, cell division orientation and cell death. Unlike existing methods, our platform and associated metrics embrace an entire population of thousands of cells whilst monitoring the fate of every single cell within the population. This results in a high-content description of cell functions that typically contains 25,000 – 900,000 measurements per experiment depending on cell density and period of observation. As proof of concept, we monitored cell-substrate adhesion and spreading kinetics of human Mesenchymal Stem Cells (hMSCs) and primary human fibroblasts, we determined the cell division orientation of hMSCs, and we observed the effect of transfection of siCellDeath (siRNA known to induce cell death) on hMSCs and human Osteo Sarcoma (U2OS) Cells. PMID:25096726
Development of autonomous gamma dose logger for environmental monitoring
NASA Astrophysics Data System (ADS)
Jisha, N. V.; Krishnakumar, D. N.; Surya Prakash, G.; Kumari, Anju; Baskaran, R.; Venkatraman, B.
2012-03-01
Continuous monitoring and archiving of background radiation levels in and around the nuclear installation is essential and the data would be of immense use during analysis of any untoward incidents. A portable Geiger Muller detector based autonomous gamma dose logger (AGDL) for environmental monitoring is indigenously designed and developed. The system operations are controlled by microcontroller (AT89S52) and the main features of the system are software data acquisition, real time LCD display of radiation level, data archiving at removable compact flash card. The complete system operates on 12 V battery backed up by solar panel and hence the system is totally portable and ideal for field use. The system has been calibrated with Co-60 source (8.1 MBq) at various source-detector distances. The system is field tested and performance evaluation is carried out. This paper covers the design considerations of the hardware, software architecture of the system along with details of the front-end operation of the autonomous gamma dose logger and the data file formats. The data gathered during field testing and inter comparison with GammaTRACER are also presented in the paper. AGDL has shown excellent correlation with energy fluence monitor tuned to identify 41Ar, proving its utility for real-time plume tracking and source term estimation.
USDA-ARS?s Scientific Manuscript database
Airborne imagery has been successfully used for mapping cotton root rot within cotton fields toward the end of the growing season. To better understand the progression of cotton root rot within the season, time series monitoring is required. In this study, an improved spatial and temporal data fusio...
D-Shaped Polarization Maintaining Fiber Sensor for Strain and Temperature Monitoring.
Qazi, Hummad Habib; Mohammad, Abu Bakar; Ahmad, Harith; Zulkifli, Mohd Zamani
2016-09-15
A D-shaped polarization-maintaining fiber (PMF) used as a fiber optic sensor for the simultaneous monitoring of strain and the surrounding temperature is presented. A mechanical end and edge polishing system with aluminum oxide polishing film is utilized to perform sequential polishing on one side (lengthwise) of the PMF in order to fabricate a D-shaped cross-section. Experimental results show that the proposed sensor has high sensitivities of 46 pm/µε and 130 pm/°C for strain and temperature, respectively, which are significantly higher than those of other recently reported fiber optic sensors (mainly from 2013). The easy fabrication method, high sensitivity, and good linearity make this sensing device applicable in various applications such as health monitoring and spatial analysis of engineering structures.
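Taken at face value, the quoted sensitivities define a linear forward model for the sensor's spectral shift. The short Python sketch below evaluates that model; it is an illustration built only from the two sensitivity figures in the abstract, not the authors' calibration or discrimination procedure.

S_STRAIN_NM_PER_UE = 46e-3   # nm per µε (quoted strain sensitivity, 46 pm/µε)
S_TEMP_NM_PER_C = 130e-3     # nm per °C (quoted temperature sensitivity, 130 pm/°C)

def wavelength_shift_nm(strain_ue, delta_t_c):
    """Linear forward model: spectral shift expected for a given strain (µε)
    and temperature change (°C), using the sensitivities quoted above."""
    return S_STRAIN_NM_PER_UE * strain_ue + S_TEMP_NM_PER_C * delta_t_c

# e.g. 100 µε combined with a +5 °C change: 4.6 nm + 0.65 nm = 5.25 nm shift
print(wavelength_shift_nm(100.0, 5.0))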
D-Shaped Polarization Maintaining Fiber Sensor for Strain and Temperature Monitoring
Qazi, Hummad Habib; Mohammad, Abu Bakar; Ahmad, Harith; Zulkifli, Mohd Zamani
2016-01-01
A D-shaped polarization-maintaining fiber (PMF) used as a fiber optic sensor for the simultaneous monitoring of strain and the surrounding temperature is presented. A mechanical end and edge polishing system with aluminum oxide polishing film is utilized to perform sequential polishing on one side (lengthwise) of the PMF in order to fabricate a D-shaped cross-section. Experimental results show that the proposed sensor has high sensitivities of 46 pm/µε and 130 pm/°C for strain and temperature, respectively, which are significantly higher than those of other recently reported fiber optic sensors (mainly from 2013). The easy fabrication method, high sensitivity, and good linearity make this sensing device applicable in various applications such as health monitoring and spatial analysis of engineering structures. PMID:27649195
Misistia, Anthony; Kahali, Sudeepto; Sundaramurthy, Aravind; Chandra, Namas
2016-01-01
The end plate mounted at the mouth of the shock tube is a versatile and effective means to control and mitigate end effects. We have performed a series of measurements of incident shock wave velocities and overpressures, followed by quantification of impulse values (the integral of pressure in the time domain), for four different end plate configurations (0.625, 2, 4 inches, and an open end). Shock wave characteristics were monitored by high-response-rate pressure sensors located at six positions along the length of a 6-meter-long shock tube with a 229 mm square cross-section. Tests were performed at three shock wave intensities, which were controlled by varying the Mylar membrane thickness (0.02, 0.04 and 0.06 inch). The end reflector plate installed at the exit of the shock tube allows precise control over the intensity of reflected waves penetrating into the shock tube. At the optimized tube-to-end-plate gap, the secondary waves were entirely eliminated from the test section, which was confirmed by the pressure sensor at the T4 location. This is a notable finding for the implementation of a pure primary blast wave animal model. These data also suggest that only experimental conditions deep inside the shock tube allow exposure to a single shock wave free of artifacts. Our results provide detailed insight into the spatiotemporal dynamics of shock waves with a Friedlander waveform generated using helium as the driver gas and propagating in air inside a medium-sized tube. Diffusion of the driver gas (helium) inside the shock tube was responsible for the velocity increase of reflected shock waves. Numerical simulations combined with experimental data suggest the shock wave attenuation mechanism is simply the expansion of the internal pressure. In the absence of any other postulated shock wave decay mechanisms, which were not implemented in the model, the agreement between theory and experimental data is excellent. PMID:27603017
Kuriakose, Matthew; Skotak, Maciej; Misistia, Anthony; Kahali, Sudeepto; Sundaramurthy, Aravind; Chandra, Namas
2016-01-01
The end plate mounted at the mouth of the shock tube is a versatile and effective means to control and mitigate end effects. We have performed a series of measurements of incident shock wave velocities and overpressures, followed by quantification of impulse values (the integral of pressure in the time domain), for four different end plate configurations (0.625, 2, 4 inches, and an open end). Shock wave characteristics were monitored by high-response-rate pressure sensors located at six positions along the length of a 6-meter-long shock tube with a 229 mm square cross-section. Tests were performed at three shock wave intensities, which were controlled by varying the Mylar membrane thickness (0.02, 0.04 and 0.06 inch). The end reflector plate installed at the exit of the shock tube allows precise control over the intensity of reflected waves penetrating into the shock tube. At the optimized tube-to-end-plate gap, the secondary waves were entirely eliminated from the test section, which was confirmed by the pressure sensor at the T4 location. This is a notable finding for the implementation of a pure primary blast wave animal model. These data also suggest that only experimental conditions deep inside the shock tube allow exposure to a single shock wave free of artifacts. Our results provide detailed insight into the spatiotemporal dynamics of shock waves with a Friedlander waveform generated using helium as the driver gas and propagating in air inside a medium-sized tube. Diffusion of the driver gas (helium) inside the shock tube was responsible for the velocity increase of reflected shock waves. Numerical simulations combined with experimental data suggest the shock wave attenuation mechanism is simply the expansion of the internal pressure. In the absence of any other postulated shock wave decay mechanisms, which were not implemented in the model, the agreement between theory and experimental data is excellent.
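Impulse, as used above, is simply the time integral of the recorded overpressure trace. The Python sketch below computes it with the trapezoidal rule on an idealized Friedlander-like waveform; the peak pressure and duration are illustrative values, not the experimental data.

import numpy as np

def impulse(pressure_kpa, time_ms):
    """Blast impulse: trapezoidal-rule integral of overpressure over time (kPa·ms)."""
    p = np.asarray(pressure_kpa, dtype=float)
    t = np.asarray(time_ms, dtype=float)
    return float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(t)))

# Idealized Friedlander-like trace: 70 kPa peak, 5 ms positive phase.
t = np.linspace(0.0, 5.0, 500)
p = 70.0 * (1.0 - t / 5.0) * np.exp(-2.0 * t / 5.0)
print(impulse(p, t))  # positive-phase impulse in kPa·ms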
Acquisition and processing of data for isotope-ratio-monitoring mass spectrometry
NASA Technical Reports Server (NTRS)
Ricci, M. P.; Merritt, D. A.; Freeman, K. H.; Hayes, J. M.
1994-01-01
Methods are described for continuous monitoring of signals required for precise analyses of 13C, 18O, and 15N in gas streams containing varying quantities of CO2 and N2. The quantitative resolution (i.e. maximum performance in the absence of random errors) of these methods is adequate for determination of isotope ratios with an uncertainty of one part in 10^5; the precision actually obtained is often better than one part in 10^4. This report describes data-processing operations including definition of beginning and ending points of chromatographic peaks and quantitation of background levels, allowance for effects of chromatographic separation of isotopically substituted species, integration of signals related to specific masses, correction for effects of mass discrimination, recognition of drifts in mass spectrometer performance, and calculation of isotopic delta values. Characteristics of a system allowing off-line revision of parameters used in data reduction are described and an algorithm for identification of background levels in complex chromatograms is outlined. Effects of imperfect chromatographic resolution are demonstrated and discussed and an approach to deconvolution of signals from coeluting substances described.
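The isotopic delta values mentioned above express a sample's isotope ratio as a per-mil deviation from a reference standard: delta = (R_sample/R_standard - 1) x 1000. A minimal Python version is shown below; the reference ratio used in the example is an illustrative number, not an endorsed standard value.

def delta_permil(r_sample, r_standard):
    """Isotopic delta value (per mil): relative deviation of the sample's
    isotope ratio (e.g. 13C/12C) from a reference standard's ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Illustrative numbers only:
print(delta_permil(0.011157, 0.011180))  # about -2.06 per mil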
Evaluating changes of the Bárdarbunga caldera using repeating earthquakes
NASA Astrophysics Data System (ADS)
Jónsdóttir, K.; Hjorleifsdottir, V.; Hooper, A.; Rivalta, E.; Rodriguez Cardozo, F. R.; Gudmundsson, M. T.; Geirsson, H.; Barsotti, S.
2017-12-01
Natural hazard monitoring in Iceland relies heavily on seismic monitoring. With an automated system for detecting earthquakes, locating them and evaluating their focal mechanisms, about 500 earthquakes are recorded weekly, with magnitudes down to -0.5. During the Bárdarbunga volcanic unrest in 2014-2015 the seismicity intensified and up to thousands of earthquakes were recorded daily. The unrest was accompanied by a caldera collapse, a rare event that had not been monitored in such detail before, providing a unique opportunity for better understanding the volcanic structure and processes. The 8x11 km caldera gradually subsided, triggering thousands of events, including 80 earthquakes between M5 and M5.8. A subsidence bowl up to 65 m deep was formed, while about 1.8 km3 of magma drained laterally along a subterranean path, forming a flood basalt 47 km northeast of the volcano. The caldera collapse and magma outflow gradually declined until the eruption ended some 6 months later (27 February 2015). The seismicity continued to decline for a few months, both at the far end of the dyke and within the caldera. However, half a year later (in September 2015) seismicity within the caldera started to increase again and has been rather constant since, with tens of earthquakes recorded on the caldera rim every week and the biggest events reaching magnitude 4.4. Here we present a seismic waveform correlation analysis in which we look for similar repeating waveforms in the large caldera dataset. The analysis reveals a dramatic change occurring between February and May 2015. By allowing for anticorrelation, we find that the earthquake polarity reverses completely. The timing coincides with the end of the caldera collapse and the eruption. Our results suggest that caldera fault movements reversed soon after the eruption ended in spring 2015, when we also observe outward movement of GPS stations around the caldera, indicating re-inflation of the magma chamber half a year before any seismicity increase was detected. These data and their interpretation help improve our understanding of the current status of the volcano and, eventually, enable a more accurate and reliable hazard assessment.
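The waveform correlation analysis described above reduces, for each pair of events, to a normalized correlation coefficient: values near +1 mark repeating earthquakes, while values near -1 (the anticorrelation the authors allow for) mark the same waveform with reversed polarity. A zero-lag NumPy sketch is shown below; the real analysis would also scan over time lags, frequency bands, and station channels.

import numpy as np

def normalized_correlation(a, b):
    """Zero-lag normalized cross-correlation of two equal-length waveforms.
    Near +1: repeating event; near -1: same waveform with reversed polarity."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.dot(a, b))

# A polarity-reversed copy of a synthetic waveform correlates at -1:
w = np.sin(np.linspace(0.0, 20.0, 400)) * np.exp(-np.linspace(0.0, 4.0, 400))
print(normalized_correlation(w, -w))  # -1.0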
DORIS system and integrity survey
NASA Astrophysics Data System (ADS)
Jayles, C.; Chauveau, J. P.; Didelot, F.; Auriol, A.; Tourain, C.
2016-12-01
DORIS, like other space geodesy techniques (SLR, VLBI, GPS), has progressed regularly to meet the ever-increasing needs of the scientific community in oceanography, geodesy and geophysics. Over the past 10 years, particular emphasis has been placed on integrity monitoring of the system, which has contributed to the enhancement of the overall availability and quality of DORIS data products. A high level of monitoring is now provided by centralized control of the whole system, including the global network of beacons and the onboard instruments, which perform a constant end-to-end survey. At the first signs of any unusual behavior, a dedicated team is activated with well-established tools to investigate, anticipate and contain the impact of any potential failures. This procedure has increased the availability of DORIS beacons to 90%. The core topic of this article is to demonstrate that DORIS has implemented high-level integrity control of its data. Embedded in the DORIS receiver, DIODE (DORIS Immediate Orbit Determination) is a real-time on-board orbit determination software package. Its accuracy has also been dramatically improved when compared to Precise Orbit Ephemeris (P.O.E.), down to 2.7 cm RMS on Jason-2, 3.0 cm on Saral and 3.3 cm on CryoSat-2. Specific quality indices were derived from the DIODE-based Kalman filters and are used to monitor network and system performance. This paper covers the definition of these indices and how they improve the system's reliability and reactiveness to incidents or anomalies. From these indices, we have provided detailed diagnostic information about the DORIS system, which is available in real time on board each DORIS satellite. Using these capabilities, we have developed real-time functions that give an immediate diagnosis of the status of key components in the DORIS system. The near-real-time navigation system was improved and can distinguish and handle both satellite events and beacon anomalies. The next missions to use DORIS will be Jason-3 and Sentinel-3, and then Jason-CS and SWOT (Surface Water and Ocean Topography). The real-time information on satellite positions should be better than 2.5 cm RMS on the radial component. Science products will benefit from this improvement in DORIS's performance and data integrity.
Khalil, Mohammed K; Kirkley, Debbie L; Kibble, Jonathan D
2013-01-01
This article describes the development of an interactive computer-based laboratory manual, created to facilitate the teaching and learning of medical histology. The overarching goal of developing the manual is to facilitate self-directed group interactivities that actively engage students during laboratory sessions. The design of the manual includes guided instruction for students to navigate virtual slides, exercises for students to monitor learning, and cases to provide clinical relevance. At the end of the laboratory activities, student groups can generate a laboratory report that may be used to provide formative feedback. The instructional value of the manual was evaluated by a questionnaire containing both closed-ended and open-ended items. Closed-ended items using a five-point Likert scale assessed the format and navigation, instructional content, group process, and learning process. Open-ended items assessed students' perceptions of the effectiveness of the manual in facilitating their learning. After implementation for two consecutive years, student evaluation of the manual was highly positive and indicated that it facilitated their learning by reinforcing and clarifying classroom sessions, improved their understanding, facilitated active and cooperative learning, and supported self-monitoring of their learning. Copyright © 2013 American Association of Anatomists.
AsyncStageOut: Distributed user data management for CMS Analysis
NASA Astrophysics Data System (ADS)
Riahi, H.; Wildish, T.; Ciangottini, D.; Hernández, J. M.; Andreeva, J.; Balcas, J.; Karavakis, E.; Mascheroni, M.; Tanasijczuk, A. J.; Vaandering, E. W.
2015-12-01
AsyncStageOut (ASO) is a new component of the distributed data analysis system of CMS, CRAB, designed for managing users' data. It addresses a major weakness of the previous model, namely that mass storage of output data was part of the job execution resulting in inefficient use of job slots and an unacceptable failure rate at the end of the jobs. ASO foresees the management of up to 400k files per day of various sizes, spread worldwide across more than 60 sites. It must handle up to 1000 individual users per month, and work with minimal delay. This creates challenging requirements for system scalability, performance and monitoring. ASO uses FTS to schedule and execute the transfers between the storage elements of the source and destination sites. It has evolved from a limited prototype to a highly adaptable service, which manages and monitors the user file placement and bookkeeping. To ensure system scalability and data monitoring, it employs new technologies such as a NoSQL database and re-uses existing components of PhEDEx and the FTS Dashboard. We present the asynchronous stage-out strategy and the architecture of the solution we implemented to deal with those issues and challenges. The deployment model for the high availability and scalability of the service is discussed. The performance of the system during the commissioning and the first phase of production are also shown, along with results from simulations designed to explore the limits of scalability.
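The abstract states the key architectural idea, removing mass storage of output data from job execution, without implementation detail. The sketch below illustrates only that pattern in miniature: completed-file records are queued and a separate worker submits them in batches, the role FTS plays for ASO. All names and structures are hypothetical, not the actual ASO/CRAB code.

    import queue
    import threading

    # Hypothetical queue of output-file records from completed jobs.
    completed_files = queue.Queue()

    def stage_out_worker(batch_size=3, poll_seconds=0.1):
        """Drain the queue and submit transfers in batches (ASO delegates the
        real submission to FTS; here it is just a print call)."""
        batch = []
        while True:
            try:
                item = completed_files.get(timeout=poll_seconds)
            except queue.Empty:
                item = None
            if item == "STOP":
                if batch:
                    print("submitting final transfer batch:", batch)
                break
            if item is not None:
                batch.append(item)
            if batch and (item is None or len(batch) >= batch_size):
                print("submitting transfer batch:", batch)
                batch = []

    worker = threading.Thread(target=stage_out_worker)
    worker.start()
    for i in range(7):   # jobs finish quickly and only enqueue their outputs
        completed_files.put({"source": f"site_A:/store/user/file_{i}.root",
                             "destination": "site_B"})
    completed_files.put("STOP")
    worker.join()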
AsyncStageOut: Distributed User Data Management for CMS Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riahi, H.; Wildish, T.; Ciangottini, D.
2015-12-23
AsyncStageOut (ASO) is a new component of the distributed data analysis system of CMS, CRAB, designed for managing users' data. It addresses a major weakness of the previous model, namely that mass storage of output data was part of the job execution resulting in inefficient use of job slots and an unacceptable failure rate at the end of the jobs. ASO foresees the management of up to 400k files per day of various sizes, spread worldwide across more than 60 sites. It must handle up to 1000 individual users per month, and work with minimal delay. This creates challenging requirements for system scalability, performance and monitoring. ASO uses FTS to schedule and execute the transfers between the storage elements of the source and destination sites. It has evolved from a limited prototype to a highly adaptable service, which manages and monitors the user file placement and bookkeeping. To ensure system scalability and data monitoring, it employs new technologies such as a NoSQL database and re-uses existing components of PhEDEx and the FTS Dashboard. We present the asynchronous stage-out strategy and the architecture of the solution we implemented to deal with those issues and challenges. The deployment model for the high availability and scalability of the service is discussed. The performance of the system during the commissioning and the first phase of production are also shown, along with results from simulations designed to explore the limits of scalability.
Brown, Andrew S; Brown, Richard J C; Coleman, Peter J; Conolly, Christopher; Sweetman, Andrew J; Jones, Kevin C; Butterfield, David M; Sarantaridis, Dimitris; Donovan, Brian J; Roberts, Ian
2013-06-01
The impact of human activities on the health of the population and of the wider environment has prompted action to monitor the presence of toxic compounds in the atmosphere. Toxic organic micropollutants (TOMPs) are some of the most insidious and persistent of these pollutants. Since 1991 the United Kingdom has operated nationwide air quality networks to assess the presence of TOMPs, including polycyclic aromatic hydrocarbons (PAHs), in ambient air. The data produced in 2010 marked 20 years of nationwide PAH monitoring. This paper marks this milestone by providing a novel and critical review of the data produced since nationwide monitoring began up to the end of 2011 (the latest year for which published data are available), discussing how the networks performing this monitoring have evolved, and elucidating trends in the concentrations of the PAHs measured. The current challenges in the area and a forward look to the future of air quality monitoring for PAHs are also discussed briefly.
Management of mechanical ventilation during laparoscopic surgery.
Valenza, Franco; Chevallard, Giorgio; Fossali, Tommaso; Salice, Valentina; Pizzocri, Marta; Gattinoni, Luciano
2010-06-01
Laparoscopy is widely used in the surgical treatment of a number of diseases. Its advantages are generally believed to lie in its minimal invasiveness, better cosmetic outcome and shorter length of hospital stay, given appropriate surgical expertise and state-of-the-art equipment. The thousands of laparoscopic surgical procedures performed safely show that mechanical ventilation during anaesthesia for laparoscopy is well tolerated by the vast majority of patients. However, the effects of pneumoperitoneum are particularly relevant to patients with underlying lung disease as well as to the increasing number of patients with higher-than-normal body mass index. Moreover, many surgical procedures are significantly longer in duration when performed with laparoscopic techniques. Taken together, these factors call for special care in the management of mechanical ventilation during laparoscopic surgery. The purpose of the review is to summarise the consequences of pneumoperitoneum for the standard monitoring of mechanical ventilation during anaesthesia and to discuss the rationale for using a protective ventilation strategy during laparoscopic surgery. The consequences of the chest wall derangement occurring during pneumoperitoneum for airway pressure and central venous pressure, together with the role of end-tidal CO2 monitoring, are emphasised. Ventilatory and non-ventilatory strategies to protect the lung are discussed.
Safety of air travel following acute myocardial infarction.
Roby, Howard; Lee, Anna; Hopkins, Andrew
2002-02-01
A randomized, single-blind, controlled trial was carried out to: 1) examine the safety of patients flying on commercial airlines 2 wk after a myocardial infarction; 2) determine whether or not the use of supplemental oxygen was associated with a reduced risk of in-flight adverse events; and 3) determine the need for a medical escort. There were 38 patients who were prospectively and randomly assigned supplemental continuous oxygen therapy (2 L x min(-1) via nasal prongs; n = 19) or no oxygen (n = 19) during the flight. Prior to flying, an escorting doctor completed a medical questionnaire for each patient. Both groups underwent Holter monitoring throughout the flight. The major end-point was the development of inflight myocardial ischemia, as detected by Holter monitoring. Minor end-points included patients complaining of chest pain or dyspnea; the detection of bigeminy or trigeminy by Holter monitoring; or oxygen desaturation to less than 90%, as measured by pulse oximetry. Of the 38 patients enrolled, there was only 1 major end-point. This patient had a brief, self-limiting, asymptomatic episode of myocardial ischemia diagnosed by Holter monitoring. Minor end-points occurred in 13 (34%) patients. One patient had asymptomatic evidence of S-T depression on a transport monitor, but not on the Holter. Five patients had transient low (<90%) oxygen saturations, two complained of chest pain, and five had complex ventricular ectopic beats or periods of transient ventricular tachycardia. None of the minor end-points were associated with Holter evidence of myocardial ischemia. Of the 30 patients with completed questionnaires and Holter results, there was no difference in the incidence of minor end-points between the oxygen (5/13) and no oxygen groups (6/15) (p = 0.93). Intervention by the medical escort consisted of commencing oxygen therapy on those patients with low oxygen saturations and those with chest pain. Use of an already dispensed glyceryl trinitrate spray was initiated in one patient with chest pain that turned out to be non-ischemic when the Holter traces were later analyzed. This study suggests that, provided that care is taken during the immediate preflight and postflight phases not to overexert the patients, neither supplemental oxygen nor medical escorts are needed in the transportation of patients who fly 2 wk after acute myocardial infarction.
Pasion, Editha; Good, Levell; Tizon, Jisebelle; Krieger, Staci; O'Kier, Catherine; Taylor, Nicole; Johnson, Jennifer; Horton, Carrie M; Peterson, Mary
2010-11-01
To determine if the monitor cursor-line feature on bedside monitors is accurate for measuring central venous and pulmonary artery pressures in cardiac surgery patients. Central venous and pulmonary artery pressures were measured via 3 methods (end-expiratory graphic recording, monitor cursor-line display, and monitor digital display) in a convenience sample of postoperative cardiac surgery patients. Pressures were measured twice during both mechanical ventilation and spontaneous breathing. Analysis of variance was used to determine differences between measurement methods and the percentage of monitor pressures that differed by 4 mm Hg or more from the measurement obtained from the graphic recording. Significance level was set at P less than .05. Twenty-five patients were studied during mechanical ventilation (50 measurements) and 21 patients during spontaneous breathing (42 measurements). Measurements obtained via the 3 methods did not differ significantly for either type of pressure (P > .05). Graphically recorded pressures and measurements obtained via the monitor cursor-line or digital display methods differed by 4 mm Hg or more in 4% and 6% of measurements, respectively, during mechanical ventilation and 4% and 11%, respectively, during spontaneous breathing. The monitor cursor-line method for measuring central venous and pulmonary artery pressures may be a reasonable alternative to the end-expiratory graphic recording method in hemodynamically stable, postoperative cardiac surgery patients. Use of the digital display on the bedside monitor may result in larger discrepancies from the graphically recorded pressures than when the cursor-line method is used, particularly in spontaneously breathing patients.
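As a small illustration of the discrepancy measure reported (the percentage of monitor readings that differ by 4 mm Hg or more from the end-expiratory graphic recording), the sketch below uses hypothetical paired pressures; it does not reproduce the study's data or its analysis of variance.

    import numpy as np

    # Hypothetical paired central venous pressures (mm Hg): end-expiratory
    # graphic recording (reference) versus monitor cursor-line readings.
    graphic = np.array([8, 12, 10, 15, 9, 11])
    cursor  = np.array([9, 12, 14, 15, 8, 11])

    discrepant = np.abs(cursor - graphic) >= 4
    print(f"{100 * discrepant.mean():.0f}% of readings differ by >= 4 mm Hg")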
Development of GUI Type On-Line Condition Monitoring Program for a Turboprop Engine Using Labview
NASA Astrophysics Data System (ADS)
Kong, Changduk; Kim, Keonwoo
2011-12-01
Recently, aero gas turbine health monitoring systems have been developed to enable precautionary and maintenance actions against faults or performance degradation of advanced propulsion systems operating in severe environments such as high altitude, foreign object damage, and hot, heavy-rain or snowy atmospheric conditions. To establish such a health monitoring system, an on-line condition monitoring program is first required; the program must monitor the engine performance trend by comparing measured engine performance data with baseline performance results calculated by a base engine performance model. This work aims to develop a GUI-type on-line condition monitoring program, using LabVIEW, for the PT6A-67 turboprop engine of a high-altitude, long-endurance UAV. The base engine performance used by the on-line condition monitoring program is simulated using component maps inversely generated from the limited performance deck data provided by the engine manufacturer. The base engine performance simulation program was validated, as its analysis results agree well with the performance deck data. The proposed on-line condition monitoring program can monitor real engine performance as well as its trend through precise comparison between the clean-engine performance results calculated by the base performance simulation program and the measured engine performance signals. In the development phase of this monitoring system, a signal generation module is proposed to evaluate the proposed on-line monitoring system. For user-friendliness, all monitoring programs are coded in LabVIEW, and monitoring examples are demonstrated using the proposed GUI-type on-line condition monitoring program.
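The core monitoring logic described, comparing measured engine performance against clean-engine baseline predictions and watching the trend, can be sketched in a few lines. The real program is written in LabVIEW; the Python below, with hypothetical exhaust gas temperature (EGT) values and an assumed alarm threshold, only illustrates the residual-based comparison.

    import numpy as np

    def performance_delta(measured, baseline):
        """Percent deviation of a measured engine parameter from the
        clean-engine baseline prediction."""
        return 100.0 * (measured - baseline) / baseline

    # Hypothetical EGT samples (deg C) and baseline model predictions.
    measured_egt = np.array([652.0, 655.0, 661.0, 668.0])
    baseline_egt = np.array([650.0, 651.0, 650.0, 652.0])

    delta = performance_delta(measured_egt, baseline_egt)
    if np.any(delta > 2.0):   # assumed alarm threshold of +2%
        print("EGT trending above baseline:", np.round(delta, 2), "%")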
NASA Astrophysics Data System (ADS)
Flanigan, Katherine A.; Johnson, Nephi R.; Hou, Rui; Ettouney, Mohammed; Lynch, Jerome P.
2017-04-01
The ability to quantitatively assess the condition of railroad bridges facilitates objective evaluation of their robustness in the face of hazard events. Of particular importance is the need to assess the condition of railroad bridges in networks that are exposed to multiple hazards. Data collected from structural health monitoring (SHM) can be used to better maintain a structure by prompting preventative (rather than reactive) maintenance strategies and supplying quantitative information to aid in recovery. To that end, a wireless monitoring system is validated and installed on the Harahan Bridge which is a hundred-year-old long-span railroad truss bridge that crosses the Mississippi River near Memphis, TN. This bridge is exposed to multiple hazards including scour, vehicle/barge impact, seismic activity, and aging. The instrumented sensing system targets non-redundant structural components and areas of the truss and floor system that bridge managers are most concerned about based on previous inspections and structural analysis. This paper details the monitoring system and the analytical method for the assessment of bridge condition based on automated data-driven analyses. Two primary objectives of monitoring the system performance are discussed: 1) monitoring fatigue accumulation in critical tensile truss elements; and 2) monitoring the reliability index values associated with sub-system limit states of these members. Moreover, since the reliability index is a scalar indicator of the safety of components, quantifiable condition assessment can be used as an objective metric so that bridge owners can make informed damage mitigation strategies and optimize resource management on single bridge or network levels.
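The abstract does not define the reliability index it tracks; in the usual first-order formulation with a linear limit state and normally distributed resistance R and load effect S it is

    \[
      \beta = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^{2} + \sigma_S^{2}}},
      \qquad P_f \approx \Phi(-\beta),
    \]

where \mu and \sigma denote means and standard deviations and \Phi is the standard normal CDF; a falling \beta (rising P_f) for a monitored member would signal an eroding safety margin.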
Advancing Stage 2 Research on Measures for Monitoring Kindergarten Reading Progress.
Clemens, Nathan H; Soohoo, Michelle M; Wiley, Colby P; Hsiao, Yu-Yu; Estrella, Ivonne; Allee-Smith, Paula J; Yoon, Myeongsun
Although several measures exist for frequently monitoring early reading progress, little research has specifically investigated their technical properties when administered on a frequent basis with kindergarten students. In this study, kindergarten students (N = 137), the majority of whom were receiving supplemental intervention for reading skills, were monitored using Letter Sound Fluency, Phoneme Segmentation Fluency, Word Reading Fluency, Nonsense Word Fluency, Highly Decodable Passages, and Spelling on a biweekly basis between February and May. Acceptable reliability was observed for all measures. Analyses of slope validity using latent growth models, latent change score models, and slope differences according to level of year-end achievement indicated that the relation of slope to overall reading skills varied across the measures. A suggested approach to monitoring kindergarten students' reading progress is offered that includes Letter Sound Fluency and a measure of word-reading skills to provide a comprehensive picture of student growth toward important year-end reading outcomes.
Ferreira, Sandro S.; Krinski, Kleverton; Alves, Ragami C.; Benites, Mariana L.; Redkva, Paulo E.; Elsangedy, Hassan M.; Buzzachera, Cosme F.; Souza-Junior, Tácito P.; da Silva, Sergio G.
2014-01-01
The rating of perceived exertion (RPE) is the ability to detect and interpret organic sensations while performing exercises. This method has been used to measure the level of effort that is felt during weight-training at a given intensity. The purpose of this investigation was to compare session RPE values with those of traditional RPE measurements for different weight-training muscle actions, performed together or separately. Fourteen women with no former weight-training experience were recruited for the investigation. All participants completed five sessions of exercise: familiarization, maximum force, concentric-only (CONC-only), eccentric-only (ECC-only), and dynamic (DYN = CONC + ECC). Traditional RPE was measured after each series of exercises, and the session RPE was measured 30 min after the end of the training session. The statistical analyses used were the paired t-test, one-way analysis of variance, and repeated measures analysis of variance. No significant differences between traditional RPE and session RPE were found for DYN, CONC, and ECC exercises. This investigation demonstrated that session RPE is similar to traditional RPE in terms of weight-training involving concentric, eccentric, or dynamic muscle exercises, and that it can be used to prescribe and monitor weight-training sessions in older subjects. PMID:24834354
Sentinel-3 Mission Performance Center: paving the way of high-quality controlled data
NASA Astrophysics Data System (ADS)
Bruniquel, Jerome; Féménias, Pierre; Goryl, Philippe; Bonekamp, Hans
2015-04-01
As part of the Sentinel-3 mission and in order to ensure the highest quality of products, ESA and EUMETSAT set up the Sentinel-3 Mission Performance Centre (S-3 MPC). This facility is part of the Payload Data Ground Segment (PDGS) and aims at controlling the quality of all generated products, from L0 to L2. The S-3 MPC is composed of a Coordinating Centre (CC), where the core infrastructure is hosted, which is in charge of the main routine activities (especially the quality control of data) and the overall service management. Expert Support Laboratories (ESLs) are involved in calibration and validation activities and provide specific assessment of the products (e.g., analysis of trends, ad hoc analysis of anomalies, etc.). The S-3 MPC interacts with the Processing Archiving Centers (PACs) and the marine centre at EUMETSAT. The S-3 MPC service contract is currently carried out by a 23-partner consortium led by ACRI-ST, France. The S-3 MPC contract was kicked off in September 2014 with a first 12-month set-up phase. After the launch of S3-A (planned before end of 2015), the S-3 MPC will start its second phase to support commissioning activities. Then a routine operation phase of up to 5 years will begin, including the commissioning activities related to S3-B. The main S-3 MPC activities are: - Calibration: to update on-board and on-ground configuration data in order to meet product quality requirements. - Validation: to assess, by independent means with respect to the methods and tools used for calibration, the quality of the generated data products. Validation functions provide feedback to calibration and to the corrective and perfective maintenance activities of the data processors. - Verification: to confirm that the specified requirements on a system have been satisfied. - Quality Control: to routinely monitor the status of the sensor and to check whether the derived products (Level 0, Level 1 and Level 2) meet the quality requirements over the mission lifetime. - Algorithm Maintenance and Evolution: to maintain the algorithm documentation baseline and to perform the necessary corrections/evolutions as agreed with the mission management and to validate them. - System performance monitoring: to monitor the end-to-end performance of the relevant Sentinel-3 system operations and assess them with respect to the operations plan. Due to the high volume of data and in order to facilitate the analysis to be performed by the expert scientists, an innovative facility is being implemented as part of the MPC/CC. We propose that all ESLs use a collaborative platform, a secure IT environment mixing hardware and software elements that enables users to work remotely. The main benefit is that they do not need to download huge amounts of data, since they perform their processing and analysis where the products are located. First tests of the platform were successfully completed last December. Note: The work performed in the frame of this contract is carried out with funding by the European Union. The views expressed herein can in no way be taken to reflect the official opinion of either the European Union or the European Space Agency.
Ayers, John W; Ribisl, Kurt M; Brownstein, John S
2011-04-01
Public interest in electronic nicotine delivery systems (ENDS) is undocumented. By monitoring search queries, ENDS popularity and correlates of their popularity were assessed in Australia, Canada, the United Kingdom (UK), and the U.S. English-language Google searches for ENDS conducted from January 2008 through September 2010 were compared to searches for snus, nicotine replacement therapy (NRT), and Chantix® or Champix®. Searches for each week were scaled to the highest weekly search proportion (100), with lower values indicating the relative search proportion compared to the highest-proportion week (e.g., 50=50% of the highest observed proportion). Analyses were performed in 2010. From July 2008 through February 2010, ENDS searches increased in all nations studied except Australia, where an increase occurred more recently. By September 2010, ENDS searches were several-hundred-fold greater than searches for smoking alternatives in the UK and U.S., and were rivaling alternatives in Australia and Canada. Across nations, ENDS searches were highest in the U.S., followed by similar search intensity in Canada and the UK, with Australia having the fewest ENDS searches. Stronger tobacco control, reflected in clean indoor air laws, cigarette taxes, and anti-smoking populations, was associated with consistently higher levels of ENDS searches. The online popularity of ENDS has surpassed that of snus and NRTs, which have been on the market for far longer, and is quickly outpacing Chantix or Champix. In part, the association between ENDS's popularity and stronger tobacco control suggests ENDS are used to bypass, or quit in response to, smoking restrictions. Search query surveillance is a valuable, real-time, free, and public method to evaluate the diffusion of new health products. This method may be generalized to other behavioral, biological, informational, or psychological outcomes manifested on search engines. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
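The scaling described, each week's query share expressed relative to the highest weekly search proportion (set to 100), can be reproduced in a few lines; the weekly fractions below are hypothetical, not the study's data.

    import numpy as np

    # Hypothetical weekly query fractions (ENDS queries / all queries that week).
    weekly_fraction = np.array([0.8e-6, 1.6e-6, 2.4e-6, 4.0e-6, 3.2e-6])

    # Scale so the highest week equals 100; a value of 50 means 50% of the
    # highest observed weekly search proportion.
    relative_search_volume = 100.0 * weekly_fraction / weekly_fraction.max()
    print(np.round(relative_search_volume, 1))   # [ 20.  40.  60. 100.  80.]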
Nishimura, Akiko; Harashima, Shin-ichi; Honda, Ikumi; Shimizu, Yoshiyuki; Harada, Norio; Nagashima, Kazuaki; Hamasaki, Akihiro; Hosoda, Kiminori; Inagaki, Nobuya
2014-07-01
Color affects emotions, feelings, and behaviors. We hypothesized that color used in self-monitoring of blood glucose (SMBG) is helpful for patients to recognize and act on their glucose levels to improve glycemic control. Here, two color-indication methods, color record (CR) and color display (CD), were independently compared for their effects on glycemic control in less frequently insulin-treated type 2 diabetes. One hundred twenty outpatients were randomly allocated to four groups with 2×2 factorial design: CR or non-CR and CD or non-CD. Blood glucose levels were recorded in red or blue pencil in the CR arm, and a red or blue indicator light on the SMBG meter was lit in the CD arm, under hyperglycemia or hypoglycemia, respectively. The primary end point was difference in glycated hemoglobin (HbA1c) reduction in 24 weeks. Secondary end points were self-management performance change and psychological state change. HbA1c levels at 24 weeks were significantly decreased in the CR arm by -0.28% but were increased by 0.03% in the non-CR arm (P=0.044). In addition, diet and exercise scores were significantly improved in the CR arm compared with the non-CR arm. The exercise score showed significant improvement in the CD arm compared with the non-CD arm but without a significant difference in HbA1c reduction. Changes in psychological states were not altered between the arms. CR has a favorable effect on self-management performance without any influence on psychological stress, resulting in improved glycemic control in type 2 diabetes patients using less frequent insulin injection. Thus, active but not passive usage of color-indication methods by patients is important in successful SMBG.
A multi-GPU real-time dose simulation software framework for lung radiotherapy.
Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A
2012-09-01
Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysiological condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and are performed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the dose delivered is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70 and D90 as well as gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with a CPU-based commercial software package, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.
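The task-specific, pipelined allocation described (model deformation on one device, dose calculation on another, connected by inter-process communication) can be sketched at a high level. The two processes below stand in for the two GPUs and the "work" is a placeholder string; this illustrates the pipeline pattern only, not the paper's GPU implementation.

    from multiprocessing import Process, Queue

    def deformation_stage(frames, to_dose):
        """Stage 1 (conceptually GPU 1): deform the lung model per frame."""
        for frame in frames:
            to_dose.put(f"deformed_model_{frame}")   # placeholder for real work
        to_dose.put(None)                            # end-of-stream marker

    def dose_stage(to_dose):
        """Stage 2 (conceptually GPU 2): compute dose on each deformed model."""
        while True:
            model = to_dose.get()
            if model is None:
                break
            print(f"dose_map_for_{model}")

    if __name__ == "__main__":
        q = Queue()
        p1 = Process(target=deformation_stage, args=(range(3), q))
        p2 = Process(target=dose_stage, args=(q,))
        p1.start(); p2.start()
        p1.join(); p2.join()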
TOWARDS A NOVEL MODULAR ARCHITECTURE FOR CERN RADIATION MONITORING.
Boukabache, Hamza; Pangallo, Michel; Ducos, Gael; Cardines, Nicola; Bellotta, Antonio; Toner, Ciarán; Perrin, Daniel; Forkel-Wirth, Doris
2017-04-01
The European Organization for Nuclear Research (CERN) has the legal obligation to protect the public and the people working on its premises from any unjustified exposure to ionising radiation. In this context, radiation monitoring is one of the main concerns of the Radiation Protection Group. After 30 y of reliable service, the ARea CONtroller (ARCON) system is approaching the end of its lifecycle, which raises the need for new, more efficient radiation monitors with a high level of modularity to ensure better maintainability. Based on these two main principles, new detectors are currently being developed that will be capable of measuring very low dose rates down to 50 nSv h-1, whilst being able to measure radiation over an extensive range of 8 decades without any auto scaling. To reach this performance, CERN Radiation MOnitoring Electronics (CROME), the new generation of CERN radiation monitors, is based on a versatile architecture that includes new read-out electronics developed by the Instrumentation and Logistics section of the CERN Radiation Protection Group as well as a reconfigurable system on chip capable of performing complex processing calculations. Besides continuously measuring the ambient dose rate, CROME generates radiation alarms, provides interlock signals, drives alarm display units through a fieldbus and provides long-term, permanent and reliable data logging. The measurement tests performed during the first phase of the development show very promising results that pave the way to the second phase: the certification. © The Author 2016. Published by Oxford University Press.
TOWARDS A NOVEL MODULAR ARCHITECTURE FOR CERN RADIATION MONITORING
Boukabache, Hamza; Pangallo, Michel; Ducos, Gael; Cardines, Nicola; Bellotta, Antonio; Toner, Ciarán; Perrin, Daniel; Forkel-Wirth, Doris
2017-01-01
The European Organization for Nuclear Research (CERN) has the legal obligation to protect the public and the people working on its premises from any unjustified exposure to ionising radiation. In this context, radiation monitoring is one of the main concerns of the Radiation Protection Group. After 30 y of reliable service, the ARea CONtroller (ARCON) system is approaching the end of its lifecycle, which raises the need for new, more efficient radiation monitors with a high level of modularity to ensure better maintainability. Based on these two main principles, new detectors are currently being developed that will be capable of measuring very low dose rates down to 50 nSv h−1, whilst being able to measure radiation over an extensive range of 8 decades without any auto scaling. To reach this performance, CERN Radiation MOnitoring Electronics (CROME), the new generation of CERN radiation monitors, is based on a versatile architecture that includes new read-out electronics developed by the Instrumentation and Logistics section of the CERN Radiation Protection Group as well as a reconfigurable system on chip capable of performing complex processing calculations. Besides continuously measuring the ambient dose rate, CROME generates radiation alarms, provides interlock signals, drives alarm display units through a fieldbus and provides long-term, permanent and reliable data logging. The measurement tests performed during the first phase of the development show very promising results that pave the way to the second phase: the certification. PMID:27909154
NASA Astrophysics Data System (ADS)
Abbou, S.; Dillet, J.; Maranzana, G.; Didierjean, S.; Lottin, O.
2017-02-01
Operating a PEMFC with a dead-ended anode may lead to local fuel starvation because of water and possibly nitrogen accumulation in the anode compartment. In previous work, we used a segmented linear cell with reference electrodes to simultaneously monitor the local potentials and current densities during dead-ended anode operation. The results indicated that water transport as well as nitrogen crossover through the membrane were most probably the two key factors governing fuel starvation. In this first of a set of two papers, we evaluate in more detail the contributions of nitrogen crossover and water transport to hydrogen starvation. To assess the nitrogen contribution, the fuel cell cathode compartment was first supplied with pure oxygen instead of air. The results showed that in the absence of nitrogen (on the cathode side) fuel starvation was much slower than with air, suggesting that the nitrogen contribution cannot be neglected. The contribution of water flooding to hydrogen starvation was then investigated by using different cooling temperatures on the cathode and anode sides in order to drive water toward the colder plate. With a colder anode side, fuel starvation was faster. In the opposite case of a hotter anode plate, water accumulation in the anode compartment was limited, and nitrogen crossover through the membrane became the main cause of hydrogen starvation. To fully assess the impact of the thermal configurations on membrane-electrode assembly (MEA) degradation, aging protocols with a dead-ended anode and a fixed closing time were also performed. The results showed that operation with a hotter anode could significantly limit cathode electrochemical surface area (ECSA) losses along the cell area and the performance degradation induced by hydrogen starvation.
Oziel, Moshe; Korenstein, Rafi; Rubinsky, Boris
2017-01-01
This theoretical study examines the use of radar to continuously monitor "accumulation of blood in the head" (ACBH) non-invasively and from a distance, after the location of a hematoma or hemorrhage in the brain was initially identified with conventional medical imaging. Current clinical practice is to monitor ACBH with multiple, subsequent, conventional imaging studies. The radar technology introduced in this study could provide a lower-cost and safe alternative to repeated conventional medical imaging for ACBH monitoring. The goal of this study is to evaluate the feasibility of using radar to monitor changes in blood volume in the brain through a numerical simulation of remote ACBH monitoring, with a directional spiral slot antenna, in 3-D models of the brain. The focus of this study is on evaluating the effect of frequency on the antenna reading. To that end we performed the calculations for frequencies of 100 MHz, 500 MHz and 1 GHz. The analysis shows that ACBH can be monitored with radar and that the monitoring resolution improves with an increase in frequency, in the range studied. However, it also appears that when typical clinical dimensions of hematoma and hemorrhage are used, the variable ratio of blood volume radius to radar wavelength can bring the measurements into the Mie and Rayleigh regions of the radar cross section. In these regions there is an oscillatory change in signal with blood volume size. For some frequencies there is an increase in signal with an increase in volume while for others there is a decrease. While radar can be used to monitor ACBH non-invasively and from a distance, the observed Mie-region-dependent oscillatory relation between blood volume size and wavelength requires further investigation. Classifiers, multifrequency algorithms or ultra-wide band radar are possible solutions that should be explored in the future.
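The Mie/Rayleigh behaviour referred to can be summarised with the standard size parameter. Idealising the blood volume as a sphere of radius r illuminated at wavelength \lambda,

    \[
      x = \frac{2\pi r}{\lambda}, \qquad
      \sigma_{\mathrm{RCS}} \propto \frac{r^{6}}{\lambda^{4}} \quad (x \ll 1),
    \]

with the Mie (resonance) region around x \approx 1, where the radar cross section oscillates with x; this is consistent with the reported oscillatory change in signal with blood volume at some frequencies.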
Korenstein, Rafi; Rubinsky, Boris
2017-01-01
Background This theoretical study examines the use of radar to continuously monitor "accumulation of blood in the head" (ACBH) non-invasively and from a distance, after the location of a hematoma or hemorrhage in the brain was initially identified with conventional medical imaging. Current clinical practice is to monitor ACBH with multiple, subsequent, conventional imaging studies. The radar technology introduced in this study could provide a lower-cost and safe alternative to repeated conventional medical imaging for ACBH monitoring. Materials and methods The goal of this study is to evaluate the feasibility of using radar to monitor changes in blood volume in the brain through a numerical simulation of remote ACBH monitoring, with a directional spiral slot antenna, in 3-D models of the brain. The focus of this study is on evaluating the effect of frequency on the antenna reading. To that end we performed the calculations for frequencies of 100 MHz, 500 MHz and 1 GHz. Results and discussion The analysis shows that ACBH can be monitored with radar and that the monitoring resolution improves with an increase in frequency, in the range studied. However, it also appears that when typical clinical dimensions of hematoma and hemorrhage are used, the variable ratio of blood volume radius to radar wavelength can bring the measurements into the Mie and Rayleigh regions of the radar cross section. In these regions there is an oscillatory change in signal with blood volume size. For some frequencies there is an increase in signal with an increase in volume while for others there is a decrease. Conclusions While radar can be used to monitor ACBH non-invasively and from a distance, the observed Mie-region-dependent oscillatory relation between blood volume size and wavelength requires further investigation. Classifiers, multifrequency algorithms or ultra-wide band radar are possible solutions that should be explored in the future. PMID:29023544
2018-01-01
Water-borne bacteria, found in cold water storage tanks, are causative agents for various human infections and diseases including Legionnaires' disease. Consequently, regular microbiological monitoring of tank water is undertaken as part of the regulatory framework used to control pathogenic bacteria. A key assumption is that a small volume of water taken from under the ball valve (where there is easy access to the stored water) will be representative of the entire tank. To test the reliability of this measure, domestic water samples taken from different locations of selected tanks in London properties between November 2015 and July 2016 were analysed for TVCs, Pseudomonas and Legionella at an accredited laboratory, according to regulatory requirements. Out of ~6000 tanks surveyed, only 15 were selected based on the ability to take a water sample from the normal sampling hatch (located above the ball valve) and from the far end of the tank (usually requiring disassembly of the tank lid with risk of structural damage), and permission being granted by the site manager to undertake the additional investigation and sampling. Despite seasonal differences in water temperature, we found 100% compliance at the ball valve end. In contrast, 40% of the tanks exceeded the regulatory threshold for temperature at the far end of the tank in the summer months. Consequently, 20% of the tanks surveyed failed to trigger appropriate regulatory action based on microbiological analyses of the water sample taken under the ball valve compared with the far-end sample, using present-day standards. These data show that typical water samples collected for routine monitoring may often underestimate the microbiological status of the water entering the building, thereby increasing the risk of exposure to water-borne pathogens with potential public health implications. We propose that water storage tanks should be redesigned to allow access to the far end of tanks for routine monitoring purposes, and that water samples used to ascertain the regulatory compliance of stored water in tanks should be taken at the point at which water is abstracted for use in the building. PMID:29649274
Open-Ended Learning Environments: A Theoretical Framework and Model for Design.
ERIC Educational Resources Information Center
Hill, Janette R.; Land, Susan M.
This paper presents a framework and model for design of open-ended learning environments (OELEs). First, an overview is presented that addresses key characteristics of OELEs, including: use of meaningful, complex contexts; provision of tools and resources; learner reflection and self-monitoring; and social, material, or technological scaffolding.…
Cardiopulmonary Laboratory AFSC 904X0
1990-10-01
[Fragment of the occupational survey's task and equipment listings; only the items and their associated values survive extraction.] Tasks: set up positive end-expiratory pressure (PEEP) devices (100); J321 set up continuous positive airway pressure (CPAP) devices (100, 95); J298 assist physician in ... (truncated); ... suctioning procedures (95); J332 set up volume ventilators (93); F148 perform arterial punctures (93). Equipment: pressure ventilators (61); computerized pulmonary function analyzers (61); treadmills (59); Holter monitor equipment (57); CPAP equipment (54); pressure regulators (48).
NASA Astrophysics Data System (ADS)
Coppola, L.; Prieur, L.; Taupier-Letage, I.; Estournel, C.; Testor, P.; Lefevre, D.; Belamari, S.; LeReste, S.; Taillandier, V.
2017-08-01
During winter 2013, an intensive observation and monitoring campaign was performed in the north-western Mediterranean Sea to study the deep water formation process that drives thermohaline circulation and biogeochemical processes (HYMEX SOP2 and DEWEX projects). To observe intensively and continuously the impact of deep convection on oxygen (O2) ventilation, the observation strategy was based on an enhanced array of Argo-O2 floats monitoring the offshore dense water formation (DWF) area in the Gulf of Lion prior to and at the end of the convective period (December 2012 to April 2013). The intensive O2 measurements performed through shipborne CTD casts and Argo-O2 float deployments revealed an O2 inventory rapidly impacted by mixed-layer (ML) deepening on a monthly scale. The open-sea convection in winter 2013 ventilated the deep waters from mid-February to the end of May 2013. The newly ventilated dense water volume, based on an Apparent Oxygen Utilization (AOU) threshold, was estimated to be about 1.5 × 10^13 m^3 during the DWF episode, increasing the deep O2 concentrations from 196 to 205 µmol kg^-1 in the north-western basin.
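For reference, the Apparent Oxygen Utilization used to flag newly ventilated water is the deficit of measured oxygen relative to saturation at the in-situ temperature T and salinity S,

    \[
      \mathrm{AOU} = O_{2}^{\mathrm{sat}}(T, S) - O_{2}^{\mathrm{meas}},
    \]

and the ventilated volume is then obtained by summing the volumes of the grid cells or profiles whose AOU falls below the chosen threshold (the threshold value itself is not given in the abstract).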
Code of Federal Regulations, 2013 CFR
2013-07-01
... conditions of paragraphs (a)(4)(i) through (a)(4)(iv) of this section. Further, for OBD monitors that run... shall erase the permanent DTC at the end of a drive cycle if the monitor has run and made one or more... criteria have independently been satisfied: (i) The monitor has run and made one or more determinations...
gLExec and MyProxy integration in the ATLAS/OSG PanDA workload management system
NASA Astrophysics Data System (ADS)
Caballero, J.; Hover, J.; Litmaath, M.; Maeno, T.; Nilsson, P.; Potekhin, M.; Wenaus, T.; Zhao, X.
2010-04-01
Worker nodes on the grid exhibit great diversity, making it difficult to offer uniform processing resources. A pilot job architecture, which probes the environment on the remote worker node before pulling down a payload job, can help. Pilot jobs become smart wrappers, preparing an appropriate environment for job execution and providing logging and monitoring capabilities. PanDA (Production and Distributed Analysis), an ATLAS and OSG workload management system, follows this design. However, in the simplest (and most efficient) pilot submission approach of identical pilots carrying the same identifying grid proxy, end-user accounting by the site can only be done with application-level information (PanDA maintains its own end-user accounting), and end-user jobs run with the identity and privileges of the proxy carried by the pilots, which may be seen as a security risk. To address these issues, we have enabled PanDA to use gLExec, a tool provided by EGEE which runs payload jobs under an end-user's identity. End-user proxies are pre-staged in a credential caching service, MyProxy, and the information needed by the pilots to access them is stored in the PanDA DB. gLExec then extracts from the user's proxy the proper identity under which to run. We describe the deployment, installation, and configuration of gLExec, and how PanDA components have been augmented to use it. We describe how difficulties were overcome, and how security risks have been mitigated. Results are presented from OSG and EGEE Grid environments performing ATLAS analysis using PanDA and gLExec.
Berg, C.J.; Bundy, L.; Escoffery, C.; Haardörfer, R.; Kegler, M.C.
2013-01-01
SUMMARY Objectives To examine the feasibility of telephone-assisted placement of air nicotine monitors among low socio-economic intervention participants, and examine the use of this strategy in differentiating air nicotine concentrations in rooms where smoking is allowed from rooms where smoking is not allowed. Methods Forty participants were recruited from a county health department clinic and were enrolled in a brief smoke-free home policy intervention study. Twenty participants were selected at random for air nicotine monitor placement, and were instructed to telephone study staff who assisted them in monitor placement in their homes at the end of the intervention. Assessments were conducted at Weeks 0 and 8, with air nicotine assessment performed post-test. Results Of the 20 participants, 17 placed and returned the air nicotine monitors, and 16 also completed the follow-up survey. Follow-up survey data were not obtained on one monitor, and one participant who did not return the monitor completed the follow-up survey. Among those who reported a smoke-free policy (n=7), the average nicotine concentration was 0.62 μg/m3 [standard deviation (SD) 0.48]. Among those without a smoke-free policy (n=9), the average nicotine concentration was 2.30 μg/m3 (SD 2.04). Thus, the air nicotine concentration was significantly higher in those rooms where smoking was allowed [t(9, 11)=-2.39, P=0.04]. Conclusions The use of a telephone-assisted protocol for placement of air nicotine monitors was feasible. Despite the variability of air nicotine concentrations in rooms where smoking is allowed compared with rooms where smoking is not allowed, average concentrations were lower in smoke-free rooms. PMID:23480954
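The group comparison reported (mean air nicotine in rooms where smoking is allowed versus smoke-free rooms) is an ordinary two-sample t-test; a minimal sketch with hypothetical concentrations, not the study's data:

    import numpy as np
    from scipy import stats

    # Hypothetical air nicotine concentrations (ug/m3).
    smoke_free_rooms = np.array([0.2, 0.4, 0.5, 0.7, 0.9, 1.0, 0.6])
    smoking_allowed  = np.array([0.8, 1.5, 2.0, 2.6, 3.1, 4.8, 1.2, 2.3, 2.4])

    t_stat, p_value = stats.ttest_ind(smoke_free_rooms, smoking_allowed,
                                      equal_var=False)   # Welch's t-test
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")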
Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya
2018-05-01
This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory, and power that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that include IHE PCD-01, HL7, and BlueTooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptions that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.
Learning patterns of life from intelligence analyst chat
NASA Astrophysics Data System (ADS)
Schneider, Michael K.; Alford, Mark; Babko-Malaya, Olga; Blasch, Erik; Chen, Lingji; Crespi, Valentino; HandUber, Jason; Haney, Phil; Nagy, Jim; Richman, Mike; Von Pless, Gregory; Zhu, Howie; Rhodes, Bradley J.
2016-05-01
Our Multi-INT Data Association Tool (MIDAT) learns patterns of life (POL) of a geographical area from video analyst observations called out in textual reporting. Typical approaches to learning POLs from video make use of computer vision algorithms to extract locations in space and time of various activities. Such approaches are subject to the detection and tracking performance of the video processing algorithms. Numerous examples of human analysts monitoring live video streams annotating or "calling out" relevant entities and activities exist, such as security analysis, crime-scene forensics, news reports, and sports commentary. This user description typically corresponds with textual capture, such as chat. Although the purpose of these text products is primarily to describe events as they happen, organizations typically archive the reports for extended periods. This archive provides a basis to build POLs. Such POLs are useful for diagnosis to assess activities in an area based on historical context, and for consumers of products, who gain an understanding of historical patterns. MIDAT combines natural language processing, multi-hypothesis tracking, and Multi-INT Activity Pattern Learning and Exploitation (MAPLE) technologies in an end-to-end lab prototype that processes textual products produced by video analysts, infers POLs, and highlights anomalies relative to those POLs with links to "tracks" of related activities performed by the same entity. MIDAT technologies perform well, achieving, for example, a 90% F1-value on extracting activities from the textual reports.
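The 90% F1-value quoted is the usual harmonic mean of precision and recall on the extracted activities; a minimal sketch with hypothetical counts:

    def f1_score(true_positives, false_positives, false_negatives):
        """F1 = harmonic mean of precision and recall."""
        precision = true_positives / (true_positives + false_positives)
        recall = true_positives / (true_positives + false_negatives)
        return 2 * precision * recall / (precision + recall)

    # Hypothetical counts for activity extraction from analyst chat.
    print(round(f1_score(90, 10, 10), 2))   # 0.9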
Extensible Computational Chemistry Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-09
ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory (EMSL) construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high-performance compute resources. Bringing the power of these codes and resources to researchers' desktops, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with a sophisticated GUI and direct-manipulation visualization tools, submitting and monitoring calculations on remote high-performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.
A Monitoring System for the LHCb Data Flow
NASA Astrophysics Data System (ADS)
Barbosa, João; Gaspar, Clara; Jost, Beat; Frank, Markus; Cardoso, Luis G.
2017-06-01
The LHCb experiment uses the LHC accelerator to produce the collisions that generate the physics data necessary for analysis. The detector measures the results of these collisions at a rate of 40 MHz, and the resulting data are read out by a complex data acquisition (DAQ) system, which is briefly described in this paper. Distributed systems of such dimensions rely on monitoring and control systems that account for the numerous faults that can occur during operation. With this in mind, a new system was created to extend the monitoring of the readout system by providing an overview of what is happening in each stage of the DAQ process, starting with the hardware trigger performed right after the detector measurements and ending with the local storage of the experiment. This system, a complement to the current run control (experimental control system), is intended to shorten reaction times when a problem occurs by providing the operators with detailed information about where a fault is occurring. The architecture of the tool and its use by the experiment operators are described in this paper.
NASA Astrophysics Data System (ADS)
Magalhães, F.; Cunha, A.; Caetano, E.
2012-04-01
In order to evaluate the usefulness of approaches based on modal parameter tracking for structural health monitoring of bridges, a dynamic monitoring system was installed in September 2007 on a concrete arch bridge in the city of Porto, Portugal. The implementation of algorithms to perform continuous on-line identification of modal parameters from structural responses to ambient excitation (automated Operational Modal Analysis) has made it possible to create a very complete database of the time evolution of the bridge's modal characteristics over more than 2 years. This paper describes the strategy followed to minimize the effects of environmental and operational factors on the bridge's natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, damage identification is attempted with control charts. Finally, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
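The processing strategy described, regression of the natural frequencies on measured environmental factors, PCA on the residuals to absorb remaining operational effects, and a control chart on what is left, can be sketched as follows. The data are synthetic and the specific choices (a single temperature regressor, one discarded principal component, 3-sigma Shewhart limits) are assumptions for illustration, not the paper's exact configuration.

    import numpy as np

    # Synthetic hourly records: two identified natural frequencies (Hz) that
    # drift with deck temperature (deg C) plus identification noise.
    rng = np.random.default_rng(0)
    temp = rng.uniform(0.0, 35.0, size=500)
    freqs = np.column_stack([
        0.810 - 0.0004 * temp + 0.001 * rng.standard_normal(500),
        1.950 - 0.0010 * temp + 0.002 * rng.standard_normal(500),
    ])

    # 1) Static regression on temperature removes the explained variation.
    X = np.column_stack([np.ones_like(temp), temp])
    beta, *_ = np.linalg.lstsq(X, freqs, rcond=None)
    residuals = freqs - X @ beta

    # 2) PCA on the residuals: discard the leading component, which absorbs
    #    remaining (unmeasured) environmental/operational effects.
    centered = residuals - residuals.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    cleaned = centered - (centered @ vt[:1].T) @ vt[:1]

    # 3) Shewhart-type control chart: points outside +/- 3 sigma would flag a
    #    possible structural anomaly (e.g. a persistent frequency shift).
    limits = 3.0 * cleaned.std(axis=0)
    out_of_control = np.any(np.abs(cleaned) > limits, axis=1)
    print(f"{out_of_control.sum()} of {len(temp)} samples outside control limits")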
Moonasar, Devanand; Goga, Ameena Ebrahim; Frean, John; Kruger, Philip; Chandramohan, Daniel
2007-06-02
Malaria rapid diagnostic tests (RDTs) are relatively simple to perform and provide results quickly for making treatment decisions. However, the accuracy and application of RDT results depend on several factors such as quality of the RDT, storage, transport and end user performance. A cross-sectional survey to explore factors that affect the performance and use of RDTs was conducted in primary care facilities in South Africa. This study was conducted in three malaria-risk sub-districts of the Limpopo Province, in South Africa. Twenty nurses randomly selected from 17 primary health care facilities, three nurses from hospitals serving the study area and 10 other key informants, representing the managers of the malaria control programmes and of routine and research laboratories, were interviewed using semi-structured questionnaires. There was a high degree of efficiency in the ordering and distribution of RDTs; however, only 13/20 (65%) of the health facilities had appropriate air-conditioning and monitoring of room temperatures. Sixty percent (12/20) of the nurses did not receive any external training on conducting and interpreting RDTs. Fifty percent of nurses (10/20) reported RDT stock-outs. Only 3/20 nurses mentioned that they periodically checked RDT quality. Fifteen percent of nurses reported giving antimalarial drugs even if the RDT was negative. Storage, quality assurance, end user training and the use of RDT results for clinical decision making in primary care facilities in South Africa need to be improved. Further studies of the factors influencing the quality control of RDTs, their performance and the ways to improve their use are needed.
Moonasar, Devanand; Goga, Ameena Ebrahim; Frean, John; Kruger, Philip; Chandramohan, Daniel
2007-01-01
Background Malaria rapid diagnostic tests (RDTs) are relatively simple to perform and provide results quickly for making treatment decisions. However, the accuracy and application of RDT results depend on several factors such as quality of the RDT, storage, transport and end user performance. A cross-sectional survey to explore factors that affect the performance and use of RDTs was conducted in primary care facilities in South Africa. Methods This study was conducted in three malaria-risk sub-districts of the Limpopo Province, in South Africa. Twenty nurses randomly selected from 17 primary health care facilities, three nurses from hospitals serving the study area and 10 other key informants, representing the managers of the malaria control programmes and of routine and research laboratories, were interviewed using semi-structured questionnaires. Results There was a high degree of efficiency in the ordering and distribution of RDTs; however, only 13/20 (65%) of the health facilities had appropriate air-conditioning and monitoring of room temperatures. Sixty percent (12/20) of the nurses did not receive any external training on conducting and interpreting RDTs. Fifty percent of nurses (10/20) reported RDT stock-outs. Only 3/20 nurses mentioned that they periodically checked RDT quality. Fifteen percent of nurses reported giving antimalarial drugs even if the RDT was negative. Conclusion Storage, quality assurance, end user training and the use of RDT results for clinical decision making in primary care facilities in South Africa need to be improved. Further studies of the factors influencing the quality control of RDTs, their performance and the ways to improve their use are needed. PMID:17543127
Bosi, Emanuele; Scavini, Marina; Ceriello, Antonio; Cucinotta, Domenico; Tiengo, Antonio; Marino, Raffaele; Bonizzoni, Erminio; Giorgino, Francesco
2013-10-01
We aimed to evaluate the added value of intensive self-monitoring of blood glucose (SMBG), structured in timing and frequency, in noninsulin-treated patients with type 2 diabetes. The 12-month, randomized, clinical trial enrolled 1,024 patients with noninsulin-treated type 2 diabetes (median baseline HbA1c, 7.3% [IQR, 6.9-7.8%]) at 39 diabetes clinics in Italy. After standardized education, 501 patients were randomized to intensive structured monitoring (ISM) with 4-point glycemic profiles (fasting, preprandial, 2-h postprandial, and postabsorptive measurements) performed 3 days/week; 523 patients were randomized to active control (AC) with 4-point glycemic profiles performed at baseline and at 6 and 12 months. Two primary end points were tested in hierarchical order: HbA1c change at 12 months and percentage of patients at risk target for low and high blood glucose index. Intent-to-treat analysis showed greater HbA1c reductions over 12 months in ISM (-0.39%) than in AC patients (-0.27%), with a between-group difference of -0.12% (95% CI, -0.210 to -0.024; P=0.013). In the per-protocol analysis, the between-group difference was -0.21% (-0.331 to -0.089; P=0.0007). More ISM than AC patients achieved clinically meaningful reductions in HbA1c (>0.3, >0.4, or >0.5%) at study end (P<0.025). The proportion of patients reaching/maintaining the risk target at month 12 was similar in ISM (74.6%) and AC (70.1%) patients (P=0.131). At visits 2, 3, and 4, diabetes medications were changed more often in ISM than in AC patients (P<0.001). Use of structured SMBG improves glycemic control and provides guidance in prescribing diabetes medications in patients with relatively well-controlled noninsulin-treated type 2 diabetes.
Simulation of CNT-AFM tip based on finite element analysis for targeted probe of the biological cell
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yousefi, Amin Termeh, E-mail: at.tyousefi@gmail.com; Miyake, Mikio, E-mail: miyakejaist@gmail.com; Ikeda, Shoichiro, E-mail: sho16.ikeda@gmail.com
Carbon nanotubes (CNTs) are potentially ideal tips for atomic force microscopy (AFM) due to their robust mechanical properties, nanoscale diameter, and their ability to be functionalized with chemical and biological components at the tip ends. This contribution develops the idea of using CNTs as AFM tips in the computational analysis of biological cells. Finite element analysis was employed for each section, and the displacement of the nodes located in the contact area was monitored using an output database (ODB). This reliable integration of the CNT-AFM tip process provides a new class of high-performance nanoprobes for single biological cell analysis.
Algorithms for Monitoring Heart Rate and Respiratory Rate From the Video of a User’s Face
Sanyal, Shourjya
2018-01-01
Smartphone cameras can measure heart rate (HR) by detecting pulsatile photoplethysmographic (iPPG) signals from post-processing the video of a subject's face. The iPPG signal is often derived from variations in the intensity of the green channel, as shown by Poh et al. and Verkruysse et al. In this pilot study, we introduce a novel iPPG method that measures variations in the color of reflected light, i.e., Hue, and can therefore measure both HR and respiratory rate (RR) from the video of a subject's face. The study was performed on 25 healthy individuals (ages 20–30; 15 males and 10 females; skin color Fitzpatrick scale 1–6). For each subject we took two 20-second videos of the subject's face with minimal movement, one with flash ON and one with flash OFF. While recording the videos we simultaneously measured HR using a Biosync B-50DL Finger Heart Rate Monitor and RR by self-reporting. This paper shows that our proposed approach of measuring iPPG using Hue (range 0–0.1) gives more accurate readings than the Green channel. HR/Hue (range 0–0.1) (r = 0.9201, p-value = 4.1617, RMSE = 0.8887) is more accurate compared with HR/Green (r = 0.4916, p-value = 11.60172, RMSE = 0.9068). RR/Hue (range 0–0.1) (r = 0.6575, p-value = 0.2885, RMSE = 3.8884) is more accurate compared with RR/Green (r = 0.3352, p-value = 0.5608, RMSE = 5.6885).
We hope that this hardware-agnostic approach to detecting vital signs will have substantial impact in telemedicine and can be used to tackle challenges such as continuous non-contact monitoring of neonatal and elderly patients. An implementation of the algorithm can be found at https://pulser.thinkbiosolution.com
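The Hue-based pipeline described above lends itself to a compact illustration. The sketch below is an assumption-laden reconstruction of the general idea, not the authors' released implementation (which is linked above): average the Hue channel of each face frame within the 0–0.1 Hue range used in the paper, then take the dominant spectral peak in a plausible heart-rate band as the HR estimate. The function name, the 0.7–4 Hz band, and the use of matplotlib's RGB-to-HSV conversion are all illustrative choices.

```python
# Hedged sketch of a Hue-based iPPG pipeline (not the authors' code).
import numpy as np
from matplotlib.colors import rgb_to_hsv  # accepts (..., 3) arrays in [0, 1]

def estimate_hr_from_frames(frames, fps, hue_max=0.1, band=(0.7, 4.0)):
    """frames: iterable of HxWx3 RGB arrays in [0, 1]; returns HR in beats/min."""
    hue_series = []
    for frame in frames:
        hue = rgb_to_hsv(frame)[..., 0]
        roi = hue[(hue >= 0.0) & (hue <= hue_max)]      # Hue range 0-0.1 per the paper
        hue_series.append(roi.mean() if roi.size else 0.0)
    x = np.asarray(hue_series)
    x = x - x.mean()                                    # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])   # roughly 42-240 bpm
    hr_hz = freqs[in_band][np.argmax(spectrum[in_band])]
    return 60.0 * hr_hz
```

A respiratory-rate estimate would follow the same steps with a lower frequency band (roughly 0.1–0.5 Hz, i.e., 6–30 breaths per minute).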
NASA Technical Reports Server (NTRS)
Otaguro, W. S.; Kesler, L. O.; Land, K. C.; Rhoades, D. E.
1987-01-01
An intelligent tracker capable of robotic applications requiring guidance and control of platforms, robotic arms, and end effectors has been developed. This packaged system, capable of supervised autonomous robotic functions, is partitioned into a multiple processor/parallel processing configuration. The system currently interfaces to cameras but also has the capability to use three-dimensional inputs from scanning laser rangers. The inputs are fed into an image processing and tracking section where the camera inputs are conditioned for the multiple tracker algorithms. An executive section monitors the image processing and tracker outputs and performs all the control and decision processes. The present architecture of the system is presented, with discussion of its evolutionary growth for space applications. An autonomous rendezvous demonstration of this system was performed last year, and more realistic demonstrations now being planned are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, G; Ha, J; Zhou, S
Purpose: To examine and validate the absolute dose for total skin electron therapy (TSET) through an end-to-end test with a Rando phantom using optically stimulated luminescent dosimeters (OSLDs) and EBT3 radiochromic films. Methods: A Varian Trilogy linear accelerator equipped with the special procedure 6 MeV HDTSe- was used to perform TSET irradiations using a modified Stanford 6-dual-field technique. The absolute dose was calibrated using a Markus ion chamber at a reference depth of 1.3 cm at 100 cm SSD with a field size of 36 × 36 cm at the isocenter in solid water slabs. The absolute dose was cross-validated with a Farmer ion chamber. The dose rate in units of cGy/MU was then calibrated using the Markus chamber at the treatment position. OSLDs were used to independently verify the dose using the calibrated dose rate. Finally, a patient treatment plan (200 cGy/cycle) was delivered in the QA mode to a Rando phantom, which had 16 pairs of OSLDs and EBT3 films taped onto its surface at different anatomical positions. The doses recorded were read out to validate the absolute dosimetry for TSET. Results: The OSLD measurements were within 7% agreement with the planned dose except in the shoulder areas, where the recorded doses were on average 23% lower than planned. The EBT3 film measurements were within 10% agreement with the planned dose except in the shoulder and scalp vertex areas, where the recorded doses were on average 18% and 14% lower than planned, respectively. The OSLDs gave more consistent dose measurements than the EBT3 films. Conclusion: The absolute dosimetry for TSET was validated by an end-to-end test with a Rando phantom using OSLDs and EBT3 films. The beam calibration and monitor unit calculations were confirmed.
Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces
NASA Technical Reports Server (NTRS)
Ellman, Alvin; Carlton, Magdi
1993-01-01
The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of the DSN and for monitoring all multi-mission spacecraft tracking activities in real time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia, and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements in the computer-human interfaces became the dominant theme for the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.
A balloon-borne prototype for demonstrating the concept of JEM-EUSO
NASA Astrophysics Data System (ADS)
von Ballmoos, P.; Santangelo, A.; Adams, J. H.; Barrillon, P.; Bayer, J.; Bertaina, M.; Cafagna, F.; Casolino, M.; Dagoret, S.; Danto, P.; Distratis, G.; Dupieux, M.; Ebersoldt, A.; Ebisuzaki, T.; Evrard, J.; Gorodetzky, Ph.; Haungs, A.; Jung, A.; Kawasaki, Y.; Medina-Tanco, G.; Mot, B.; Osteria, G.; Parizot, E.; Park, I. H.; Picozza, P.; Prévôt, G.; Prieto, H.; Ricci, M.; Rodríguez Frías, M. D.; Roudil, G.; Scotti, V.; Szabelski, J.; Takizawa, Y.; Tusno, K.
2014-05-01
EUSO-BALLOON has been conceived as a pathfinder for JEM-EUSO, a mission concept for a space-borne wide-field telescope monitoring the Earth's nighttime atmosphere with the objective of recording the ultraviolet light from tracks initiated by ultra-high energy cosmic rays. Through a series of stratospheric balloon flights performed by the French Space Agency CNES, EUSO-BALLOON will serve as a test-bench for the key technologies of JEM-EUSO. EUSO-BALLOON shall perform an end-to-end test of all subsystems and components and prove the global detection chain, while improving our knowledge of the atmospheric and terrestrial ultraviolet background. The balloon instrument also has the potential to detect, for the first time from above, UV light generated by atmospheric air-showers, marking a milestone in the development of UHECR science and paving the way for any future large-scale, space-based ultra-high energy cosmic ray observatory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S. H., E-mail: shlee@aps.anl.gov; Yang, B. X., E-mail: bxyang@aps.anl.gov; Decker, G., E-mail: decker@aps.anl.gov
Accurate and stable x-ray beam position monitors (XBPMs) are key elements in obtaining the desired user beam stability in the Advanced Photon Source (APS). The next generation XBPMs for high heat load front ends (HHL FEs) have been designed to meet these requirements by utilizing Cu K-edge x-ray fluorescence (XRF) from a pair of copper absorbers and have been installed at the front ends (FEs) of the APS. Commissioning data showed a significant performance improvement over the existing photoemission-based XBPMs. While a similar design concept can be applied for the canted undulator front ends, where two undulator beams are separated by 1.0-mrad, the lower beam power (< 10 kW) per undulator allows us to explore lower-cost solutions based on Compton scattering from the diamond blades placed edge-on to the x-ray beam. A prototype of the Compton scattering XBPM system was installed at 24-ID-A in May 2015. In this report, the design and test results for XRF-based XBPM and Compton scattering based XBPM are presented. Ongoing research related to the development of the next generation XBPMs on thermal contact resistance of a joint between two solid bodies is also discussed.
Germanakis, Ioannis; Tsarouhas, Konstantinos; Fragkiadaki, Persefoni; Tsitsimpikou, Christina; Goutzourelas, Nikolaos; Champsas, Maria Christakis; Stagos, Demetrios; Rentoukas, Elias; Tsatsakis, Aristidis M
2013-11-01
The present study focuses on the short-term effects of repeated low-level administration of turinabol and methanabol on cardiac function in young rabbits (4 months old). The experimental scheme consisted of two oral administration periods lasting 1 month each, separated by a 1-month wash-out period. Serial echocardiographic evaluation was performed in all animals at the end of all three experimental periods. Oxidative stress markers were also monitored at the end of each administration period. Treated animals initially showed significantly increased myocardial mass and systolic cardiac output, which normalized by the end of the wash-out period. Re-administration led to increased cardiac output, though at the cost of a progressive reduction in myocardial mass. A dose-dependent trend towards impaired longitudinal systolic, diastolic, and global myocardial function was also observed. The adverse effects were more pronounced in the methanabol group. For both anabolic steroids studied, the low dose had no significant effects on the oxidative stress markers monitored, while the high dose created a hostile oxidative environment. In conclusion, anabolic steroid administration was found to have a potentially deleterious long-term effect on the growth of the immature heart and should be strongly discouraged, especially in young human subjects. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bazin, S.
2012-04-01
Landslide monitoring means the comparison of landslide characteristics like areal extent, speed of movement, surface topography and soil humidity from different periods in order to assess landslide activity. An ultimate "universal" methodology for this purpose does not exist; every technology has its own advantages and disadvantages. End-users should carefully consider each one to select the methodologies that represent the best compromise between pros and cons, and are best suited for their needs. Besides monitoring technology, there are many factors governing the choice of an Early Warning System (EWS). A people-centred EWS necessarily comprises five key elements: (1) knowledge of the risks; (2) identification, monitoring, analysis and forecasting of the hazards; (3) operational centre; (4) communication or dissemination of alerts and warnings; and (5) local capabilities to respond to the warnings received. The expression "end-to-end warning system" is also used to emphasize that EWSs need to span all steps from hazard detection through to community response. The aim of the present work is to provide guidelines for establishing the different components for landslide EWSs. One of the main deliverables of the EC-FP7 SafeLand project addresses the technical and practical issues related to monitoring and early warning for landslides, and identifies the best technologies available in the context of both hazard assessment and design of EWSs. This deliverable targets the end-users and aims to facilitate the decision process by providing guidelines. For the purpose of sharing the globally accumulated expertise, a screening study was done on 14 EWSs from 8 different countries. On these bases, the report presents a synoptic view of existing monitoring methodologies and early-warning strategies and their applicability for different landslide types, scales and risk management steps. Several comprehensive checklists and toolboxes are also included to support informed decisions. The deliverable was compiled with contributions from experts on landslides, monitoring technologies, remote sensing, and social researchers from 16 European institutions. The deliverable addresses one of the main objectives of the SafeLand project, namely to merge experience and expert judgment and create synergies on European level towards guidelines for early warning and to make these results available to end-users and local stakeholders.
PERFORM: A System for Monitoring, Assessment and Management of Patients with Parkinson's Disease
Tzallas, Alexandros T.; Tsipouras, Markos G.; Rigas, Georgios; Tsalikakis, Dimitrios G.; Karvounis, Evaggelos C.; Chondrogiorgi, Maria; Psomadellis, Fotis; Cancela, Jorge; Pastorino, Matteo; Waldmeyer, María Teresa Arredondo; Konitsiotis, Spiros; Fotiadis, Dimitrios I.
2014-01-01
In this paper, we describe the PERFORM system for the continuous remote monitoring and management of Parkinson's disease (PD) patients. The PERFORM system is an intelligent closed-loop system that seamlessly integrates a wide range of wearable sensors constantly monitoring several motor signals of the PD patients. Acquired data are pre-processed by advanced knowledge processing methods and integrated by fusion algorithms to allow health professionals to remotely monitor the overall status of the patients, adjust medication schedules, and personalize treatment. The information collected by the sensors (accelerometers and gyroscopes) is processed by several classifiers. As a result, it is possible to evaluate and quantify the PD motor symptoms related to end-of-dose deterioration (tremor, bradykinesia, freezing of gait (FoG)) as well as those related to over-dose concentration (levodopa-induced dyskinesia (LID)). Based on this information, together with information derived from tests performed with a virtual reality glove and information about medication and food intake, a patient-specific profile can be built. In addition, the patient-specific profile is compared with the patient's evaluations over the last week and the last month to understand whether the status is stable, improving, or worsening. Based on that, the system analyses whether a medication change is needed (always under medical supervision), and in this case information about the medication change proposal is sent to the patient. The performance of the system has been evaluated in real-life conditions, the accuracy and acceptability of the system to the PD patients and healthcare professionals have been tested, and a comparison with the standard routine clinical evaluation done by the PD patients' physician has been carried out. The PERFORM system is used by PD patients in a simple, safe, and non-invasive way for long-term recording of their motor status, thus offering the clinician a precise, long-term, and objective view of the patient's motor status and drug/food intake. Thus, with the PERFORM system the clinician can remotely receive precise information on the PD patient's status over previous days and define the optimal therapeutic treatment. PMID:25393786
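As a concrete, hedged illustration of the kind of feature such classifiers might start from (not the PERFORM classifiers themselves), the sketch below computes the fraction of accelerometer signal power in a nominal 3–7 Hz tremor band and compares it against a purely illustrative threshold; the band edges, the threshold, and the function names are assumptions.

```python
# Hedged illustration of a tremor-oriented wearable-sensor feature.
import numpy as np

def tremor_band_power(accel, fs, band=(3.0, 7.0)):
    """accel: 1-D acceleration magnitude signal; fs: sampling rate in Hz.
    Returns the fraction of spectral power inside the nominal tremor band."""
    x = np.asarray(accel, dtype=float)
    x = x - x.mean()                                  # drop gravity/DC offset
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / spectrum.sum()

def looks_like_tremor(accel, fs, threshold=0.4):      # threshold is illustrative only
    return tremor_band_power(accel, fs) > threshold
```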
Lagosky, Stephanie; Bartlett, Doreen; Shaw, Lynn
2016-01-01
Parents who care for young children with chronic conditions are knowledge users. Their efforts, time, and energy to source, consider, and monitor information add to the 'invisible' work of parents in making decisions about care, school transitions, and interventions. Little is known or understood about the work of parents as knowledge users. The aim was to understand the knowledge use patterns of parents caring for their young children with cerebral palsy (CP) and how these patterns may be monitored. An embedded case study methodology was used. In-depth qualitative interviews and visual mapping were employed to collect and analyze data based on the experiences of three mothers of young children with CP. Knowledge use in parents caring for their young children with CP is multi-factorial, complex, and temporal. The findings resulted in a provisional model elaborating on the ways knowledge is used by parents and how it may be monitored. The visual mapping of pathways and actions of parents as end users makes the processes of knowledge use more visible and open to being valued and appreciated by others. The provisional model has implications for knowledge mobilization as a strategy in childhood rehabilitation and for the facilitation of knowledge use in the lives of families with children with chronic health conditions.
Gear Fault Detection Effectiveness as Applied to Tooth Surface Pitting Fatigue Damage
NASA Technical Reports Server (NTRS)
Lewicki, David G.; Dempsey, Paula J.; Heath, Gregory F.; Shanthakumaran, Perumal
2009-01-01
A study was performed to evaluate fault detection effectiveness as applied to gear tooth pitting fatigue damage. Vibration and oil-debris monitoring (ODM) data were gathered from 24 sets of spur pinion and face gears run during a previous endurance evaluation study. Three common condition indicators (RMS, FM4, and NA4) were computed from the time-averaged vibration data and used together with the ODM to evaluate their performance for gear fault detection. The NA4 parameter was shown to be a very good condition indicator for the detection of gear tooth surface pitting failures. The FM4 and RMS parameters performed average to below average in detecting gear tooth surface pitting failures. The ODM sensor was successful in detecting a significant amount of debris from all the gear tooth pitting fatigue failures. Excluding outliers, the average cumulative mass at the end of a test was 40 mg.
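For readers unfamiliar with the condition indicators named above, the sketch below shows how RMS and FM4 are conventionally computed; FM4 is the normalized kurtosis of the "difference" signal, i.e., the time-synchronous-averaged vibration signal with the regular gear-mesh components removed. That removal step is assumed to have been done upstream and is not shown, and NA4 (which additionally normalizes by a run-ensemble average of the variance) is omitted.

```python
# Conventional definitions of two of the gear condition indicators above.
import numpy as np

def rms(tsa_signal):
    """Root-mean-square level of the time-synchronous-averaged (TSA) signal."""
    x = np.asarray(tsa_signal, dtype=float)
    return np.sqrt(np.mean(x ** 2))

def fm4(difference_signal):
    """Normalized kurtosis of the difference signal (TSA minus regular mesh
    components). Values close to 3 indicate a Gaussian-like, undamaged gear;
    localized tooth damage drives the value upward."""
    d = np.asarray(difference_signal, dtype=float)
    d = d - d.mean()
    n = d.size
    return n * np.sum(d ** 4) / (np.sum(d ** 2) ** 2)
```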
Spacelab output processing system architectural study
NASA Technical Reports Server (NTRS)
1977-01-01
Two different system architectures are presented. The two architectures are derived from two different data flows within the Spacelab Output Processing System. The major differences between these system architectures are in the position of the decommutation function (the first architecture performs decommutation in the latter half of the system and the second architecture performs that function in the front end of the system). In order to be examined, the system was divided into five stand-alone subsystems; Work Assembler, Mass Storage System, Output Processor, Peripheral Pool, and Resource Monitor. The work load of each subsystem was estimated independent of the specific devices to be used. The candidate devices were surveyed from a wide sampling of off-the-shelf devices. Analytical expressions were developed to quantify the projected workload in conjunction with typical devices which would adequately handle the subsystem tasks. All of the study efforts were then directed toward preparing performance and cost curves for each architecture subsystem.
Real-time measurement of mental workload: A feasibility study
NASA Technical Reports Server (NTRS)
Kramer, Arthur; Humphrey, Darryl; Sirevaag, Erik; Mecklinger, Axel
1990-01-01
The primary goal of the study was to explore the utility of event-related brain potentials (ERP) as real-time measures of workload. To this end, subjects performed two different tasks both separately and together. One task required that subjects monitor a bank of constantly changing gauges and detect critical deviations. Difficulty was varied by changing the predictability of the gauges. The second task was mental arithmetic. Difficulty was varied by requiring subjects to perform operations on either two or three columns of numbers. Two conditions that could easily be distinguished on the basis of performance measures were selected for the real-time evaluation of ERPs. A bootstrapping approach was adopted in which one thousand samples of n trials (n = 1, 3, 5 ...65) were classified using several measures of P300 and Slow Wave amplitude. Classification accuracies of 85 percent were achieved with 25 trials. Results are discussed in terms of potential enhancements for real-time recording.
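A minimal sketch of the bootstrapping idea, under stated assumptions, is given below: it repeatedly draws n single trials per condition, averages them, and classifies the averages with a simple midpoint threshold on P300 amplitude. The threshold rule and the assumed direction of the P300 effect are illustrative and are not the classification measures used in the study.

```python
# Hedged reconstruction of a bootstrap classification-accuracy curve.
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_accuracy(low_load, high_load, n, n_samples=1000):
    """low_load / high_load: 1-D arrays of single-trial P300 amplitudes (uV)."""
    low_load = np.asarray(low_load, dtype=float)
    high_load = np.asarray(high_load, dtype=float)
    threshold = (low_load.mean() + high_load.mean()) / 2.0   # simple midpoint rule
    hits = 0
    for _ in range(n_samples):
        avg_low = rng.choice(low_load, size=n, replace=True).mean()
        avg_high = rng.choice(high_load, size=n, replace=True).mean()
        # Illustrative rule: an average above the midpoint is labelled "low load";
        # both conditions are scored on every draw.
        hits += int(avg_low > threshold) + int(avg_high <= threshold)
    return hits / (2 * n_samples)

# Accuracy as a function of the number of averaged trials, n = 1, 3, 5, ..., 65:
# curve = {n: bootstrap_accuracy(low_trials, high_trials, n) for n in range(1, 66, 2)}
```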
NASA Astrophysics Data System (ADS)
Abbou, S.; Dillet, J.; Maranzana, G.; Didierjean, S.; Lottin, O.
2017-02-01
Proton exchange membrane (PEM) fuel cells can operate with a dead-ended anode in order to reduce system cost and complexity compared with hydrogen re-circulation systems. In the first part of this work, we showed that localized fuel starvation events may occur because of water and nitrogen accumulation on the anode side, which can be particularly damaging to cell performance. To prevent these degradations, the anode compartment must be purged, which may decrease overall system efficiency because of significant hydrogen waste. In the second part, we present several purge strategies intended to minimize both hydrogen waste and membrane-electrode assembly (MEA) degradation during dead-ended anode operation. A linear segmented cell with reference electrodes was used to monitor simultaneously the current density distribution along the gas channel and the time evolution of local anode and cathode potentials. To assess MEA damage, the platinum electrochemical surface area (ECSA) and cell performance were periodically measured. The results showed that dead-end operation with the anode plate maintained at a temperature 5 °C hotter than the cathode plate limits water accumulation on the anode side, significantly reducing purge frequency (and thus hydrogen losses) as well as MEA damage. As the nitrogen contribution to hydrogen starvation is predominant in this thermal configuration, we also tested a microleakage solution to continuously discharge most of the nitrogen accumulating on the anode side while ensuring low hydrogen losses and minimal ECSA losses, provided the right microleakage flow rate is chosen.
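The following is a speculative sketch of a purge-scheduling rule in the spirit of the strategies discussed above; the trigger quantities (cell voltage and the current of the segment nearest the dead end) are taken from the abstract, but the thresholds, reference handling, and function name are assumptions rather than the authors' actual criteria.

```python
# Hedged sketch of a dead-ended-anode purge trigger (illustrative values only).
def purge_needed(cell_voltage, end_segment_current, v_ref, i_ref,
                 max_v_drop=0.05, max_i_drop=0.30):
    """All quantities in consistent units; v_ref and i_ref are the values
    recorded just after the previous purge. A purge is requested when either
    the cell voltage or the current in the channel-end segment has sagged by
    more than the allowed fraction, signalling water/nitrogen accumulation."""
    voltage_sag = (v_ref - cell_voltage) / v_ref
    current_sag = (i_ref - end_segment_current) / i_ref
    return voltage_sag > max_v_drop or current_sag > max_i_drop
```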
To track or not to track: user reactions to concepts in longitudinal health monitoring.
Beaudin, Jennifer S; Intille, Stephen S; Morris, Margaret E
2006-01-01
Advances in ubiquitous computing, smart homes, and sensor technologies enable novel, longitudinal health monitoring applications in the home. Many home monitoring technologies have been proposed to detect health crises, support aging-in-place, and improve medical care. Health professionals and potential end users in the lay public, however, sometimes question whether home health monitoring is justified given the cost and potential invasion of privacy. The aim of the study was to elicit specific feedback from health professionals and laypeople about how they might use longitudinal health monitoring data for proactive health and well-being. Interviews were conducted with 8 health professionals and 26 laypeople. Participants were asked to evaluate mock data visualization displays that could be generated by novel home monitoring systems. The mock displays were used to elicit reactions to longitudinal monitoring in the home setting as well as what behaviors, events, and physiological indicators people were interested in tracking. Based on the qualitative data provided by the interviews, lists of benefits of and concerns about health tracking from the perspectives of the practitioners and laypeople were compiled. Variables of particular interest to the interviewees, as well as their specific ideas for applications of collected data, were documented. Based upon these interviews, we recommend that ubiquitous "monitoring" systems may be more readily adopted if they are developed as tools for personalized, longitudinal self-investigation that help end users learn about the conditions and variables that impact their social, cognitive, and physical health.
Monitoring Training Load in Indian Male Swimmers
MAJUMDAR, PRALAY; SRIVIDHYA, SRI
2010-01-01
The present study was initiated to monitor training load and the magnitude of its impact on hormone concentrations, namely testosterone, cortisol, and the T/C (testosterone/cortisol) ratio, during the three phases of training (i.e., preparatory, pre-competitive, and competitive phases) in Indian male swimmers preparing for the 2010 Commonwealth Games. Blood samples were collected at the end of each training phase and hormone concentrations were determined by ELISA. Our results reveal that the testosterone concentration and the T/C ratio significantly decreased and the cortisol concentration increased in the subsequent periodized cycle. The change in hormone concentrations was associated with the intensity and duration of individual exercise sessions. The greatest performance enhancement was realized with the lowest plasma cortisol, the highest testosterone, and a high T/C ratio. Monitoring of these hormones also has implications for identifying and preventing overreaching in swimmers. PMID:27182335
Improving Robotic Operator Performance Using Augmented Reality
NASA Technical Reports Server (NTRS)
Maida, James C.; Bowen, Charles K.; Pace, John W.
2007-01-01
The Special Purpose Dexterous Manipulator (SPDM) is a two-armed robot that functions as an extension to the end effector of the Space Station Robotics Manipulator System (SSRMS), currently in use on the International Space Station (ISS). Crew training for the SPDM is accomplished using a robotic hardware simulator, which performs most of SPDM functions under normal static Earth gravitational forces. Both the simulator and SPDM are controlled from a standard robotic workstation using a laptop for the user interface and three monitors for camera views. Most operations anticipated for the SPDM involve the manipulation, insertion, and removal of any of several types of Orbital Replaceable Unit (ORU), modules which control various ISS functions. Alignment tolerances for insertion of the ORU into its receptacle are 0.25 inch and 0.5 degree from nominal values. The pre-insertion alignment task must be performed within these tolerances by using available video camera views of the intrinsic features of the ORU and receptacle, without special registration markings. Since optimum camera views may not be available, and dynamic orbital lighting conditions may limit periods of viewing, a successful ORU insertion operation may require an extended period of time. This study explored the feasibility of using augmented reality (AR) to assist SPDM operations. Geometric graphical symbols were overlaid on one of the workstation monitors to afford cues to assist the operator in attaining adequate pre-insertion ORU alignment. Twelve skilled subjects performed eight ORU insertion tasks using the simulator with and without the AR symbols in a repeated measures experimental design. Results indicated that using the AR symbols reduced pre-insertion alignment error for all subjects and reduced the time to complete pre-insertion alignment for most subjects.
Room air monitor for radioactive aerosols
Balmer, David K.; Tyree, William H.
1989-04-11
A housing assembly for use with a room air monitor for the simultaneous collection and counting of suspended particles includes a casing containing a combination detector-preamplifier system at one end, a filter system at the other end, and an air flow system consisting of an air inlet formed in the casing between the detector-preamplifier system and the filter system and an air passageway extending from the air inlet through the casing and out the end opposite the detector-preamplifier combination. The filter system collects suspended particles transported directly through the housing by means of the air flow system, and these particles are detected and examined for radioactivity by the detector-preamplifier system. The U.S. Government has rights in this invention pursuant to Contract No. DE-AC04-76DP03533 between the Department of Energy and Rockwell International Corporation.
An Optimized Autonomous Space In-situ Sensorweb (OASIS) for Volcano Monitoring
NASA Astrophysics Data System (ADS)
Song, W.; Shirazi, B.; Lahusen, R.; Chien, S.; Kedar, S.; Webb, F.
2006-12-01
In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, we are developing a prototype real-time Optimized Autonomous Space In-situ Sensorweb. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been in continuous eruption since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) efficient self-organization algorithm of sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real-time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of triggering the other. Sensor-web data acquisition and dissemination will be accomplished through the use of SensorML language standards for geospatial information. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform.
Kalfa, David; Chai, Paul; Bacha, Emile
2014-08-01
A significant inverse relationship of surgical institutional and surgeon volumes to outcome has been demonstrated in many high-stakes surgical specialties. By and large, the same results were found in pediatric cardiac surgery, for which a more thorough analysis has shown that this relationship depends on case complexity and type of surgical procedures. Lower-volume programs tend to underperform larger-volume programs as case complexity increases. High-volume pediatric cardiac surgeons also tend to have better results than low-volume surgeons, especially at the more complex end of the surgery spectrum (e.g., the Norwood procedure). Nevertheless, this trend for lower mortality rates at larger centers is not universal. All larger programs do not perform better than all smaller programs. Moreover, surgical volume seems to account for only a small proportion of the overall between-center variation in outcome. Intraoperative technical performance is one of the most important parts, if not the most important part, of the therapeutic process and a critical component of postoperative outcome. Thus, the use of center-specific, risk-adjusted outcome as a tool for quality assessment together with monitoring of technical performance using a specific score may be more reliable than relying on volume alone. However, the relationship between surgical volume and outcome in pediatric cardiac surgery is strong enough that it ought to support adapted and well-balanced health care strategies that take advantage of the positive influence that higher center and surgeon volumes have on outcome.
Flow Cytometry Technician | Center for Cancer Research
PROGRAM DESCRIPTION The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). KEY ROLES/RESPONSIBILITIES The Flow Cytometry Core (Flow Core) of the Cancer and Inflammation Program (CIP) is a service core which supports the research efforts of the CCR by providing expertise in the field of flow cytometry (using analyzers and sorters) with the goal of gaining a more thorough understanding of the biology of cancer and cancer cells. The Flow Core provides service to 12-15 CIP laboratories and more than 22 non-CIP laboratories. Flow Core staff provide technical advice on the experimental design of applications, which include immunological phenotyping, cell function assays, and cell cycle analysis. Work is performed per customer requirements, and no independent research is involved. The Flow Cytometry Technician will be responsible for: monitoring the performance of and maintaining high dimensional flow cytometer analyzers and cell sorters; operating high dimensional flow cytometer analyzers and cell sorters; monitoring lab supply levels, ordering lab supplies, and performing various record keeping responsibilities; and assisting in the training of scientific end users on the use of flow cytometry in their research, as well as on how to operate and troubleshoot the bench-top analyzer instruments. Experience with sterile technique and tissue culture is required.
An Open Source Low-Cost Wireless Control System for a Forced Circulation Solar Plant
Salamone, Francesco; Belussi, Lorenzo; Danza, Ludovico; Ghellere, Matteo; Meroni, Italo
2015-01-01
The article describes the design phase, development, and practical application of a low-cost control system for a forced circulation solar plant in an outdoor test cell located near Milan. Such a system uses an electric pump to circulate heat transfer fluid between the solar thermal panel and the storage tank. The running plant temperatures are the fundamental parameters for evaluating system performance and proper operation, and the control and management system has to take them into account. A solar-energy-powered, wireless smart object was developed that is able to monitor the running temperatures of a solar thermal system and is aimed at moving beyond standard monitoring approaches to achieve a low-cost and customizable device, even in terms of installation in different environmental conditions. To this end, two types of communication were used: the first, a low-cost link based on the ZigBee protocol, is used for control purposes and can be customized according to specific needs, while the second, based on the Bluetooth protocol, is used for data display. PMID:26556356
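A minimal sketch of the control rule such a system could run is shown below; it is an assumption for illustration, not the authors' firmware. It implements a standard differential thermostat with hysteresis: the circulation pump starts when the collector is sufficiently hotter than the tank and stops once the useful temperature gradient collapses. The 6 °C / 2 °C thresholds are placeholder values.

```python
# Hedged sketch of a differential thermostat for a forced circulation solar plant,
# fed by collector and tank temperatures received from the wireless nodes.
def pump_command(collector_temp_c, tank_temp_c, pump_is_on,
                 delta_on=6.0, delta_off=2.0):
    """Return True to run the pump; delta_on/delta_off define the hysteresis band."""
    delta = collector_temp_c - tank_temp_c
    if not pump_is_on and delta >= delta_on:
        return True        # collector clearly hotter: start circulating
    if pump_is_on and delta <= delta_off:
        return False       # little useful gradient left: stop the pump
    return pump_is_on      # otherwise keep the current state
```

The hysteresis band is the usual way to avoid rapid pump cycling when the temperature difference hovers around a single threshold.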
The use of multisensor data for robotic applications
NASA Technical Reports Server (NTRS)
Abidi, M. A.; Gonzalez, R. C.
1990-01-01
The feasibility of realistic autonomous space manipulation tasks using multisensory information is shown through two experiments involving a fluid interchange system and a module interchange system. In both cases, autonomous location of the mating element, autonomous location of the guiding light target, mating, and demating of the system were performed. Specifically, vision-driven techniques were implemented to determine the arbitrary two-dimensional position and orientation of the mating elements as well as the arbitrary three-dimensional position and orientation of the light targets. The robotic system was also equipped with a force/torque sensor that continuously monitored the six components of force and torque exerted on the end effector. Using vision, force, torque, proximity, and touch sensors, the two experiments were completed successfully and autonomously.
Totally opportunistic routing algorithm (TORA) for underwater wireless sensor network.
Rahman, Ziaur; Hashim, Fazirulhisyam; Rasid, Mohd Fadlee A; Othman, Mohamed
2018-01-01
The Underwater Wireless Sensor Network (UWSN) has emerged as a promising networking technique to monitor and explore oceans. Research on acoustic communication has been conducted for decades but has focused mostly on physical-layer issues such as high latency, low bandwidth, and high bit error rates. However, the data gathering process is still severely limited in UWSNs due to channel impairment. One way to improve data collection in UWSNs is the design of the routing protocol. Opportunistic Routing (OR) is an emerging technique that has the ability to improve the performance of wireless networks, notably acoustic networks. In this paper, we propose an anycast, geographical, and totally opportunistic routing algorithm for UWSNs, called TORA. Our proposed scheme is designed to avoid horizontal transmission, reduce end-to-end delay, overcome the problem of void nodes, and maximize throughput and energy efficiency. We use TOA (Time of Arrival) and range-based equations to localize nodes recursively within the network. Once nodes are localized, their location coordinates and residual energy are used as a metric to select the best available forwarder. All data packets may or may not be acknowledged based on the status of the sender and receiver. Thus, the number of acknowledgments for a particular data packet may vary from zero to 2-hop. Extensive simulations were performed to evaluate the performance of the proposed scheme under high network traffic load for very sparse and very dense network scenarios. Simulation results show that TORA significantly improves network performance when compared with relevant existing routing protocols, such as VBF, HH-VBF, VAPR, and H2DAB, in terms of energy consumption, packet delivery ratio, average end-to-end delay, average hop-count, and propagation deviation factor. TORA reduces energy consumption by an average of 35% relative to VBF, 40% relative to HH-VBF, 15% relative to VAPR, and 29% relative to H2DAB, whereas the packet delivery ratio is improved by an average of 43% relative to VBF, 26% relative to HH-VBF, 15% relative to VAPR, and 25% relative to H2DAB. Moreover, the average end-to-end delay is reduced by 70% relative to VBF, 69% relative to HH-VBF, 46% relative to VAPR, and 73% relative to H2DAB. Furthermore, the average hop-count is improved by 57%, 53%, 16%, and 31% compared with VBF, HH-VBF, VAPR, and H2DAB, respectively. Also, the propagation delay is reduced by 34%, 30%, 15%, and 23% compared with VBF, HH-VBF, VAPR, and H2DAB, respectively.
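To make the forwarder-selection idea concrete, the sketch below shows a greedy choice in the spirit described above (the exact scoring and weights used in TORA may differ): neighbours that make real vertical progress toward the surface sink are scored by a weighted combination of progress and residual energy, and purely horizontal or backward hops are skipped. The weights, the minimum-progress cutoff, and the dictionary layout are assumptions.

```python
# Hedged sketch of a location- and energy-aware forwarder choice for a UWSN.
def select_forwarder(sender, neighbours, w_progress=0.7, w_energy=0.3,
                     min_progress=1.0):
    """sender/neighbours: dicts with 'depth' (metres, larger = deeper) and
    'energy' (residual fraction in 0-1). Returns the chosen neighbour or None."""
    best, best_score = None, float("-inf")
    for node in neighbours:
        progress = sender["depth"] - node["depth"]   # upward progress toward the sink
        if progress < min_progress:                  # skip horizontal/backward hops
            continue
        score = w_progress * progress + w_energy * node["energy"]
        if score > best_score:
            best, best_score = node, score
    return best                                      # None signals a void region
```

When the function returns None, a void-handling fallback (for example, a recovery route or a delayed retry) would be needed, which this sketch does not model.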
Surface Infiltration Rates of Permeable Surfaces: Six Month ...
At the end of October 2009, EPA opened a parking lot on the Edison Environmental Center that included three parking rows of permeable pavement. The construction was a cooperative effort among EPA's Office of Administration and Resources Management, the National Risk Management Research Laboratory, and the facility owner, Region 2. The lot serves as an active parking area for facility staff and visitors and also as a research platform. Key unknowns in the application of green infrastructure include long-term performance and maintenance requirements. The perceived uncertainty in these is a barrier to widespread adoption of permeable surfaces for stormwater management. EPA recognizes the need for credible long-term performance and maintenance data and has begun a long-term monitoring effort on this installation. This document outlines the methods and results of the surface infiltration monitoring of the permeable parking surfaces during the first six months of operation.
Temperature - Emissivity Separation Assessment in a Sub-Urban Scenario
NASA Astrophysics Data System (ADS)
Moscadelli, M.; Diani, M.; Corsini, G.
2017-10-01
In this paper, a methodology that aims at evaluating the effectiveness of different temperature-emissivity separation (TES) strategies is presented. The methodology takes into account the specific material of interest in the monitored scenario, the sensor characteristics, and errors in the atmospheric compensation step. The methodology is proposed in order to predict and analyse algorithm performance during the planning of a remote sensing mission aimed at discovering specific materials of interest in the monitored scenario. As a case study, the proposed methodology is applied to a real airborne data set of a suburban scenario. To address the TES problem, three state-of-the-art algorithms and a recently proposed one are investigated: the Temperature-Emissivity Separation '98 (TES-98) algorithm, the Stepwise Refining TES (SRTES) algorithm, the Linear Piecewise TES (LTES) algorithm, and the Optimized Smoothing TES (OSTES) algorithm. Finally, the accuracies obtained with real data and those predicted by means of the proposed methodology are compared and discussed.
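As background, the sketch below shows the simple normalized-emissivity step that many TES schemes, including TES-98, build on; it is a generic illustration under stated assumptions, not an implementation of TES-98, SRTES, LTES, or OSTES. Radiances are assumed to be already atmospherically compensated and expressed in W m-2 sr-1 um-1, and the assumed maximum emissivity of 0.97 is a conventional placeholder.

```python
# Hedged normalized-emissivity sketch: assume a maximum emissivity, invert the
# Planck function per band for a brightness temperature, keep the hottest band
# as the surface temperature, and take emissivities as measured/blackbody ratios.
import numpy as np

C1 = 1.191042e8    # W um^4 m^-2 sr^-1  (2*h*c^2 in these units)
C2 = 1.438777e4    # um K               (h*c/k)

def planck_radiance(wavelength_um, temp_k):
    return C1 / (wavelength_um ** 5 * (np.exp(C2 / (wavelength_um * temp_k)) - 1.0))

def nem_separation(radiance, wavelength_um, eps_max=0.97):
    radiance = np.asarray(radiance, dtype=float)
    wavelength_um = np.asarray(wavelength_um, dtype=float)
    # Brightness temperature of each band under the assumed emissivity.
    t_bands = C2 / (wavelength_um * np.log(C1 * eps_max /
                                           (wavelength_um ** 5 * radiance) + 1.0))
    t_surface = t_bands.max()                       # hottest band drives the estimate
    emissivity = radiance / planck_radiance(wavelength_um, t_surface)
    return t_surface, emissivity
```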
Crossing the quality chasm: the role of information technology departments.
Weir, Charlene R; Hicken, Bret L; Rappaport, Hank Steven; Nebeker, Jonathan R
2006-01-01
Integrating information technology (IT) into medical settings is considered essential to transforming hospitals into 21st-century health care institutions. Yet the role of IT departments in maximizing the effectiveness of information systems is not well understood. This article reports a 3-round Delphi panel of Veterans Administration personnel experienced with provider order entry electronic systems. In round 1, 35 administrative, clinical, and IT personnel answered 10 open-ended questions about IT strategies and structures that best support successful transformation. In round 2, panelists rated item importance and ranked proposed strategies. In round 3, panelists received aggregate feedback and rerated the items. Four domains emerged from round 1: IT organization, IT performance monitoring, user-support activities, and core IT responsibilities (eg, computer security, training). In rounds 2 and 3, IT performance monitoring was rated the most important, closely followed by clinical support. Strategies associated with each domain are identified and discussed.
High throughput wafer defect monitor for integrated metrology applications in photolithography
NASA Astrophysics Data System (ADS)
Rao, Nagaraja; Kinney, Patrick; Gupta, Anand
2008-03-01
The traditional approach to semiconductor wafer inspection is based on the use of stand-alone metrology tools, which while highly sensitive, are large, expensive and slow, requiring inspection to be performed off-line and on a lot sampling basis. Due to the long cycle times and sparse sampling, the current wafer inspection approach is not suited to rapid detection of process excursions that affect yield. The semiconductor industry is gradually moving towards deploying integrated metrology tools for real-time "monitoring" of product wafers during the manufacturing process. Integrated metrology aims to provide end-users with rapid feedback of problems during the manufacturing process, and the benefit of increased yield, and reduced rework and scrap. The approach of monitoring 100% of the wafers being processed requires some trade-off in sensitivity compared to traditional standalone metrology tools, but not by much. This paper describes a compact, low-cost wafer defect monitor suitable for integrated metrology applications and capable of detecting submicron defects on semiconductor wafers at an inspection rate of about 10 seconds per wafer (or 360 wafers per hour). The wafer monitor uses a whole wafer imaging approach to detect defects on both un-patterned and patterned wafers. Laboratory tests with a prototype system have demonstrated sensitivity down to 0.3 µm on un-patterned wafers and down to 1 µm on patterned wafers, at inspection rates of 10 seconds per wafer. An ideal application for this technology is preventing photolithography defects such as "hot spots" by implementing a wafer backside monitoring step prior to exposing wafers in the lithography step.
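A hedged sketch of the kind of whole-wafer comparison such a monitor could perform is given below; it is an illustration, not the vendor's algorithm. It subtracts a defect-free reference image, thresholds the residual at a multiple of its standard deviation, and reports the centers of the surviving connected blobs as candidate defects. The threshold factor and minimum blob size are placeholder values.

```python
# Hedged sketch of reference-based defect candidate detection on a wafer image.
import numpy as np
from scipy import ndimage

def find_candidate_defects(wafer_img, reference_img, k_sigma=6.0, min_pixels=2):
    """Both images: 2-D arrays on the same grid and under matched illumination."""
    residual = np.abs(wafer_img.astype(float) - reference_img.astype(float))
    threshold = residual.mean() + k_sigma * residual.std()
    mask = residual > threshold
    labels, n_blobs = ndimage.label(mask)             # connected-component labelling
    sizes = ndimage.sum(mask, labels, index=range(1, n_blobs + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
    if not keep:
        return []                                     # no candidate defects found
    return ndimage.center_of_mass(residual, labels, keep)   # list of (row, col)
```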
Monitoring transients in low inductance circuits
Guilford, Richard P.; Rosborough, John R.
1987-01-01
A pair of flat cable transmission lines are monitored for transient current spikes by using a probe connected to a current transformer by a pickup loop and monitoring the output of the current transformer. The approach utilizes a U-shaped pickup probe wherein the pair of flat cable transmission lines are received between the legs of the U-shaped probe. The U-shaped probe is preferably formed of a flat coil conductor adhered to one side of a flexible substrate. On the other side of the flexible substrate there is a copper foil shield. The copper foil shield is connected to one end of the flat conductor coil and connected to one leg of the pickup loop which passes through the current transformer. The other end of the flat conductor coil is connected to the other leg of the pickup loop.
Baars, Maria A. E.; Nije Bijvank, Marije; Tonnaer, Geertje H.; Jolles, Jelle
2015-01-01
Recent studies in late adolescents (age 17+) show that brain development may proceed until around the 25th year of age. This implies that study performance in higher education could depend on the stage of brain maturation and neuropsychological development. Individual differences in the development of neuropsychological skills may thus have a substantial influence on the outcome of the educational process. This hypothesis was evaluated in a large survey of 1760 first-year students at a University of Applied Sciences, of whom 1332 are included in the current analyses because they fit within the pre-set age range (17–20 years old at the start of their studies). Student characteristics and three behavioral ratings of executive functioning (EF) were evaluated with regard to their influence on academic performance. Self-report measures were used: self-reported attention, planning, and self-control and self-monitoring. Results showed that students with better self-reported EF at the start of the first year of their studies obtained more study credits at the end of that year than students with a lower EF self-rating. The correlation between self-control and self-monitoring on the one hand, and study progress on the other, appeared to differ for male and female students and to be influenced by the level of prior education. The results of this large-scale study could have practical relevance. The profound individual differences between students may at least partly be a consequence of their stage of development as adolescents. Students who show lower levels of attention control, planning, and self-control/self-monitoring can be expected to have problems with study planning and study progress monitoring and hence with study progress. The findings imply that interventions directed at training these (executive) functions should be developed and used in higher education in order to improve academic achievement, learning attitude, and motivation. PMID:26300823
Carbon ions beam therapy monitoring with the INSIDE in-beam PET.
Pennazio, Francesco; Battistoni, Giuseppe; Bisogni, Maria Giuseppina; Camarlinghi, Niccolò; Ferrari, Alfredo; Ferrero, Veronica; Fiorina, Elisa; Morrocchi, Matteo; Sala, Paola R; Sportelli, Giancarlo; Wheadon, Richard; Cerello, Piergiorgio
2018-06-06
In-vivo range monitoring techniques are necessary in order to fully take advantage of the high dose gradients deliverable in hadrontherapy treatments. Positron Emission Tomography (PET) scanners can be used to monitor beam-induced activation in tissues and hence measure the range. The INSIDE (Innovative Solutions for In-beam DosimEtry in Hadrontherapy) in-beam PET scanner, installed at the Italian National Center of Oncological Hadrontherapy (CNAO, Pavia, Italy) synchrotron facility, has already been successfully tested in-vivo during a proton therapy treatment. We discuss here the system performance evaluation with carbon ion beams, in view of future in-vivo tests. The work is focused on the analysis of activity images obtained with therapeutic treatments delivered to polymethyl methacrylate (PMMA) phantoms, as well as on the test of an innovative and robust Monte Carlo simulation technique for the production of reliable prior activity maps. Images are reconstructed using different integration intervals, so as to monitor the activity evolution during and after the treatment. Three procedures to compare activity images are presented, namely Pearson Correlation Coefficient, Beam's Eye View and Overall View. Images of repeated irradiations of the same treatments are compared to assess the integration time necessary to provide reproducible images. The range agreement between simulated and experimental images is also evaluated, so as to validate the simulation capability to provide sound prior information. The results indicate that at treatment end, or at most 20 s afterwards, the range measurement is reliable within 1-2 mm, when comparing both different experimental sessions and data with simulations. In conclusion, this work shows that the INSIDE in-beam PET scanner performance is promising towards its in-vivo test with carbon ions. © 2018 Institute of Physics and Engineering in Medicine.
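As a rough illustration of the kinds of image comparison named above, the sketch below (Python; not the INSIDE collaboration's actual analysis code) computes a voxel-wise Pearson correlation coefficient between two reconstructed activity images and estimates a range proxy from a one-dimensional beam's-eye-view profile as the depth where the distal activity falls to a chosen fraction of its maximum. The 50% level and the linear interpolation are assumptions made for the example.

import numpy as np

def pearson_cc(img_a, img_b):
    # Voxel-wise Pearson correlation between two activity images.
    a = img_a.ravel().astype(float)
    b = img_b.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def distal_falloff_depth(profile, z, fraction=0.5):
    # Depth at which a 1D beam's-eye-view activity profile drops to a given
    # fraction of its maximum on the distal side (a simple range proxy).
    profile = np.asarray(profile, float)
    peak = int(profile.argmax())
    threshold = fraction * profile[peak]
    for i in range(peak, len(profile) - 1):
        if profile[i] >= threshold > profile[i + 1]:
            # linear interpolation between samples i and i+1
            t = (profile[i] - threshold) / (profile[i] - profile[i + 1])
            return z[i] + t * (z[i + 1] - z[i])
    return z[-1]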
Investigating the efficiency of IEEE 802.15.4 for medical monitoring applications.
Pelegris, P; Banitsas, K
2011-01-01
Recent advancements in wireless communications technologies bring us one step closer to providing reliable Telecare services as an alternative to patients staying in a hospital mainly for monitoring purposes. In this research we investigate the efficiency of IEEE 802.15.4 in a simple scenario where a patient is being monitored using an ECG and a blood analysis module. This approach fits well with assisted living solutions, sharing the network infrastructure for both monitoring and control while taking advantage of the low-power features of the protocol. Such applications are becoming increasingly realistic to implement as IEEE 802.15.4-compatible hardware becomes more widely available. Our aim is to examine the impact of Beacon Order and Superframe Order on the medium access delay, dropped packets, end-to-end delay, average retransmission attempts, and consumed power, focusing on this bandwidth-demanding situation where the network load does not allow low duty cycles, in order to draw conclusions on the effect this will have on telemonitoring applications.
Tao, Shiquan; Winstead, Christopher B.
2005-04-12
A monitor is provided for use in measuring the concentration of hexavalent chromium in a liquid, such as water. The monitor includes a sample cell, a light source, and a photodetector. The sample cell is in the form of a liquid-core waveguide, the sample cell defining an interior core and acting as a receiver for the liquid to be analyzed, the interior surface of the sample cell having a refractive index of less than 1.33. The light source is in communication with a first end of the sample cell for emitting radiation having a wavelength of between about 350 and 390 nm into the interior core of the waveguide. The photodetector is in communication with a second end of the waveguide for measuring the absorption of the radiation emitted by the light source by the liquid in the sample cell. The monitor may also include a processor electronically coupled to the photodetector for receipt of an absorption signal to determine the concentration of hexavalent chromium in the liquid.
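Converting the measured absorption into a concentration follows the usual Beer-Lambert relation; the sketch below is a minimal illustration, with a hypothetical path length and molar absorptivity (the source text does not specify these values).

import math

def absorbance(incident_intensity, transmitted_intensity):
    # A = log10(I0 / I)
    return math.log10(incident_intensity / transmitted_intensity)

def concentration_from_absorbance(A, molar_absorptivity, path_length_cm):
    # Beer-Lambert law: c = A / (epsilon * l), in mol/L
    return A / (molar_absorptivity * path_length_cm)

# Hypothetical example: a 50 cm liquid-core waveguide and an assumed effective
# molar absorptivity for the Cr(VI) absorption band near 370 nm.
A = absorbance(1.00, 0.82)
c = concentration_from_absorbance(A, molar_absorptivity=3000.0, path_length_cm=50.0)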
NASA Technical Reports Server (NTRS)
Olson, B. A.; Lee, H. C.; Osgerby, I. T.; Heck, R. M.; Hess, H.
1980-01-01
The durability of CATCOM catalysts and catalyst supports was experimentally demonstrated in a combustion environment under simulated gas turbine engine combustor operating conditions. A test of 1000 hours' duration was completed with one catalyst using no. 2 diesel fuel and operating at catalytically-supported thermal combustion conditions. The performance of the catalyst was determined by monitoring emissions throughout the test, and by examining the physical condition of the catalyst core at the conclusion of the test. Tests were performed periodically to determine changes in the catalytic activity of the catalyst core. Detailed parametric studies were also run at the beginning and end of the durability test, using no. 2 fuel oil. Initial and final emissions for the 1000-hour test were, respectively: unburned hydrocarbons (C3, vppm): 0 and 146; carbon monoxide (vppm): 30 and 2420; nitrogen oxides (vppm): 5.7 and 5.6.
The SupraThermal Ion Monitor for space weather predictions.
Allegrini, F; Desai, M I; Livi, S; McComas, D J; Ho, G C
2014-05-01
Measurement of suprathermal energy ions in the heliosphere has always been challenging because (1) these ions are situated in the energy regime only a few times higher than the solar wind plasma, where intensities are orders of magnitude higher, and (2) ion energies are below or close to the threshold of state-of-the-art solid-state detectors. Suprathermal ions accelerated at coronal mass ejection-driven shocks propagate out ahead of the shocks. These shocks can cause geomagnetic storms in the Earth's magnetosphere that can affect spacecraft and ground-based power and communication systems. An instrument with sufficient sensitivity to measure these ions can be used to predict the arrival of the shocks and provide an advance warning for potentially geo-effective space weather. In this paper, we present a novel energy analyzer concept, the Suprathermal Ion Monitor (STIM), that is designed to measure suprathermal ions with high sensitivity. We show results from a laboratory prototype and demonstrate the feasibility of the concept. Key performance figures are given, as well as a discussion of various possible detectors at the back end. STIM is an ideal candidate for a future space weather monitor in orbit upstream of the near-Earth environment, for example, around L1. A scaled-down version is suitable for a CubeSat mission. Such a platform would allow the concept to be proven and its performance demonstrated in the space environment.
NASA Astrophysics Data System (ADS)
Krotov, Eugene V.; Yakovlev, Ivan V.; Zhadobov, Maxim; Reyman, Alexander M.; Zharov, Vladimir P.
2002-06-01
This work presents the results of an experimental study of the applicability of acoustical brightness thermometry (ABT) for monitoring internal temperature during laser hyperthermia and interstitial therapy. In these experiments, the radiation of a pulse-repetition Nd:YAG laser (1064 nm) and of a continuous diode laser (800 nm) was used as the heating source. Experiments were performed in vitro by inserting an optical fiber inside the objects - optically transparent gelatin with incorporated light-absorbing heterogeneities and samples of biological tissues (e.g. liver). During laser heating, the internal temperature in the absorbing heterogeneity and at the fiber end was monitored by means of multi-channel ABT. Independent temperature control was performed with a tiny electronic thermometer incorporated in the heated zones. The results of the experiments demonstrated reasonable sensitivity and accuracy of ABT for real-time temperature control during different kinds of laser thermal therapy. According to preliminary data, ABT allows temperature to be measured at depths of up to 3-5 cm (depending on tissue properties) with a spatial resolution of a few millimeters. The obtained data show that ABT is a very promising tool for providing a quantitative measure of different types of energy deposition (laser, microwave, focused ultrasound, etc.) at the depths commonly encountered in tumors of vital organs. In addition, ABT could give information about diffusion effects in heated zones or about optical absorption. This work was supported by the Russian Foundation for Basic Research and the 6th competition-expertise of young scientists of the Russian Academy of Sciences.
Coolant monitoring apparatus for nuclear reactors
Tokarz, Richard D.
1983-01-01
A system for monitoring coolant conditions within a pressurized vessel. A length of tubing extends outward from the vessel from an open end containing a first line restriction at the location to be monitored. The flowing fluid is cooled and condensed before passing through a second line restriction. Measurement of pressure drop at the second line restriction gives an indication of fluid condition at the first line restriction. Multiple lengths of tubing with open ends at incremental elevations can measure coolant level within the vessel.
Informing Drought Preparedness and Response with the South Asia Land Data Assimilation System
NASA Astrophysics Data System (ADS)
Zaitchik, B. F.; Ghatak, D.; Matin, M. A.; Qamer, F. M.; Adhikary, B.; Bajracharya, B.; Nelson, J.; Pulla, S. T.; Ellenburg, W. L.
2017-12-01
Decision-relevant drought monitoring in South Asia is a challenge from both a scientific and an institutional perspective. Scientifically, climatic diversity, inconsistent in situ monitoring, complex hydrology, and incomplete knowledge of atmospheric processes mean that monitoring and prediction are fraught with uncertainty. Institutionally, drought monitoring efforts need to align with the information needs and decision-making processes of relevant agencies at national and subnational levels. Here we present first results from an emerging operational drought monitoring and forecast system developed and supported by the NASA SERVIR Hindu-Kush Himalaya hub. The system has been designed in consultation with end users from multiple sectors in South Asian countries to maximize decision-relevant information content in the monitoring and forecast products. Monitoring of meteorological, agricultural, and hydrological drought is accomplished using the South Asia Land Data Assimilation System, a platform that supports multiple land surface models and meteorological forcing datasets to characterize uncertainty, and subseasonal to seasonal hydrological forecasts are produced by driving South Asia LDAS with downscaled meteorological fields drawn from an ensemble of global dynamically-based forecast systems. Results are disseminated to end users through a Tethys online visualization platform and custom communications that provide user oriented, easily accessible, timely, and decision-relevant scientific information.
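One common way to turn land-surface-model output into a drought indicator is to rank the current state against a climatology for the same location and season; the sketch below shows such a percentile indicator and its evaluation across an ensemble of models or forcing datasets to characterize uncertainty. This is a generic illustration, not the specific indicator set used in the South Asia LDAS products.

import numpy as np

def soil_moisture_percentile(current, climatology):
    # Rank the current soil moisture value against a historical climatology for
    # the same location and time of year; low percentiles indicate drought.
    climatology = np.asarray(climatology, float)
    return 100.0 * np.mean(climatology <= current)

def ensemble_percentiles(current_by_member, climatology_by_member):
    # Characterize uncertainty by computing the indicator per ensemble member
    # (e.g., per land surface model or forcing dataset).
    return [soil_moisture_percentile(c, h)
            for c, h in zip(current_by_member, climatology_by_member)]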
Stabile, Luca; Cauda, Emanuele; Marini, Sara; Buonanno, Giorgio
2014-08-01
Adverse health effects caused by worker exposure to ultrafine particles have been detected in recent years. The scientific community focuses on the assessment of ultrafine aerosols in different microenvironments in order to determine the related worker exposure/dose levels. To this end, particle size distribution measurements have to be taken along with total particle number concentrations. The latter are obtainable through hand-held monitors. A portable particle size distribution analyzer (Nanoscan SMPS 3910, TSI Inc.) was recently commercialized, but so far no metrological assessment has been performed to characterize its performance with respect to well-established laboratory-based instruments such as the scanning mobility particle sizer (SMPS) spectrometer. The present paper compares the aerosol monitoring capability of the Nanoscan SMPS to the laboratory SMPS in order to evaluate whether the Nanoscan SMPS is suitable for field experiments designed to characterize particle exposure in different microenvironments. Tests were performed both in a Marple calm air chamber, where fresh diesel particulate matter and atomized dioctyl phthalate particles were monitored, and in microenvironments, where outdoor, urban, indoor aged, and indoor fresh aerosols were measured. Results show that the Nanoscan SMPS is able to properly measure the particle size distribution for each type of aerosol investigated, but it overestimates the total particle number concentration in the case of fresh aerosols. In particular, the test performed in the Marple chamber showed total concentrations up to twice those measured by the laboratory SMPS, likely because of the inability of the Nanoscan SMPS unipolar charger to properly charge aerosols made up of aggregated particles. Based on these findings, when field test exposure studies are conducted, the Nanoscan SMPS should be used in tandem with a condensation particle counter in order to verify and correct the particle size distribution data.
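One simple way to apply the recommended CPC cross-check is to rescale the binned Nanoscan number size distribution so that its integral matches the co-located CPC total. The sketch below shows that correction; it is an illustrative assumption, not a procedure prescribed by the authors.

import numpy as np

def rescale_to_cpc(dN_bins, cpc_total):
    # Scale a binned particle number size distribution so that its integrated
    # concentration matches a co-located CPC reading.
    # dN_bins   : number concentration per size bin (e.g., #/cm^3)
    # cpc_total : total number concentration from the CPC (#/cm^3)
    dN_bins = np.asarray(dN_bins, float)
    measured_total = dN_bins.sum()
    if measured_total <= 0:
        return dN_bins
    return dN_bins * (cpc_total / measured_total)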
Using eddy currents for noninvasive in vivo pH monitoring for bone tissue engineering.
Beck-Broichsitter, Benedicta E; Daschner, Frank; Christofzik, David W; Knöchel, Reinhard; Wiltfang, Jörg; Becker, Stephan T
2015-03-01
The metabolic processes that regulate bone healing and bone induction in tissue engineering models are not fully understood. Eddy current excitation is widely used in technical approaches and in the food industry. The aim of this study was to establish eddy current excitation for monitoring metabolic processes during heterotopic osteoinduction in vivo. Hydroxyapatite scaffolds were implanted into the musculus latissimus dorsi of six rats. Bone morphogenetic protein 2 (BMP-2) was applied 1 and 2 weeks after implantation. Weekly eddy current excitation measurements were performed. Additionally, invasive pH measurements were obtained from the scaffolds using fiber optic detection devices. Correlations between the eddy current measurements and the metabolic values were calculated. The eddy current measurements and pH values decreased significantly in the first 2 weeks of the study, followed by a steady increase and stabilization at higher levels towards the end of the study. The measurement curves and statistical evaluations indicated a significant correlation between the resonance frequency values of the eddy current excitation measurements and the observed pH levels (p = 0.0041). This innovative technique was capable of noninvasively monitoring metabolic processes in living tissues according to pH values, showing a direct correlation between eddy current excitation and pH in an in vivo tissue engineering model.
SBIR Phase II Final Report: Low cost Autonomous NMR and Multi-sensor Soil Monitoring Instrument
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walsh, David O.
In this 32-month SBIR Phase 2 program, Vista Clara designed, assembled, and successfully tested four new NMR instruments for soil moisture measurement and monitoring: an enhanced-performance man-portable Dart NMR logging probe and control unit for rapid, mobile measurement in core holes and 2" PVC access wells; a prototype 4-level Dart NMR monitoring probe and prototype multi-sensor soil monitoring control unit for long-term unattended monitoring of soil moisture and other measurements in situ; a non-invasive 1 m x 1 m Discus NMR soil moisture sensor with a surface-based magnet/coil array for rapid measurement of soil moisture in the top 50 cm of the subsurface; and a non-invasive, ultra-lightweight Earth's field surface NMR instrument for non-invasive measurement and mapping of soil moisture in the top 3 meters of the subsurface. The Phase 2 research and development achieved most, but not all, of our technical objectives. The single-coil Dart in-situ sensor and control unit were fully developed, demonstrated, and successfully commercialized within the Phase 2 period of performance. The multi-level version of the Dart probe was designed, assembled, and demonstrated in Phase 2, but its final assembly and testing were delayed until close to the end of the Phase 2 performance period, which limited our opportunities for demonstration in field settings. Likewise, the multi-sensor version of the Dart control unit was designed and assembled, but not in time for it to be deployed for any long-term monitoring demonstrations. The prototype ultra-lightweight surface NMR instrument was developed and demonstrated, and this result will be carried forward into the development of a new flexible surface NMR instrument and commercial product in 2018.
Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2005
Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Estes, Steve; McNutt, Stephen R.
2006-01-01
The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988 (Figure 1). The primary objectives of the seismic program are the real-time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents calculated earthquake hypocenters and seismic phase arrival data, and details changes in the seismic monitoring program for the period January 1 through December 31, 2005. The AVO seismograph network was used to monitor the seismic activity at thirty-two volcanoes within Alaska in 2005 (Figure 1). The network was augmented by two new subnetworks to monitor the Semisopochnoi Island volcanoes and Little Sitkin Volcano. Seismicity at these volcanoes was still being studied at the end of 2005 and has not yet been added to the list of permanently monitored volcanoes in the AVO weekly update. Following an extended period of monitoring to determine the background seismicity at Mount Peulik, Ukinrek Maars, and Korovin Volcano, formal monitoring of these volcanoes began in 2005. AVO located 9,012 earthquakes in 2005. Monitoring highlights in 2005 include: (1) seismicity at Mount Spurr remaining above background, starting in February 2004, through the end of the year and into 2006; (2) an increase in seismicity at Augustine Volcano starting in May 2005, and continuing through the end of the year into 2006; (3) volcanic tremor and seismicity related to low-level strombolian activity at Mount Veniaminof in January to March and September; and (4) a seismic swarm at Tanaga Volcano in October and November. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field in 2005; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2005; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2005.
Study to design and develop remote manipulator system
NASA Technical Reports Server (NTRS)
Hill, J. W.; Sword, A. J.
1973-01-01
Human performance measurement techniques for remote manipulation tasks and remote sensing techniques for manipulators are described. For common manipulation tasks, performance is monitored by means of an on-line computer capable of measuring the joint angles of both master and slave arms as a function of time. The computer programs allow measurement of the operator's strategy and of physical quantities such as task time and power consumed. The results are printed out after a test run so that different experimental conditions can be compared. For tracking tasks, we describe a method of displaying errors in three dimensions and measuring the end-effector position in three dimensions.
The SKA1 LOW telescope: system architecture and design performance
NASA Astrophysics Data System (ADS)
Waterson, Mark F.; Labate, Maria Grazia; Schnetler, Hermine; Wagg, Jeff; Turner, Wallace; Dewdney, Peter
2016-07-01
The SKA1-LOW radio telescope will be a low-frequency (50-350 MHz) aperture array located in Western Australia. Its scientific objectives will prioritize studies of the Epoch of Reionization and pulsar physics. Development of the telescope has been allocated to consortia responsible for the aperture array front end, timing distribution, signal and data transport, correlation and beamforming signal processors, infrastructure, monitor and control systems, and science data processing. This paper will describe the system architectural design and key performance parameters of the telescope and summarize the high-level sub-system designs of the consortia.
A Modular IoT Platform for Real-Time Indoor Air Quality Monitoring.
Benammar, Mohieddine; Abdaoui, Abderrazak; Ahmad, Sabbir H M; Touati, Farid; Kadri, Abdullah
2018-02-14
The impact of air quality on health and on life comfort is well established. In many societies, vulnerable elderly and young populations spend most of their time indoors. Therefore, indoor air quality monitoring (IAQM) is of great importance to human health. Engineers and researchers are increasingly focusing their efforts on the design of real-time IAQM systems using wireless sensor networks. This paper presents an end-to-end IAQM system enabling measurement of CO₂, CO, SO₂, NO₂, O₃, Cl₂, ambient temperature, and relative humidity. In IAQM systems, remote users usually use a local gateway to connect wireless sensor nodes in a given monitoring site to the external world for ubiquitous access of data. In this work, the role of the gateway in processing collected air quality data and its reliable dissemination to end-users through a web-server is emphasized. A mechanism for the backup and the restoration of the collected data in the case of Internet outage is presented. The system is adapted to an open-source Internet-of-Things (IoT) web-server platform, called Emoncms, for live monitoring and long-term storage of the collected IAQM data. A modular IAQM architecture is adopted, which results in a smart scalable system that allows seamless integration of various sensing technologies, wireless sensor networks (WSNs) and smart mobile standards. The paper gives full hardware and software details of the proposed solution. Sample IAQM results collected in various locations are also presented to demonstrate the abilities of the system.
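A minimal sketch of the gateway-side behavior described above (buffering readings during an Internet outage and restoring them afterwards) is given below. The URL and query parameters are placeholders and should be replaced with the actual Emoncms input API endpoint and key; only Python standard-library calls are used.

import json
import collections
from urllib import request, parse, error

PENDING = collections.deque()  # readings kept locally while the uplink is down

def _send(reading, url, apikey):
    # One HTTP GET to the web server; returns True on success.
    # The endpoint and parameter names are placeholders, not the exact Emoncms API.
    query = parse.urlencode({"json": json.dumps(reading), "apikey": apikey})
    try:
        with request.urlopen(url + "?" + query, timeout=5) as resp:
            resp.read()
        return True
    except (error.URLError, OSError):
        return False

def post_reading(reading, url="http://gateway.local/emoncms/input/post",
                 apikey="HYPOTHETICAL_KEY"):
    # Push one air-quality reading; on failure, keep it for later restoration.
    if not _send(reading, url, apikey):
        PENDING.append(reading)

def flush_backlog(url="http://gateway.local/emoncms/input/post",
                  apikey="HYPOTHETICAL_KEY"):
    # Replay buffered readings once connectivity returns, oldest first.
    while PENDING:
        if _send(PENDING[0], url, apikey):
            PENDING.popleft()
        else:
            break  # still offline; try again later

# Example reading from a sensor node (values are illustrative only):
post_reading({"node": "lab-1", "CO2_ppm": 640, "NO2_ppb": 12, "temp_C": 23.4})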
Hagmann, Henning; Oelmann, Katrin; Stangl, Robert; Michels, Guido
2016-12-20
The phenomenon of autoresuscitation is rare, yet it is known to most emergency physicians. However, the pathophysiology of the delayed return of spontaneous circulation remains enigmatic. Among other causes hyperinflation of the lungs and excessively high positive end-expiratory pressure have been suggested, but reports including cardiopulmonary monitoring during cardiopulmonary resuscitation are scarce to support this hypothesis. We report a case of autoresuscitation in a 44-year-old white man after 80 minutes of advanced cardiac life support accompanied by continuous capnometry and repeated evaluation by ultrasound and echocardiography. After prolonged cardiopulmonary resuscitation, refractory electromechanical dissociation on electrocardiogram and ventricular akinesis were recorded. In addition, a precipitous drop in end-tidal partial pressure of carbon dioxide was noted and cardiopulmonary resuscitation was discontinued. Five minutes after withdrawal of all supportive measures his breathing resumed and a perfusing rhythm ensued. Understanding the underlying pathophysiology of autoresuscitation is hampered by a lack of reports including extensive cardiopulmonary monitoring during cardiopulmonary resuscitation in a preclinical setting. In this case, continuous capnometry was combined with repetitive ultrasound evaluation, which ruled out most assumed causes of autoresuscitation. Our observation of a rapid decline in end-tidal partial pressure of carbon dioxide supports the hypothesis of increased intrathoracic pressure. Continuous capnometry can be performed easily during cardiopulmonary resuscitation, also in a preclinical setting. Knowledge of the pathophysiologic mechanisms may lead to facile interventions to be incorporated into cardiopulmonary resuscitation algorithms. A drop in end-tidal partial pressure of carbon dioxide, for example, might prompt disconnection of the ventilation to allow left ventricular filling. Further reports and research on this topic are encouraged.
NASA Astrophysics Data System (ADS)
Kramer, J. L. A. M.; Ullings, A. H.; Vis, R. D.
1993-05-01
A real-time data acquisition system for microprobe analysis has been developed at the Free University of Amsterdam. The system is composed of two parts: a front-end real-time system and a back-end monitoring system. The front-end consists of a VMEbus-based system which reads out a CAMAC crate. The back-end is implemented on a Sun workstation running the UNIX operating system. This separation allows the integration of a minimal, and consequently very fast, real-time executive with the sophisticated capabilities of advanced UNIX workstations.
Calibration and performance of the ATLAS Tile Calorimeter during the LHC Run 2
NASA Astrophysics Data System (ADS)
Cerda Alberich, L.
2018-02-01
The Tile Calorimeter (TileCal) is the hadronic sampling calorimeter of the ATLAS experiment at the Large Hadron Collider (LHC). TileCal uses iron absorbers and scintillators as active material and covers the central region |η| < 1.7. Jointly with the other sub-detectors, it is designed for measurements of hadrons, jets, tau-particles, and missing transverse energy. It also assists in muon identification. TileCal is regularly monitored and calibrated by several different calibration systems: a Cs radioactive source, a laser light system to check the PMT response, and a charge injection system (CIS) to check the front-end electronics. These calibration systems, in conjunction with data collected during proton-proton collisions (Minimum Bias (MB) events), provide extensive monitoring of the instrument and a means for equalizing the calorimeter response at each stage of the signal propagation. The performance of the calorimeter has been established with cosmic-ray muons and the large sample of proton-proton collisions, and compared to Monte Carlo (MC) simulations. The response to high-momentum isolated muons is also used to study the energy response at the electromagnetic scale, while isolated hadrons are used as a probe of the hadronic response. The calorimeter time resolution is studied with multijet events. A description of the different TileCal calibration systems and the results on the calorimeter performance during LHC Run 2 are presented. The results on the pile-up noise and response uniformity studies are also discussed.
40 CFR 63.2161 - What performance tests and other procedures must I use if I monitor brew ethanol?
Code of Federal Regulations, 2013 CFR
2013-07-01
... procedures must I use if I monitor brew ethanol? 63.2161 Section 63.2161 Protection of Environment... performance tests and other procedures must I use if I monitor brew ethanol? (a) You must conduct each... performance test simultaneously with brew ethanol monitoring to establish a brew-to-exhaust correlation...
40 CFR 63.2161 - What performance tests and other procedures must I use if I monitor brew ethanol?
Code of Federal Regulations, 2012 CFR
2012-07-01
... procedures must I use if I monitor brew ethanol? 63.2161 Section 63.2161 Protection of Environment... performance tests and other procedures must I use if I monitor brew ethanol? (a) You must conduct each... performance test simultaneously with brew ethanol monitoring to establish a brew-to-exhaust correlation...
40 CFR 63.2161 - What performance tests and other procedures must I use if I monitor brew ethanol?
Code of Federal Regulations, 2014 CFR
2014-07-01
... procedures must I use if I monitor brew ethanol? 63.2161 Section 63.2161 Protection of Environment... performance tests and other procedures must I use if I monitor brew ethanol? (a) You must conduct each... performance test simultaneously with brew ethanol monitoring to establish a brew-to-exhaust correlation...
SRTR center-specific reporting tools: Posttransplant outcomes.
Dickinson, D M; Shearon, T H; O'Keefe, J; Wong, H-H; Berg, C L; Rosendale, J D; Delmonico, F L; Webb, R L; Wolfe, R A
2006-01-01
Measuring and monitoring performance--be it waiting list and posttransplant outcomes by a transplant center, or organ donation success by an organ procurement organization and its partnering hospitals--is an important component of ensuring good care for people with end-stage organ failure. Many parties have an interest in examining these outcomes, from patients and their families to payers such as insurance companies or the Centers for Medicare and Medicaid Services; from primary caregivers providing patient counseling to government agencies charged with protecting patients. The Scientific Registry of Transplant Recipients produces regular, public reports on the performance of transplant centers and organ procurement organizations. This article explains the statistical tools used to prepare these reports, with a focus on graft survival and patient survival rates of transplant centers--especially the methods used to fairly and usefully compare outcomes of centers that serve different populations. The article concludes with a practical application of these statistics--their use in screening transplant center performance to identify centers that may need remedial action by the OPTN/UNOS Membership and Professional Standards Committee.
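As a rough illustration of the risk-adjusted comparisons described above, the sketch below computes an observed-to-expected event ratio for a center from patient-level outcomes and model-predicted probabilities. This is a simplified illustration of the general idea, not the SRTR's actual statistical methodology.

def observed_expected_ratio(events, expected_probs):
    # Compare a center's observed event count (e.g., graft failures within one
    # year) to the count expected from patient-level risk predictions derived
    # from a national model. Ratios above 1 suggest worse-than-expected
    # outcomes. Illustrative only; SRTR's published methods are more elaborate.
    observed = sum(events)            # 0/1 outcome per transplant
    expected = sum(expected_probs)    # model probability per transplant
    return observed / expected if expected > 0 else float("nan")

# Hypothetical example: three transplants, one observed failure.
ratio = observed_expected_ratio([0, 1, 0], [0.10, 0.25, 0.15])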
MOLAR: Modular Linux and Adaptive Runtime Support for HEC OS/R Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank Mueller
2009-02-05
MOLAR is a multi-institution research effort that concentrates on adaptive, reliable, and efficient operating and runtime system solutions for ultra-scale high-end scientific computing on the next generation of supercomputers. This research addresses the challenges outlined by the FAST-OS (forum to address scalable technology for runtime and operating systems) and HECRTF (high-end computing revitalization task force) activities by providing a modular Linux and adaptable runtime support for high-end computing operating and runtime systems. The MOLAR research has the following goals to address these issues. (1) Create a modular and configurable Linux system that allows customized changes based on the requirements of the applications, runtime systems, and cluster management software. (2) Build runtime systems that leverage the OS modularity and configurability to improve efficiency, reliability, scalability, and ease of use, and to provide support for legacy and promising programming models. (3) Advance computer reliability, availability and serviceability (RAS) management systems to work cooperatively with the OS/R to identify and preemptively resolve system issues. (4) Explore the use of advanced monitoring and adaptation to improve application performance and the predictability of system interruptions. The overall goal of the research conducted at NCSU is to develop scalable algorithms for high availability without single points of failure and without single points of control.
Tsukamoto, S; Hoshino, H; Tamura, T
2008-01-01
This paper describes an indoor behavioral monitoring system for improving the quality of life in ordinary houses. It employs a device that uses weak radio waves for transmitting the obtained data, and it is designed such that it can be installed by a user without requiring any technical knowledge or additional construction work. This study focuses on determining the usage statistics of home electric appliances by using an electromagnetic field sensor as a detection device. The usage of the home appliances is determined by measuring the electromagnetic field that can be observed in an area near the appliance. It is assumed that these usage statistics could provide information regarding the indoor behavior of a subject. Since the sensor is not direction-sensitive and does not require precise positioning and wiring, it can be easily installed in ordinary houses by the end users. For evaluating the practicability of the sensor unit, several simple tests have been performed. The results indicate that the proposed system could be useful for collecting the usage statistics of home appliances.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drotar, Alexander P.; Quinn, Erin E.; Sutherland, Landon D.
2012-07-30
The project description is: (1) build a high-performance computer; and (2) create a tool to monitor node applications in the Component Based Tool Framework (CBTF) using code from the Lightweight Data Metric Service (LDMS). The importance of this project is that: (1) there is a need for a scalable, parallel tool to monitor nodes on clusters; and (2) new LDMS plugins need to be easily added to the tool. CBTF stands for Component Based Tool Framework. It is scalable and adjusts to different topologies automatically. It uses the MRNet (Multicast/Reduction Network) mechanism for information transport. CBTF is flexible and general enough to be used for any tool that needs to perform a task on many nodes. Its components are reusable and easily added to a new tool. There are three levels in CBTF: (1) the frontend node, which interacts with users; (2) filter nodes, which filter or concatenate information from backend nodes; and (3) backend nodes, where the actual work of the tool is done. LDMS stands for Lightweight Data Metric Service. It is a tool used for monitoring nodes. Ltool is the name of the tool we derived from LDMS. It is dynamically linked and includes the following components: Vmstat, Meminfo, Procinterrupts, and more. It works as follows: the Ltool command is run on the frontend node; Ltool collects information from the backend nodes; the backend nodes send information to the filter nodes; and the filter nodes concatenate the information and send it to a database on the frontend node. Ltool is useful for monitoring nodes on a cluster because the overhead involved in running the tool is not particularly high and it automatically scales to any cluster size.
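For illustration only, the sketch below shows a backend-node metric sample of the Meminfo flavor that such a tool might collect before forwarding it up the filter tree; the function names and the sample format are assumptions and are not part of the CBTF or LDMS APIs.

import socket
import time

def read_meminfo(path="/proc/meminfo"):
    # Parse /proc/meminfo into a {metric: kB} dictionary (Linux only).
    metrics = {}
    with open(path) as f:
        for line in f:
            key, value = line.split(":", 1)
            metrics[key.strip()] = int(value.strip().split()[0])
    return metrics

def sample():
    # One backend-node sample, tagged with host and timestamp, ready to be
    # forwarded up the tool's tree (the transport layer is not shown here).
    info = read_meminfo()
    return {
        "host": socket.gethostname(),
        "time": time.time(),
        "MemFree_kB": info.get("MemFree"),
        "MemTotal_kB": info.get("MemTotal"),
    }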
Tiong, Ho Yee; Goh, Benjamin Yen Seow; Chiong, Edmund; Tan, Lincoln Guan Lim; Vathsala, Anatharaman
2018-03-31
Robotic-assisted kidney transplantation (RKT) with the Da Vinci (Intuitive, USA) platform has been recently developed to improve outcomes by decreasing surgical site complications and morbidity, especially in obese patients. This potential paradigm shift in the surgical technique of kidney transplantation is performed in only a few centers. For wider adoption of this high-stakes, complex operation, we aimed to develop a procedure-specific simulation platform in a porcine model for training robotic intracorporeal vascular anastomosis and evaluating vascular anastomosis patency. This paper describes the requirements and steps developed for the above training purpose. Over a series of four animal-ethics-approved experiments, the technique of robotic-assisted laparoscopic autotransplantation of the kidney was developed in Amsterdam live pigs (60-70 kg). The surgery was based around the vascular anastomosis technique described by Menon et al. This non-survival porcine training model is targeted at transplant surgeons with robotic surgery experience. Under general anesthesia, each pig was placed in the lateral decubitus position with the placement of one robotic camera port, two robotic 8 mm ports, and one assistant port. Robotic docking over the pig posteriorly was performed. The training platform involved the following procedural steps. First, ipsilateral iliac vessel dissection was performed. Second, robotic-assisted laparoscopic donor nephrectomy was performed with in situ perfusion of the kidney with cold Hartmann's solution prior to complete division of the hilar vessels and ureter and mobilization of the kidney. Third, the kidney was either kept in situ for orthotopic autotransplantation or mobilized to the pelvis and orientated for the vascular anastomosis, which was performed end-to-end or end-to-side, respectively, after vessel-loop clamping of the iliac vessels, using 6/0 Gore-Tex sutures. Following autotransplantation and release of the vessel loops, perfusion of the graft was assessed using intraoperative indocyanine green imaging and by monitoring urine output after unclamping. This training platform demonstrates adequate face and content validity. With practice, arterial anastomotic time could be improved, showing its construct validity. This porcine training model can be useful in providing training for robotic intracorporeal vascular anastomosis and may facilitate confident translation to a human transplant recipient.
An acoustic sensor for monitoring airflow in pediatric tracheostomy patients.
Ruscher, Thomas; Wicks Phd, Alexandrina; Muelenaer Md, Andre
2012-01-01
Without proper monitoring, patients with artificial airways in the trachea are at high risk for complications or death. Despite routine maintenance of the tube, dislodgement or copious mucus can obstruct the airway. Young children (<3 yrs) have difficulty tending to their own tubes and are particularly vulnerable to blockages. They require external respiratory sensors. In a hospital environment, ventilators, end-tidal CO2 monitors, thermistors, and other auxiliary equipment provide sufficient monitoring of respiration. However, outpatient monitoring methods, such as thoracic impedance and pulse oximetry, are indirect and prone to false positives. Desensitization of caregivers to frequent false alarms has been cited in the medical literature as a contributing factor in cases of child death. Ultrasonic time-of-flight (TOF) is a technique used in specialized industrial applications to non-invasively measure liquid and gas flow. Two transducers are oriented at a diagonal across a flow channel. Velocity measurement is accomplished by detecting slight variations in the transit time of contra-propagating acoustic signals with a directional component parallel to the air flow. Due to the symmetry of the acoustic pathway between the sensors, velocity measurements are immune to partial fouling in the tube from mucus, saliva, and condensation. A first-generation proof-of-concept prototype was constructed to evaluate the ultrasonic TOF technique for medical tracheostomy monitoring. After successful performance, a second-generation prototype was designed with a smaller form factor and more advanced electronics. This prototype was tested and found to measure inspired volume with a root-mean-square error < 2% during initial trials.
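The contra-propagating transit-time measurement reduces to a standard relation that is independent of the speed of sound: with t = L / (c ± v·cosθ) for the downstream and upstream directions, the axial velocity is v = L/(2·cosθ)·(1/t_down − 1/t_up). The sketch below implements this relation; the path length, angle, and transit times in the example are illustrative values, not the prototype's actual geometry.

import math

def flow_velocity(t_down, t_up, path_length, angle_deg):
    # Axial flow velocity from contra-propagating transit times.
    # t_down      : transit time with the flow (s)
    # t_up        : transit time against the flow (s)
    # path_length : acoustic path length between transducers (m)
    # angle_deg   : angle between acoustic path and flow axis (degrees)
    # Derived from t = L / (c ± v*cos(theta)); the sound speed c cancels out.
    cos_theta = math.cos(math.radians(angle_deg))
    return path_length / (2.0 * cos_theta) * (1.0 / t_down - 1.0 / t_up)

# Illustrative numbers only: a 15 mm path at 45 degrees, ~1.7 m/s air flow.
v = flow_velocity(t_down=43.6e-6, t_up=43.9e-6, path_length=0.015, angle_deg=45.0)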
Yang, Gordon C C; Yen, Chia-Heng; Wang, Chih-Lung
2014-07-30
This study monitored the occurrence and removal efficiencies of 8 phthalate esters (PAEs) and 13 pharmaceuticals present in the drinking water of Kaohsiung City, Taiwan. The simultaneous electrocoagulation and electrofiltration (EC/EF) process was used to remove the contaminants. To this end, a monitoring program was conducted and a novel laboratory-prepared tubular carbon nanofiber/carbon/alumina composite membrane (TCCACM) was incorporated into the EC/EF treatment module (collectively designated as "TCCACM-EC/EF treatment module") to remove the abovementioned compounds from water samples. The monitoring results showed that the concentrations of PAEs were lower in water samples from drinking fountains as compared with tap water samples. No significant differences were found between the concentrations of pharmaceuticals in the two types of water samples. Under optimal operating conditions, the TCCACM-EC/EF treatment module yielded the lowest residual concentrations, ranging from not detected (ND) to 52ng/L for PAEs and pharmaceuticals of concern in the tap water samples. Moreover, the performance of the TCCACM-EC/EF treatment module is comparable with a series of treatment units employed for the drinking fountain water treatment system. The relevant removal mechanisms involved in the TCCACM-EC/EF treatment module were also discussed in this work. Copyright © 2014 Elsevier B.V. All rights reserved.
Development of structural health monitoring techniques using dynamics testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, G.H. III
Today's society depends upon many structures (such as aircraft, bridges, wind turbines, offshore platforms, buildings, and nuclear weapons) which are nearing the end of their design lifetime. Since these structures cannot be economically replaced, techniques for structural health monitoring must be developed and implemented. Modal and structural dynamics measurements hold promise for the global non-destructive inspection of a variety of structures since surface measurements of a vibrating structure can provide information about the health of the internal members without costly (or impossible) dismantling of the structure. In order to develop structural health monitoring for application to operational structures, developments in four areas have been undertaken within this project: operational evaluation, diagnostic measurements, information condensation, and damage identification. The developments in each of these four aspects of structural health monitoring have been exercised on a broad range of experimental data. This experimental data has been extracted from structures from several application areas which include aging aircraft, wind energy, aging bridges, offshore structures, structural supports, and mechanical parts. As a result of these advances, Sandia National Laboratories is in a position to perform further advanced development, operational implementation, and technical consulting for a broad class of the nation's aging infrastructure problems.
Kivlehan, Francine; Mavré, François; Talini, Luc; Limoges, Benoît; Marchal, Damien
2011-09-21
We described an electrochemical method to monitor in real-time the isothermal helicase-dependent amplification of nucleic acids. The principle of detection is simple and well-adapted to the development of portable, easy-to-use and inexpensive nucleic acids detection technologies. It consists of monitoring a decrease in the electrochemical current response of a reporter DNA intercalating redox probe during the isothermal DNA amplification. The method offers the possibility to quantitatively analyze target nucleic acids in less than one hour at a single constant temperature, and to perform at the end of the isothermal amplification a DNA melt curve analysis for differentiating between specific and non-specific amplifications. To illustrate the potentialities of this approach for the development of a simple, robust and low-cost instrument with high throughput capability, the method was validated with an electrochemical system capable of monitoring up to 48 real-time isothermal HDA reactions simultaneously in a disposable microplate consisting of 48-electrochemical microwells. Results obtained with this approach are comparable to that obtained with a well-established but more sophisticated and expensive fluorescence-based method. This makes for a promising alternative detection method not only for real-time isothermal helicase-dependent amplification of nucleic acid, but also for other isothermal DNA amplification strategies.
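Quantification in this kind of real-time readout is often done by finding the time at which the normalized signal crosses a threshold, analogous to a threshold cycle in real-time PCR; the sketch below shows such a time-to-threshold calculation on the decaying current trace. The 20% drop criterion is an assumption for illustration, not the analysis used by the authors.

import numpy as np

def time_to_threshold(t, current, drop_fraction=0.2):
    # Return the reaction time at which the redox-probe current has fallen by a
    # given fraction of its initial value (an illustrative analogue of a qPCR
    # threshold cycle, not necessarily the authors' analysis).
    current = np.asarray(current, float)
    threshold = (1.0 - drop_fraction) * current[0]
    below = np.nonzero(current <= threshold)[0]
    return t[below[0]] if below.size else None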
Code of Federal Regulations, 2011 CFR
2011-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Ten Eyck, Raymond P; Tews, Matthew; Ballester, John M; Hamilton, Glenn C
2010-06-01
To determine the impact of simulation-based instruction on student performance in the role of emergency department resuscitation team leader. A randomized, single-blinded, controlled study using an intention to treat analysis. Eighty-three fourth-year medical students enrolled in an emergency medicine clerkship were randomly allocated to two groups differing only by instructional format. Each student individually completed an initial simulation case, followed by a standardized curriculum of eight cases in either group simulation or case-based group discussion format before a second individual simulation case. A remote coinvestigator measured eight objective performance end points using digital recordings of all individual simulation cases. McNemar chi2, Pearson correlation, repeated measures multivariate analysis of variance, and follow-up analysis of variance were used for statistical evaluation. Sixty-eight students (82%) completed both initial and follow-up individual simulations. Eight students were lost from the simulation group and seven from the discussion group. The mean postintervention case performance was significantly better for the students allocated to simulation instruction compared with the group discussion students for four outcomes including a decrease in mean time to (1) order an intravenous line; (2) initiate cardiac monitoring; (3) order initial laboratory tests; and (4) initiate blood pressure monitoring. Paired comparisons of each student's initial and follow-up simulations demonstrated significant improvement in the same four areas, in mean time to order an abdominal radiograph and in obtaining an allergy history. A single simulation-based teaching session significantly improved student performance as a team leader. Additional simulation sessions provided further improvement compared with instruction provided in case-based group discussion format.
Ferreira, J; Seoane, F; Lindecrantz, K
2013-01-01
Personalised Health Systems (PHS) are emerging that could improve patients' quality of life and reduce health care costs for society, among other benefits. The purpose of this paper is to study the capabilities of the System-on-Chip Impedance Network Analyser AD5933 for performing high-speed, single-frequency continuous bioimpedance measurements. From a theoretical analysis, the minimum continuous impedance estimation time was determined, and the AD5933 with a custom 4-Electrode Analog Front-End (AFE) was used to experimentally determine the maximum continuous impedance estimation frequency as well as the system impedance estimation error when measuring a 2R1C electrical circuit model. Transthoracic Electrical Bioimpedance (TEB) measurements in a healthy subject were obtained using 3M gel electrodes in a tetrapolar lateral spot electrode configuration. The obtained TEB raw signal was filtered in MATLAB to obtain the respiration and cardiogenic signals, and from the cardiogenic signal the impedance derivative signal (dZ/dt) was also calculated. The results show that the maximum continuous impedance estimation rate was approximately 550 measurements per second, with a magnitude estimation error below 1% on 2R1C-parallel bridge measurements. The extracted respiration and cardiac signals were of good quality and could be used to obtain valuable information in plethysmography monitoring applications. The obtained results suggest that the AD5933-based monitor could be used for the implementation of a portable and wearable bioimpedance plethysmograph for applications such as impedance cardiography. These results, combined with research on functional garments and textile electrodes, might enable the implementation of PHS applications in the near future.
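A minimal sketch of the kind of post-processing described above (separating respiratory and cardiogenic components from the raw transthoracic bioimpedance stream and computing dZ/dt) is given below in Python rather than MATLAB; the filter orders and cutoff frequencies are typical choices, not the values used in the paper.

import numpy as np
from scipy.signal import butter, filtfilt

def split_teb(z, fs=550.0):
    # Separate a transthoracic bioimpedance stream into respiratory and
    # cardiogenic components and compute dZ/dt.
    # z  : impedance magnitude samples (ohm)
    # fs : sampling rate (Hz); ~550 estimates/s as reported for the AD5933 setup
    b_resp, a_resp = butter(4, 0.5 / (fs / 2), btype="low")                 # respiration < ~0.5 Hz
    b_card, a_card = butter(4, [0.8 / (fs / 2), 10.0 / (fs / 2)], btype="band")  # cardiac ~0.8-10 Hz
    respiration = filtfilt(b_resp, a_resp, z)
    cardiac = filtfilt(b_card, a_card, z)
    dz_dt = np.gradient(cardiac, 1.0 / fs)   # impedance derivative signal
    return respiration, cardiac, dz_dt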
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chase Qishi; Zhu, Michelle Mengxia
The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of the workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
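As a toy illustration of the end-to-end-delay objective mentioned above, the sketch below computes the critical-path lower bound on a workflow DAG given task execution times and per-edge data-transfer times. Real workflow mapping and scheduling must also handle resource contention and reconfiguration, which this sketch ignores, and it is not part of the SWAMP design itself.

from functools import lru_cache

def end_to_end_delay(tasks, deps, exec_time, transfer_time):
    # Critical-path lower bound on workflow end-to-end delay.
    # tasks         : iterable of task names
    # deps          : {task: [predecessor, ...]}
    # exec_time     : {task: seconds}
    # transfer_time : {(pred, task): seconds} for data movement on each edge
    # Assumes a DAG and unlimited resources (no contention).
    @lru_cache(maxsize=None)
    def finish(task):
        preds = deps.get(task, [])
        start = max((finish(p) + transfer_time.get((p, task), 0.0) for p in preds),
                    default=0.0)
        return start + exec_time[task]
    return max(finish(t) for t in tasks)

# Hypothetical three-stage pipeline: generate -> analyze -> visualize.
tasks = ["gen", "ana", "vis"]
deps = {"ana": ["gen"], "vis": ["ana"]}
delay = end_to_end_delay(tasks, deps,
                         exec_time={"gen": 120.0, "ana": 300.0, "vis": 30.0},
                         transfer_time={("gen", "ana"): 60.0, ("ana", "vis"): 5.0})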
NASA Technical Reports Server (NTRS)
Patashnick, H.; Rupprecht, G.
1977-01-01
The tapered element oscillating microbalance (TEOM), an ultrasensitive mass measurement device suitable for both particulate and vapor deposition measurements, is described. The device can be used in contamination measurements, surface reaction studies, particulate monitoring systems, or any microweighing activity where either laboratory or field monitoring capability is desired. The active element of the TEOM consists of a tube or reed constructed of a material with a high mechanical quality factor and having a special taper. The element is firmly mounted at the wide end, while the other end supports a substrate surface which can be composed of virtually any material. The tapered element, with the substrate at the free (narrow) end, is set into oscillation in a clamped-free mode. A feedback system maintains the oscillation, whose natural frequency changes in relation to the mass deposited on the substrate.
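Because the element behaves as a spring-mass oscillator, the deposited mass follows from the frequency shift via the standard TEOM relation Δm = K0·(1/f1² − 1/f0²), where K0 is a calibration (spring) constant of the element. The sketch below applies this relation; the K0 value and frequencies in the example are placeholders, not parameters of the instrument described here.

def teom_mass_change(f0, f1, k0):
    # Mass accumulated on the TEOM substrate from the shift in the tapered
    # element's natural frequency: dm = K0 * (1/f1^2 - 1/f0^2).
    # f0, f1 : oscillation frequency before and after deposition (Hz)
    # k0     : spring-constant calibration of the element (e.g., g*Hz^2)
    return k0 * (1.0 / f1**2 - 1.0 / f0**2)

# Hypothetical numbers for illustration only.
dm = teom_mass_change(f0=250.0, f1=249.8, k0=15000.0)   # ~0.38 mg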
NASA Astrophysics Data System (ADS)
Lei, Zeyu; Zhou, Xin; Yang, Jie; He, Xiaolong; Wang, Yalin; Yang, Tian
2017-04-01
Integrating surface plasmon resonance (SPR) devices on single-mode fiber (SMF) end facets yields label-free biosensing systems that have a dip-and-read configuration, high compatibility with fiber-optic techniques, and in vivo monitoring capability, but that face the challenge of matching the performance of their free-space counterparts. We report a second-order distributed feedback (DFB) SPR cavity on an SMF end facet and its application in protein interaction analysis. In our device, a periodic array of nanoslits in a gold film couples fiber-guided lightwaves to surface plasmon polaritons (SPPs) with its first-order spatial Fourier component, while the second-order spatial Fourier component provides DFB to SPP propagation and produces an SPP bandgap. A phase-shift section in the DFB structure introduces an SPR defect state within the SPP bandgap, whose mode profile is optimized to match that of the SMF to achieve a reasonable coupling efficiency. We report an experimental refractive index sensitivity of 628 nm RIU^-1, a figure of merit of 80 RIU^-1, and a limit of detection of 7 × 10^-6 RIU. The measurement of the real-time interaction between human immunoglobulin G molecules and their antibodies is demonstrated.
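As a back-of-envelope illustration of how the reported figures relate, the sketch below uses the common definitions FOM = sensitivity / FWHM and LOD = resolvable wavelength shift / sensitivity; the linewidth and resolvable shift are assumed values chosen for illustration, not numbers reported in the paper.

# Relating the reported SPR figures of merit under common definitions.
# The FWHM and resolvable shift below are assumptions, not paper values.
sensitivity_nm_per_riu = 628.0   # reported spectral sensitivity
fwhm_nm = 7.85                   # assumed resonance linewidth
resolvable_shift_nm = 0.0044     # assumed wavelength resolution of the readout

fom = sensitivity_nm_per_riu / fwhm_nm               # -> ~80 RIU^-1
lod = resolvable_shift_nm / sensitivity_nm_per_riu   # -> ~7e-6 RIU
print("FOM (RIU^-1): %.0f, LOD (RIU): %.1e" % (fom, lod))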
NASA Astrophysics Data System (ADS)
Liu, Zijian; Corley, Steven; Shenderova, Olga; Brenner, Donald; Krim, Jacqueline
2013-03-01
Nano-diamond (ND) particles are known to be beneficial for wear and friction reduction when used as additives in liquids, but the fundamental origins of the improvement in tribological properties have not been established. To explore this issue, we have investigated the nanotribological properties of ND coated with self-assembled monolayers (SAMs) as additives to solutions, employing gold/chrome-coated quartz crystal microbalances (QCM). Measurements were performed with the QCM initially immersed in deionized water. ND particles with positively and negatively charged SAM end groups were then added to the water, while the frequency and amplitude of the QCM were monitored. Negative shifts in both the QCM frequency and amplitude were observed when ND with positively charged SAM end groups were added, while positive shifts in both the QCM frequency and amplitude were observed when ND with negatively charged SAM end groups were added. The results are consistent with a lubricating effect for the negatively charged ND, but were only observed for sufficiently small negatively charged ND particle sizes. Experiments on QCM surfaces with differing textures and roughness are in progress to determine the separate contributing effects of surface roughness and charge-water interactions. Funding provided by NSF DMR.
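As an illustrative aside, QCM frequency shifts for thin rigid films are often interpreted with the Sauerbrey relation; the sketch below shows that relation with hypothetical numbers. It does not capture the liquid-phase damping and amplitude changes that matter in the experiment above.

# Sauerbrey relation, rigid thin-film limit:
#   delta_f = -2 * f0**2 * delta_m / (A * sqrt(rho_q * mu_q))
import math

RHO_Q = 2.648        # g/cm^3, density of quartz
MU_Q = 2.947e11      # g/(cm*s^2), shear modulus of AT-cut quartz

def sauerbrey_mass(delta_f_hz, f0_hz, area_cm2):
    """Areal mass change (g) implied by a QCM frequency shift, rigid-film limit."""
    c = 2.0 * f0_hz**2 / math.sqrt(RHO_Q * MU_Q)   # Hz*cm^2/g
    return -delta_f_hz * area_cm2 / c

# hypothetical numbers: 5 MHz crystal, 0.4 cm^2 active area, -10 Hz shift
print("adsorbed mass (ng): %.1f" % (sauerbrey_mass(-10.0, 5e6, 0.4) * 1e9))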
Garaventa, Francesca; Gambardella, Chiara; Di Fino, Alessio; Pittore, Massimiliano; Faimali, Marco
2010-03-01
In this study, we investigated the possibility of improving a new behavioural bioassay (the Swimming Speed Alteration test, SSA test) using larvae of marine cyst-forming organisms, namely the brine shrimp Artemia sp. and the rotifer Brachionus plicatilis. Swimming speed was investigated as a behavioural end-point for application in ecotoxicology studies. A first experiment to analyse the linear swimming speed of the two organisms was performed to verify the applicability of the video-camera tracking system, here referred to as the Swimming Behavioural Recorder (SBR). A second experiment was performed, exposing the organisms to different toxic compounds (zinc pyrithione, Macrotrol MT-200, and Eserine). Swimming speed alteration was analyzed together with mortality. The results of the first experiment indicate that the SBR is a suitable tool to detect the linear swimming speed of the two organisms, since the values obtained (3.05 mm s(-1) for Artemia sp. and 0.62 mm s(-1) for B. plicatilis) agree with other studies using the same organisms. Toxicity test results clearly indicate that the swimming speed of Artemia sp. and B. plicatilis is a valid behavioural end-point to detect stress at sub-lethal toxic substance concentrations. Indeed, alterations in swimming speed have been detected at toxic compound concentrations as low as 0.1-5% of their LC(50) values. In conclusion, the SSA test with B. plicatilis and Artemia sp. can provide a good integrated behavioural output for application in marine ecotoxicology and environmental monitoring programs.
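As an editorial illustration of the measured quantity, the sketch below estimates a mean linear swimming speed from video-tracked positions; the SBR's actual tracking algorithm is not described here, and the coordinates and frame rate are made up.

# Mean path speed from an (N, 2) array of tracked positions (mm) at a fixed
# frame rate. Positions and fps below are hypothetical.
import numpy as np

def mean_linear_speed(xy_mm, fps):
    """Mean swimming speed (mm/s) along the tracked path."""
    steps = np.diff(xy_mm, axis=0)                 # per-frame displacement
    dist = np.hypot(steps[:, 0], steps[:, 1])      # mm travelled per frame
    return dist.sum() / (len(dist) / fps)          # total path / elapsed time

track = np.array([[0.0, 0.0], [0.1, 0.05], [0.22, 0.1], [0.3, 0.18]])  # mm
print("speed (mm/s): %.2f" % mean_linear_speed(track, fps=25))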
Privacy Protection by Masking Moving Objects for Security Cameras
NASA Astrophysics Data System (ADS)
Yabuta, Kenichi; Kitazawa, Hitoshi; Tanaka, Toshihisa
Because of the increasing number of security cameras, it is crucial to establish a system that protects the privacy of objects in the recorded images. To this end, we propose a framework of image processing and data hiding for security monitoring and privacy protection. First, we state the requirements of the proposed monitoring systems and suggest a possible implementation that satisfies those requirements. The underlying concept of our proposed framework is as follows: (1) in the recorded images, the objects whose privacy should be protected are deteriorated by appropriate image processing; (2) the original objects are encrypted and watermarked into the output image, which is encoded using an image compression standard; (3) real-time processing is performed such that no future frame is required to generate an output bitstream. It should be noted that in this framework, anyone can observe the decoded image, which includes the deteriorated objects that are unrecognizable or invisible. On the other hand, for crime investigation, this system allows a limited number of users to observe the original objects by using a special viewer that decrypts and decodes the watermarked objects with a decoding password. Moreover, the special viewer allows us to select the objects to be decoded and displayed. We provide an implementation example, experimental results, and performance evaluations to support our proposed framework.
Ekman, Drew R.; Ankley, Gerald T.; Blazer, Vicki; Collette, Timothy W.; Garcia-Reyero, Natàlia; Iwanowicz, Luke R.; Jorgensen, Zachary G.; Lee, Kathy E.; Mazik, Pat M.; Miller, David H.; Perkins, Edward J.; Smith, Edwin T.; Tietge, Joseph E.; Villeneuve, Daniel L.
2013-01-01
There is increasing demand for the implementation of effects-based monitoring and surveillance (EBMS) approaches in the Great Lakes Basin to complement traditional chemical monitoring. Herein, we describe an ongoing multiagency effort to develop and implement EBMS tools, particularly with regard to monitoring potentially toxic chemicals and assessing Areas of Concern (AOCs), as envisioned by the Great Lakes Restoration Initiative (GLRI). Our strategy includes the use of both targeted and open-ended/discovery techniques, as appropriate to the amount of information available, to guide a priori end point and/or assay selection. Specifically, a combination of in vivo and in vitro tools is employed, using both wild and caged fish (in vivo) and a variety of receptor- and cell-based assays (in vitro). We employ a workflow that progressively emphasizes in vitro tools for long-term or high-intensity monitoring because of their greater practicality (e.g., lower cost and labor), while relying on in vivo assays for initial surveillance and verification. Our strategy takes advantage of the strengths of a diversity of tools, balancing the depth, breadth, and specificity of the information they provide against their costs, transferability, and practicality. Finally, a series of illustrative scenarios is examined that aligns EBMS options with management goals to illustrate the adaptability and scaling of EBMS approaches and how they can be used in management decisions.
Harada, Hitoshi; Kanaji, Shingo; Hasegawa, Hiroshi; Yamamoto, Masashi; Matsuda, Yoshiko; Yamashita, Kimihiro; Matsuda, Takeru; Oshikiri, Taro; Sumi, Yasuo; Nakamura, Tetsu; Suzuki, Satoshi; Kakeji, Yoshihiro
2018-03-30
Recently, several new imaging technologies, such as three-dimensional (3D)/high-definition (HD) stereovision and high-resolution two-dimensional (2D)/4K monitors, have been introduced in laparoscopic surgery. However, it is still unclear whether these technologies actually improve surgical performance. Participants were 11 expert laparoscopic surgeons. We designed three laparoscopic suturing tasks (task 1: simple suturing, task 2: knotting thread in a small box, and task 3: suturing in a narrow space) in training boxes. Performances were recorded by an optical position tracker. All participants first performed each task five times consecutively using a conventional 2D/HD monitor. Then they were randomly divided into two groups: six participants performed the tasks using 3D/HD before using 2D/4K; the other five participants performed the tasks using a 2D/4K monitor before the 3D/HD monitor. After the trials, we evaluated the performance scores (operative time, path length of forceps, and technical errors) and compared performance scores across all monitors. Surgical performances of participants were ranked in decreasing order: 3D/HD, 2D/4K, and 2D/HD using the total scores for each task. In task 1 (simple suturing), some surgical performances using 3D/HD were significantly better than those using 2D/4K (P = 0.017, P = 0.033, P = 0.492 for operative time, path length, and technical errors, respectively). On the other hand, with operation in narrow spaces such as in tasks 2 and 3, performances using 2D/4K were not inferior to 3D/HD performances. The high-resolution images from the 2D/4K monitor may enhance depth perception in narrow spaces and may complement stereoscopic vision almost as well as using 3D/HD. Compared to a 2D/HD monitor, a 3D/HD monitor improved the laparoscopic surgical technique of expert surgeons more than a 2D/4K monitor. However, the advantage of 2D/4K high-resolution images may be comparable to a 3D/HD monitor especially in narrow spaces.
Fibre Bragg grating manometry catheters for in vivo monitoring of peristalsis
NASA Astrophysics Data System (ADS)
Arkwright, John W.; Underhill, Ian
2017-02-01
The human gastrointestinal tract or `gut' is one of the body's largest functional systems, spanning up to 8 metres in length from beginning to end. It is formed of a series of physiologically different sections that perform the various functions required for the digestion of food, absorption of nutrients and water, and the removal of waste products. To enable the gut to perform correctly it must be able to transport digesta through each section at the appropriate rate, and any breakdown or malfunction of this transport mechanism can have severe consequences for ongoing good health. Monitoring motor function deep within the gut is challenging due to the need to monitor over extended lengths with high spatial resolution. Fiber Bragg grating (FBG) manometry catheters provide a near ideal method of monitoring physiologically significant lengths of the gut in a minimally invasive fashion. Following the development by our group of the first viable FBG-based manometry catheter, we have undertaken a series of clinical investigations in the human esophagus, colon, stomach and small bowel. Each region presents its own technological challenge and has required a range of modifications to the basic catheter design. We present the design of these catheters and clinical results from over 100 in-vivo studies.
Cannabis: a self-medication drug for weight management? The never ending story.
Bersani, Francesco Saverio; Santacroce, Rita; Coviello, Marialuce; Imperatori, Claudio; Francesconi, Marta; Vicinanza, Roberto; Minichino, Amedeo; Corazza, Ornella
2016-02-01
In a society highly focused on physical appearance, people are increasingly using the so-called performance and image-enhancing drugs (PIEDs) or life-style drugs as an easy way to control weight. Preliminary data from online sources (e.g. websites, drug forums, e-newsletters) suggest an increased use of cannabis amongst the general population as a PIED due to its putative weight-loss properties. The use of cannabis and/or cannabis-related products to lose weight may represent a new substance-use trend that should be carefully monitored and adequately investigated, especially in light of the well-known adverse psychiatric and somatic effects of cannabis, its possible interaction with other medications/drugs and the unknown and potentially dangerous composition of synthetic cannabimimetic preparations. Copyright © 2015 John Wiley & Sons, Ltd.
Thyroid hormones and commonly cited symptoms of overtraining in collegiate female endurance runners.
Nicoll, Justin X; Hatfield, Disa L; Melanson, Kathleen J; Nasin, Christopher S
2018-01-01
Overtraining syndrome (OTS) is reported in endurance sports. Thyroid hormones (TH) regulate metabolism, mood, and energy production, and may play a role in OTS of endurance athletes. The purpose of this study was to investigate relationships between TH and symptoms of OTS in track and field endurance runners (ER). Sixteen female track and field middle distance (MD; n = 9; age: 20.2 ± 1.5 years; ht: 167.86 ± 5.04 cm; body mass: 57.97 ± 5.05 kg; VO2max: 53.62 ± 6.04 ml/kg/min) and long distance (LD; n = 7; age: 20.5 ± 1.5 years; ht: 162.48 ± 6.11 cm; body mass: 56.15 ± 5.99 kg; VO2max: 61.94 ± 3.29 ml/kg/min) ER participated in this 15-week descriptive study. Thyroid-stimulating hormone (TSH), triiodothyronine (T3), and thyroxine (T4) were collected at pre- (PRE) and post-season (POST). A fatigue scale was administered weekly, and the percent change (PΔ) in race time (season best vs. championship performance) was calculated. Wilcoxon signed-rank tests and Spearman's rho correlations were used to determine changes and relationships between TH and performance. TSH, T3 and T4 did not change from PRE to POST. The PΔ in T3 from PRE to POST was correlated with running performance at the end of the season (ρ = -0.70, p = 0.036). Fatigue at week 12 correlated with running performance at the end of the season (ρ = -0.74, p = 0.004). TH may be valuable in assessing the overall training state of ER. TH concentrations change too slowly to serve as a frequent marker for monitoring OTS, but they are related to markers of decreased performance. Monitoring dietary intake and fatigue may provide predictive markers to assess OTS and the training status of female ER.
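For readers unfamiliar with the statistic reported above, the sketch below computes a Spearman rank correlation with SciPy on made-up paired values; neither the numbers nor the outcome correspond to the study's data.

# Spearman rank correlation between a hormone percent change and a
# performance percent change. Values are hypothetical.
from scipy.stats import spearmanr

pct_change_t3 = [-12.0, -5.5, 3.0, -8.2, 1.1, -15.4, 0.4, -2.3, 6.0]
pct_change_race_time = [4.1, 1.0, -0.5, 2.8, -0.2, 5.5, 0.3, 0.9, -1.4]

rho, p = spearmanr(pct_change_t3, pct_change_race_time)
print("Spearman rho = %.2f, p = %.3f" % (rho, p))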
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raimondi, Pantaleo
The design of the Stanford Linear Collider (SLC) called for a beam intensity far beyond what was practically achievable. This was due to intrinsic limitations in many subsystems and to a lack of understanding of the new physics of linear colliders. Real progress in improving the SLC performance came from precision, non-invasive diagnostics to measure and monitor the beams and from new techniques to control the emittance dilution and optimize the beams. A major contribution to the success of the last 1997-98 SLC run came from several innovative ideas for improving the performance of the Final Focus (FF). This paper describes some of the problems encountered and techniques used to overcome them. Building on the SLC experience, we will also present a new approach to the FF design for future high energy linear colliders.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jothikumar, N., E-mail: jin2@cdc.gov; Hill, Vincent R.
Highlights: •Uses a single-labeled fluorescent primer for real-time PCR. •The detection sensitivity of PET PCR was comparable to TaqMan PCR. •Melt curve analysis can be performed to confirm target amplicon production. •Conventional PCR primers can be converted to PET PCR primers. -- Abstract: We report the development of a fluorescently labeled oligonucleotide primer that can be used to monitor real-time PCR. The primer has two parts: the 3′-end of the primer is complementary to the target, and a universal 17-mer stem loop at the 5′-end forms a hairpin structure. A fluorescent dye is attached to the 5′-end of either the forward or reverse primer. The presence of guanosine residues at the first and second positions of the 3′ dangling end effectively quenches the fluorescence due to the photo electron transfer (PET) mechanism. During the synthesis of nucleic acid, the hairpin structure is linearized and the fluorescence of the incorporated primer increases several-fold due to release of the fluorescently labeled tail and the absence of guanosine quenching. As amplicons are synthesized during nucleic acid amplification, the fluorescence increase in the reaction mixture can be measured with commercially available real-time PCR instruments. In addition, a melting procedure can be performed to denature the double-stranded amplicons, thereby generating fluorescence peaks that can differentiate primer dimers and other non-specific amplicons if formed during the reaction. We demonstrated the application of PET-PCR for the rapid detection and quantification of Cryptosporidium parvum DNA. Comparison with a previously published TaqMan® assay demonstrated that the two real-time PCR assays exhibited similar sensitivity for a dynamic range of detection of 6000–0.6 oocysts per reaction. PET PCR primers are simple to design and less expensive than dual-labeled probe PCR methods, and should be of interest for use by laboratories operating in resource-limited environments.
Application of end-tidal carbon dioxide monitoring via distal gas samples in ventilated neonates.
Jin, Ziying; Yang, Maoying; Lin, Ru; Huang, Wenfang; Wang, Jiangmei; Hu, Zhiyong; Shu, Qiang
2017-08-01
Previous research has suggested correlations between the end-tidal partial pressure of carbon dioxide (PETCO2) and the partial pressure of arterial carbon dioxide (PaCO2) in mechanically ventilated patients, but both the relationship between PETCO2 and PaCO2 and whether PETCO2 accurately reflects PaCO2 in neonates and infants are still controversial. This study evaluated remote sampling of PETCO2 via an epidural catheter within an endotracheal tube to determine the procedure's clinical safety and efficacy in the perioperative management of neonates. Abdominal surgery was performed under general anesthesia in 86 full-term newborns (age 1-30 days, weight 2.55-4.0 kg, American Society of Anesthesiologists class I or II). The infants were divided into 2 groups (n = 43 each), and carbon dioxide (CO2) gas samples were collected either from the conventional position (the proximal end) or a modified position (the distal end) of the epidural catheter. The PETCO2 measured with the new method was significantly higher than that measured with the traditional method, and the difference between PETCO2 and PaCO2 was also reduced. The accuracy of the measured PETCO2 increased from 78.7% to 91.5% when the modified sampling method was used. The moderate correlation between PETCO2 and PaCO2 with the traditional measurement was 0.596, which significantly increased to 0.960 in the modified sampling group. Thus, the PETCO2 value was closer to that of PaCO2. PETCO2 detected via modified carbon dioxide monitoring had better accuracy and correlation with PaCO2 in neonates. Copyright © 2017. Published by Elsevier B.V.
Global AIDS Reporting-2001 to 2015: Lessons for Monitoring the Sustainable Development Goals.
Alfvén, T; Erkkola, T; Ghys, P D; Padayachy, J; Warner-Smith, M; Rugg, D; de Lay, P
2017-07-01
Since 2001 the UNAIDS Secretariat has retained the responsibility for monitoring progress towards global commitments on HIV/AIDS. Key critical characteristics of the reporting system were assessed for the reporting period from 2004 to 2014 and analyses were undertaken of response rates and core indicator performance. Country submission rates ranged from 102 (53%) Member States in 2004 to 186 (96%) in 2012. There was great variance in response rates for specific indicators, with the highest response rates for treatment-related indicators. The Global AIDS reporting system has improved substantially over time and has provided key trend data on responses to the HIV epidemic, serving as the global accountability mechanism and providing reference data on the global AIDS response. It will be critical that reporting systems continue to evolve to support the monitoring of the Sustainable Development Goals, in view of ending the AIDS epidemic as a public health threat by 2030.
CMOS Rad-Hard Front-End Electronics for Precise Sensors Measurements
NASA Astrophysics Data System (ADS)
Sordo-Ibáñez, Samuel; Piñero-García, Blanca; Muñoz-Díaz, Manuel; Ragel-Morales, Antonio; Ceballos-Cáceres, Joaquín; Carranza-González, Luis; Espejo-Meana, Servando; Arias-Drake, Alberto; Ramos-Martos, Juan; Mora-Gutiérrez, José Miguel; Lagos-Florido, Miguel Angel
2016-08-01
This paper reports a single-chip solution for the implementation of radiation-tolerant CMOS front-end electronics (FEE) for applications requiring the acquisition of base-band sensor signals. The FEE has been designed in a 0.35 μm CMOS process and implements a set of parallel conversion channels with a high degree of configurability to adapt the resolution, conversion rate, and dynamic input range to the required application. Each conversion channel has been designed with a fully-differential implementation of a configurable-gain instrumentation amplifier, followed by a likewise configurable dual-slope ADC (DS ADC) with up to 16-bit resolution. The ASIC also incorporates precise thermal monitoring, sensor conditioning and error detection functionalities to ensure proper operation in extreme environments. Experimental results confirm that the proposed topologies, in conjunction with the applied radiation-hardening techniques, are reliable enough to be used without loss of performance in environments with an extended temperature range (between -25 and 125 °C) and a total dose beyond 300 krad.
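As an editorial note on the dual-slope conversion mentioned above, the sketch below shows the idealized DS ADC relation Vin = Vref * N2 / N1, which is insensitive to the integrator's R, C, and clock drift; the counts and voltages are illustrative and unrelated to the actual ASIC.

# Idealized dual-slope ADC: integrate the input for a fixed count N1, then
# de-integrate with the reference until the integrator crosses zero (N2 counts).
def dual_slope_code(v_in, v_ref, n1=65536):
    """Return the de-integration count an ideal DS ADC would produce."""
    return round(n1 * v_in / v_ref)

def dual_slope_voltage(n2, v_ref, n1=65536):
    """Recover the input voltage from the measured count."""
    return v_ref * n2 / n1

n2 = dual_slope_code(v_in=0.73, v_ref=2.5)
print("count:", n2, " reconstructed V:", round(dual_slope_voltage(n2, 2.5), 5))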
NASA Technical Reports Server (NTRS)
Barnum, P. W.; Renzetti, N. A.; Textor, G. P.; Kelly, L. B.
1973-01-01
The Tracking and Data System (TDS) Support for the Mariner Mars 1971 Mission final report contains the deep space tracking and data acquisition activities in support of orbital operations. During this period a major NASA objective was accomplished: completion of the 180th revolution and 90th day of data gathering with the spacecraft about the planet Mars. Included are presentations of the TDS flight support pass chronology data for each of the Deep Space Stations used, and performance evaluation for the Deep Space Network Telemetry, Tracking, Command, and Monitor Systems. With the loss of Mariner 8 at launch, Mariner 9 assumed the mission plan of Mariner 8, which included the TV mapping cycles and a 12-hr orbital period. The mission plan was modified as a result of a severe dust storm on the surface of Mars, which delayed the start of the TV mapping cycles. Thus, the end of primary mission date was extended to complete the TV mapping cycles.
Uniaxial Tensile Test for Soil.
1987-04-01
2.0 by 5.0 cm. This test was also performed on a horizontal specimen; however, loading was applied through small metal plates that were embedded in the ... enlarged ends. The specimen was supported by a bed of mercury and had two small ceramic markers mounted in the gage length that were monitored ... with a cathetometer to determine displacements. It was found that most tests failed near the location of the embedded metal loading plates making their
Personalized USB Biosensor Module for Effective ECG Monitoring.
Sladojević, Srdjan; Arsenović, Marko; Lončar-Turukalo, Tatjana; Sladojević, Miroslava; Ćulibrk, Dubravko
2016-01-01
The burden of chronic disease and associated disability presents a major threat to the financial sustainability of healthcare delivery systems. The need for cost-effective early diagnosis and disease prevention is evident, driving the development of personalized home health solutions. The proposed solution presents an easy-to-use ECG monitoring system. The core hardware component is a biosensor dongle with sensing probes at one end and a micro USB interface at the other end, offering reliable and unobtrusive sensing, preprocessing and storage. An additional component is a smartphone, providing both the biosensor's power supply and an intuitive user application for real-time data reading. System usage is simplified by innovative solutions offering plug-and-play functionality that avoids additional driver installation. Personalized needs can be met with different sensor combinations, enabling adequate monitoring in chronic disease, during physical activity and in the rehabilitation process.
Time-resolved acoustic emission tomography in the laboratory: tracking localised damage in rocks
NASA Astrophysics Data System (ADS)
Brantut, N.
2017-12-01
Over the past three decades, there have been tremendous technological developments in laboratory equipment and in studies using acoustic emission and ultrasonic monitoring of rock samples during deformation. Using relatively standard seismological techniques, acoustic emissions can be detected, located in space and time, and source mechanisms can be obtained. In parallel, ultrasonic velocities can be measured routinely using standard pulse-receiver techniques. Despite these major developments, current acoustic emission and ultrasonic monitoring systems are typically used separately, and the poor spatial coverage of acoustic transducers precludes performing active 3D tomography in typical laboratory settings. Here, I present an algorithm and software package that uses both passive acoustic emission data and active ultrasonic measurements to determine acoustic emission locations together with the 3D, anisotropic P-wave structure of rock samples during deformation. The technique is analogous to local earthquake tomography, but tailored to the specificities of small-scale laboratory tests. The fast marching method is employed to compute the forward problem. The acoustic emission locations and the anisotropic P-wave field are jointly inverted using the Quasi-Newton method. The method is used to track the propagation of compaction bands in a porous sandstone deformed in the ductile, cataclastic flow regime under triaxial stress conditions. Near the yield point, a compaction front forms at one end of the sample and slowly progresses towards the other end. The front is illuminated by clusters of acoustic emissions and leaves behind a heavily damaged material where the P-wave speed has dropped by up to 20%. The technique opens new possibilities to track in-situ strain localisation and damage around laboratory faults, and preliminary results on quasi-static rupture in granite will be presented.
Biagi, Lyvia; Bertachi, Arthur; Quirós, Carmen; Giménez, Marga; Conget, Ignacio; Bondia, Jorge; Vehí, Josep
2018-03-09
Continuous glucose monitoring (CGM) plays an important role in treatment decisions for patients with type 1 diabetes under conventional or closed-loop therapy. Physical activity represents a great challenge for diabetes management as well as for CGM systems. In this work, the accuracy of CGM in the context of exercise is addressed. Six adults performed aerobic and anaerobic exercise sessions and used two Medtronic Paradigm Enlite-2 sensors under closed-loop therapy. CGM readings were compared with plasma glucose during different periods: one hour before exercise, during exercise, and four hours after the end of exercise. In aerobic sessions, the median absolute relative difference (MARD) increased from 9.5% before the beginning of exercise to 16.5% during exercise ( p < 0.001), and then decreased to 9.3% in the first hour after the end of exercise ( p < 0.001). For the anaerobic sessions, the MARD before exercise was 15.5% and increased without statistical significance to 16.8% during exercise realisation ( p = 0.993), and then decreased to 12.7% in the first hour after the cessation of anaerobic activities ( p = 0.095). Results indicate that CGM might present lower accuracy during aerobic exercise, but return to regular operation a few hours after exercise cessation. No significant impact for anaerobic exercise was found.
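To make the headline metric concrete, the sketch below computes the median absolute relative difference (MARD) for paired CGM and reference glucose values; the paired readings are hypothetical, not data from the study above.

# MARD: median of |CGM - reference| / reference, expressed in percent.
import numpy as np

def mard(cgm, reference):
    """MARD in percent for paired CGM and reference glucose values (mg/dL)."""
    cgm, reference = np.asarray(cgm, float), np.asarray(reference, float)
    return 100.0 * np.median(np.abs(cgm - reference) / reference)

cgm_mgdl = [102, 95, 88, 140, 151, 119]
plasma_mgdl = [110, 101, 97, 128, 143, 117]
print("MARD = %.1f%%" % mard(cgm_mgdl, plasma_mgdl))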
Characterization of product capture resin during microbial cultivations.
Frykman, Scott; Tsuruta, Hiroko; Galazzo, Jorge; Licari, Peter
2006-06-01
Various bioactive small molecules produced by microbial cultivation are degraded in the culture broth or may repress the formation of additional product. The inclusion of hydrophobic adsorber resin beads to capture these products in situ and remove them from the culture broth can reduce or prevent this degradation and repression. These product capture beads are often subjected to a dynamic and stressful microenvironment for a long cultivation time, affecting their physical structure and performance. Impact and collision forces can result in the fracturing of these beads into smaller pieces, which are difficult to recover at the end of a cultivation run. Various contaminating compounds may also bind in a non-specific manner to these beads, reducing the binding capacity of the resin for the product of interest (fouling). This study characterizes resin bead binding capacity (to monitor bead fouling) and resin bead volume distributions (to monitor bead fracture) for an XAD-16 adsorber resin used to capture epothilone produced during myxobacterial cultivations. Resin fouling was found to reduce the product binding capacity of the adsorber resin by 25-50%. Additionally, the degree of resin bead fracture was found to depend on the cultivation length and the impeller rotation rate. Microbial cultivations and harvesting processes should be designed in such a way as to minimize bead fragmentation and fouling during cultivation and to maximize the amount of resin and associated product harvested at the end of a run.
Somerson, Jacob; Plaxco, Kevin W
2018-04-15
The ability to measure the concentration of specific small molecules continuously and in real time in complex sample streams would impact many areas of agriculture, food safety, and food production. Monitoring for mycotoxin taint in real time during food processing, for example, could improve public health. Towards this end, we describe here an inexpensive electrochemical DNA-based sensor that supports real-time monitoring of the mycotoxin ochratoxin A in a flowing stream of foodstuffs.
USDA-ARS?s Scientific Manuscript database
Drought has significant impacts over broad spatial and temporal scales, and information about the timing and extent of such conditions is of critical importance to many end users in the agricultural and water resource management communities. The ability to accurately monitor effects on crops and pr...
NASA Technical Reports Server (NTRS)
Baughman, J. R.; Thys, P. C.
1973-01-01
A droplet monitoring system is disclosed for analysis of mixed-phase fluid flow in the development of gas turbines. The system uses a probe comprising two electrical wires spaced a known distance apart and connected at one end to means for establishing a dc potential between the wires. A drop in the fluid stream momentarily contacting both wires simultaneously causes an electrical signal which is amplified, detected and counted.
Video and thermal imaging system for monitoring interiors of high temperature reaction vessels
Saveliev, Alexei V [Chicago, IL; Zelepouga, Serguei A [Hoffman Estates, IL; Rue, David M [Chicago, IL
2012-01-10
A system and method for real-time monitoring of the interior of a combustor or gasifier wherein light emitted by the interior surface of a refractory wall of the combustor or gasifier is collected using an imaging fiber optic bundle having a light receiving end and a light output end. Color information in the light is captured with primary color (RGB) filters or complementary color (GMCY) filters placed over individual pixels of color sensors disposed within a digital color camera in a Bayer mosaic layout, producing RGB signal outputs or GMCY signal outputs. The signal outputs are processed using intensity ratios of the primary color filters or the complementary color filters, producing video images and/or thermal images of the interior of the combustor or gasifier.
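The patent abstract does not spell out the ratio-to-temperature mapping, but intensity-ratio processing of this kind is commonly done with two-color pyrometry under Wien's approximation and a gray-body assumption; the sketch below shows that relation with hypothetical effective wavelengths and intensities.

# Two-color (ratio) pyrometry under Wien's approximation, gray-body assumed:
# the ratio of two channel intensities fixes the temperature.
import math

C2 = 1.4388e-2  # m*K, second radiation constant

def ratio_temperature(i1, i2, lam1_m, lam2_m):
    """Temperature (K) from intensities i1, i2 at wavelengths lam1 < lam2."""
    num = C2 * (1.0 / lam1_m - 1.0 / lam2_m)
    den = 5.0 * math.log(lam2_m / lam1_m) - math.log(i1 / i2)
    return num / den

# e.g. green and red channel effective wavelengths, arbitrary intensity units
print("T = %.0f K" % ratio_temperature(i1=0.2016, i2=1.0, lam1_m=540e-9, lam2_m=620e-9))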
Loh, Yince; Duckwiler, Gary R
2010-10-01
The Onyx liquid embolic system (Onyx) was approved in the European Union in 1999 for embolization of lesions in the intracranial and peripheral vasculature, including brain arteriovenous malformations (AVMs) and hypervascular tumors. In 2001 a prospective, equivalence, multicenter, randomized controlled trial was initiated to support a submission for FDA approval. The objective of this study was to verify the safety and efficacy of Onyx compared with N-butyl cyanoacrylate (NBCA) for the presurgical treatment of brain AVMs. One hundred seventeen patients with brain AVMs were treated with either Onyx (54 patients) or NBCA (63 patients) for presurgical endovascular embolization between May 2001 and April 2003. The primary end point was technical success in achieving ≥ 50% reduction in AVM volume. Secondary end points were operative blood loss and resection time. All adverse events (AEs) were reported and assigned a relationship to the Onyx or NBCA system, treatment, disease, surgery, or other/unknown. The Data Safety Monitoring Board adjudicated AEs, and a blinded, independent core lab assessed volume measurements. Patients were monitored through discharge after the final surgery or through a 3- and/or 12-month follow-up if resection had not been performed or was incomplete. The use of Onyx led to ≥ 50% AVM volume reduction in 96% of cases versus 85% for NBCA (p = not significant). The secondary end points of resection time and blood loss were similar. Serious AEs were also similar between the 2 treatment groups. Onyx is equivalent to NBCA in safety and efficacy as a preoperative embolic agent in reducing brain AVM volume by at least 50%.
Coordinated Fault-Tolerance for High-Performance Computing Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panda, Dhabaleswar Kumar; Beckman, Pete
2011-07-28
With the Coordinated Infrastructure for Fault Tolerance Systems (CIFTS, as the original project came to be called) project, our aim has been to understand and tackle the following broad research questions, the answers to which will help the HEC community analyze and shape the direction of research in the field of fault tolerance and resiliency on future high-end leadership systems. Will availability of global fault information, obtained by fault information exchange between the different HEC software on a system, allow individual system software to better detect, diagnose, and adaptively respond to faults? If fault awareness is raised throughout the system through fault information exchange, is it possible to get all system software working together to provide more comprehensive end-to-end fault management on the system? What are the missing fault-tolerance features that widely used HEC system software lacks today that would inhibit such software from taking advantage of systemwide global fault information? What are the practical limitations of a systemwide approach for end-to-end fault management based on fault awareness and coordination? What mechanisms, tools, and technologies are needed to bring about fault awareness and coordination of responses on a leadership-class system? What standards, outreach, and community interaction are needed for adoption of the concept of fault awareness and coordination for fault management on future systems? Keeping our overall objectives in mind, the CIFTS team has taken a parallel fourfold approach. Our central goal was to design and implement a lightweight, scalable infrastructure with a simple, standardized interface to allow communication of fault-related information through the system and facilitate coordinated responses. This work led to the development of the Fault Tolerance Backplane (FTB) publish-subscribe API specification, together with a reference implementation and several experimental implementations on top of existing publish-subscribe tools. We enhanced the intrinsic fault tolerance capabilities of representative implementations of a variety of key HPC software subsystems and integrated them with the FTB. Targeted software subsystems included MPI communication libraries, checkpoint/restart libraries, resource managers and job schedulers, and system monitoring tools. Leveraging the aforementioned infrastructure, as well as developing and utilizing additional tools, we have examined issues associated with expanded, end-to-end fault response from both system and application viewpoints. From the standpoint of system operations, we have investigated log and root cause analysis, anomaly detection and fault prediction, and generalized notification mechanisms. Our applications work has included libraries for fault-tolerant linear algebra, application frameworks for coupled multiphysics applications, and external frameworks to support monitoring and response for general applications. Our final goal was to engage the high-end computing community to increase awareness of tools and issues around coordinated end-to-end fault management.
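To make the publish-subscribe idea concrete, the following is a minimal, self-contained illustration of system components exchanging fault events through a shared backplane. The class, event, and field names are invented for illustration and are not the FTB API.

# Minimal publish-subscribe backplane: subscribers register callbacks for
# named events; publishers broadcast fault information to all of them.
from collections import defaultdict

class Backplane:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_name, callback):
        self._subscribers[event_name].append(callback)

    def publish(self, event_name, payload):
        for cb in self._subscribers[event_name]:
            cb(payload)

bp = Backplane()
# the job scheduler reacts to node failures reported by the monitoring tool
bp.subscribe("node.failed", lambda e: print("scheduler: draining", e["node"]))
# the MPI library triggers checkpoint/restart on the same event
bp.subscribe("node.failed", lambda e: print("MPI: restarting from checkpoint", e["ckpt"]))

bp.publish("node.failed", {"node": "nid00123", "ckpt": "/scratch/ckpt/42"})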
Occupational Exposure to Chromium of Assembly Workers in Aviation Industries.
Genovese, G; Castiglia, L; Pieri, M; Novi, C; d'Angelo, R; Sannolo, N; Lamberti, M; Miraglia, N
2015-01-01
Aircraft are constructed from modules that are covered by a "primer" layer, which can often contain hexavalent chromium [Cr(VI)], a known human carcinogen. While the occupational exposure to Cr(VI) during aircraft painting is ascertained, the exposure assessment of assembly workers (assemblers) requires investigation. Three biological monitoring campaigns (BM-I, BM-II, BM-III) were performed in an aviation industry on homogeneous groups of assemblers (N = 43) and controls (N = 23), by measuring chromium concentrations in end-shift urine collected at the end of the working week and the chromium concentration difference between end- and before-shift urines. BM-I was conducted on full-time workers, BM-II was performed on workers after a 3-4 day absence from work, and BM-III on workers using ecoprimers with lower Cr(VI) content. Samples were analyzed by atomic absorption spectroscopy and mean values were compared by t-test. Even though the Cr concentrations measured during BM-I were lower than the ACGIH Biological Exposure Indices, statistically significant differences were found between the urinary Cr concentrations of workers and controls. Despite 3-4 days of absence from work, urinary chromium concentrations measured during BM-II were still higher than reference values for non-occupationally exposed populations. In the BM-III campaign, the preliminary results obtained suggested the efficacy of using ecoprimers. The healthcare of workers exposed to carcinogenic agents follows the principle of limiting the exposure to "the minimum technically possible". The results show that assemblers in aviation industries, whose tasks do not involve the direct use of primers containing Cr(VI), exhibit a slight but measurable occupational exposure to Cr(VI), which must be carefully taken into consideration when planning suitable prevention measures during risk assessment and management processes.
NASA Astrophysics Data System (ADS)
Cutshall, N. H.; Gilmore, T.; Looney, B. B.; Vangelas, K. M.; Adams, K. M.; Sink, C. H.
2006-05-01
Like many US industries and businesses, the Department of Energy (DOE) is responsible for remediation and restoration of soils and ground water contaminated with chlorinated ethenes. Monitored Natural Attenuation (MNA) is an attractive remediation approach and is probably the universal end-stage technology for removing such contamination. Since 2003 we have carried out a multifaceted program at the Savannah River Site designed to advance the state of the art for MNA of chlorinated ethenes in soils and groundwater. Three lines of effort were originally planned: 1) Improving the fundamental science for MNA, 2) Promoting better characterization and monitoring (CM) techniques, and 3) Advancing the regulatory aspects of MNA management. A fourth line, developing enhanced attenuation methods based on sustainable natural processes, was added in order to deal with sites where the initial natural attenuation capacity cannot offset contaminant loading rates. These four lines have been pursued in an integrated and mutually supportive fashion. Many DOE site-cleanup program managers view CM as a major expense, especially for natural attenuation, where measuring attenuation is complex and the most critical attenuation mechanisms cannot be determined directly. We have reviewed new and developing approaches to CM for potential application in support of natural attenuation of chlorinated hydrocarbons in ground water at DOE sites (Gilmore, T., et al., 2006, WSRC-TR-2005-00199). Although our project is focused on chlorinated ethenes, many of the concepts and strategies are also applicable to a wider range of contaminants, including radionuclides and metals. The greatest savings in CM are likely to come from new management approaches. New approaches can be based, for example, on conceptual models of attenuation capacity, the ability of a formation to reduce risks caused by contaminants. Using the mass balance concept as a guide, the integrated mass flux of contaminant is compared to the attenuation capacity. The mass balance approach is controlled by a combination of boundary conditions (e.g., water inputs and outputs), flow dynamics, and contaminant concentrations. As a result, long-term monitoring might be improved while reducing costs by measuring fewer point concentrations and simultaneously adding large-scale measurements of boundary conditions, using weather data, remote sensing of evapotranspiration, stream-flow monitoring, etc. Because there are no specific regulatory drivers for performance monitoring, regulators are not accustomed to participating in monitoring system design. A partnership with the Interstate Technology Regulatory Council (ITRC) has been formed to promote communication and develop advanced guidance for MNA. Early and continued communication among technology developers, end users, regulators and the public has been essential to this progress.
Performance of a 2.5 THz Receiver Front-End for Spaceborne Applications
NASA Technical Reports Server (NTRS)
Gaidis, Michael C.; Pickett, H. M.; Siegel, P. H.; Smith, C. D.; Smith, R. P.; Martin, S. C.
1999-01-01
The OH radical plays a significant role in a great many of the known ozone destruction cycles, and has become the focus of an important radiometer development effort for NASA's Earth Observing System Chem I satellite, which will monitor and study many tropospheric and stratospheric gases and is scheduled for launch in 2002. Here we describe the design, fabrication, and testing of a receiver front end used to detect the OH signals at 2.5 THz. This is to be the first terahertz heterodyne receiver to be flown in space. The challenges of producing the necessary high-performance mixers are numerous, but for this application there is the added challenge of designing a robust receiver which can withstand the environmental extremes of a rocket launch and five years in space. The receiver front end consists of the following components: a four-port dual-polarization diplexer, off-axis elliptical feed mirrors, mixers for horizontal and vertical polarization, support structures allowing simple and rugged alignment, low-noise IF amplification from 7.7 to 21.1 GHz, and mixer DC bias circuitry. The front-end design, alignment, and operation will be covered in depth, followed by a discussion of the most recent results in receiver noise and dual-mode horn beam patterns. JPL MOMED mixers are employed, and have resulted in receiver noise temperatures of 14,500 K DSB with an LO frequency of 2.522 THz and an IF of 12.8 GHz. Horn beam patterns correspond well with theory, with no significant sidelobes above the -25 dB level. Considering the high-quality beam of this receiver, these results are competitive with the best reported in the literature.
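The abstract does not state how the 14,500 K DSB figure was obtained, but receiver noise temperatures of this kind are conventionally measured with the hot/cold-load Y-factor method; the sketch below shows that standard relation with a hypothetical measured power ratio.

# Y-factor method: T_rx = (T_hot - Y*T_cold) / (Y - 1), Y = P_hot / P_cold.
def y_factor_trx(t_hot_k, t_cold_k, y):
    """Receiver noise temperature (K) from hot/cold load measurements."""
    return (t_hot_k - y * t_cold_k) / (y - 1.0)

T_HOT, T_COLD = 295.0, 77.0   # ambient and liquid-nitrogen loads, K
y = 1.0149                    # hypothetical measured power ratio
print("T_rx (DSB) = %.0f K" % y_factor_trx(T_HOT, T_COLD, y))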
Staged-Fault Testing of Distance Protection Relay Settings
NASA Astrophysics Data System (ADS)
Havelka, J.; Malarić, R.; Frlan, K.
2012-01-01
In order to analyze the operation of the protection system during induced fault testing in the Croatian power system, a simulation using the CAPE software has been performed. The CAPE software (Computer-Aided Protection Engineering) is expert software intended primarily for relay protection engineers, which calculates current and voltage values during faults in the power system, so that relay protection devices can be properly set up. Once the accuracy of the simulation model had been confirmed, a series of simulations were performed in order to obtain the optimal fault location to test the protection system. The simulation results were used to specify the test sequence definitions for the end-to-end relay testing using advanced testing equipment with GPS synchronization for secondary injection in protection schemes based on communication. The objective of the end-to-end testing was to perform field validation of the protection settings, including verification of the circuit breaker operation, telecommunication channel time and the effectiveness of the relay algorithms. Once the end-to-end secondary injection testing had been completed, the induced fault testing was performed with three-end lines loaded and in service. This paper describes and analyses the test procedure, consisting of CAPE simulations, end-to-end test with advanced secondary equipment and staged-fault test of a three-end power line in the Croatian transmission system.
RFID Tag Helix Antenna Sensors for Wireless Drug Dosage Monitoring
Huang, Haiyu; Zhao, Peisen; Chen, Pai-Yen; Ren, Yong; Liu, Xuewu; Ferrari, Mauro; Hu, Ye; Akinwande, Deji
2014-01-01
Miniaturized helix antennas are integrated with drug reservoirs to function as RFID wireless tag sensors for real-time drug dosage monitoring. The general design procedure for this type of biomedical antenna sensor is proposed based on electromagnetic theory and finite element simulation. A cost-effective fabrication process is utilized to encapsulate the antenna sensor within a biocompatible package layer using PDMS material, and at the same time form a drug storage or drug delivery unit inside the sensor. In vitro experiments on two prototypes of the antenna sensor-drug reservoir assembly have shown the ability to monitor the drug dosage by tracking the antenna resonant frequency shift within the 2.4–2.5 GHz ISM band, with a realized sensitivity of 1.27 μl/MHz for transdermal drug delivery monitoring and 2.76 μl/MHz for implanted drug delivery monitoring. PMID:27170865
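As a simple editorial illustration, the reported sensitivities translate an observed resonance shift into a delivered volume as sketched below; the 3 MHz shift is a hypothetical reading, not a measurement from the paper.

# Delivered volume inferred from an antenna resonance shift and a linear
# sensitivity in microlitres per MHz.
def delivered_volume_ul(freq_shift_mhz, sensitivity_ul_per_mhz):
    """Drug volume (uL) corresponding to a resonance shift."""
    return freq_shift_mhz * sensitivity_ul_per_mhz

shift_mhz = 3.0   # hypothetical measured shift within the 2.4-2.5 GHz band
print("transdermal reservoir: %.2f uL" % delivered_volume_ul(shift_mhz, 1.27))
print("implanted reservoir:   %.2f uL" % delivered_volume_ul(shift_mhz, 2.76))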
NASA Astrophysics Data System (ADS)
Koukouli, MariLiza; Balis, Dimitris; Simopoulos, Spiros; Siomos, Nikos; Clarisse, Lieven; Carboni, Elisa; Wang, Ping; Siddans, Richard; Marenco, Franco; Mona, Lucia; Pappalardo, Gelsomina; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus
2014-05-01
The 2010 eruption of the Icelandic volcano Eyjafjallajökull attracted the attention of the public and the scientific community to the vulnerability of the European airspace to volcanic eruptions. Major disruptions in European air traffic were observed for several weeks surrounding the two eruptive episodes, which had a strong impact on the everyday life of many Europeans as well as a noticeable economic loss of around 2-3 billion Euros in total. The eruptions made it obvious that the decision-making bodies were not informed properly and in a timely manner about the tolerance of commercial aircraft to ash-laden air, and that the ash monitoring and prediction potential was rather limited. After the Eyjafjallajökull eruptions, new guidelines for aviation, changing from zero tolerance to newly established ash threshold values, were introduced. In this spirit, the European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards called for the creation of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on improved and dedicated satellite-derived ash plume and sulphur dioxide level assessments, as well as an extensive validation using auxiliary satellite, aircraft and ground-based measurements. The validation of volcanic ash levels extracted from the sensors GOME-2/MetopA, IASI/MetopA, MODIS/Terra and MODIS/Aqua is presented in this work, with emphasis on the ash plume height and ash optical depth levels. Co-located aircraft flights, Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation [CALIPSO] soundings as well as European Aerosol Research Lidar Network [EARLINET] measurements were compared to the different satellite estimates for those two eruptive episodes. The validation results are extremely promising, with most satellite sensors performing quite well and within the estimated uncertainties compared to the comparative datasets. The findings are presented extensively here and future directions are discussed at length.
A Baseline Air Quality Assessment Onboard a Victoria Class Submarine: HMCS Windsor
2006-05-01
with the use of the Carbon Dioxide Absorption Units (CDAUs), in which two canisters were initiated in both the Fore Ends (forward of Bulkhead 34 in...monitoring equipment used onboard was also checked as a confirmation. Carbon Monoxide is produced as a result of combustion, therefore the source of...aim the study monitored the effects of: air purification capabilities (management of Oxygen (O2) and Carbon Dioxide (CO2)); routine housekeeping
Impact of Moisture Content and Grain Size on Hydrocarbon Diffusion in Porous Media
NASA Astrophysics Data System (ADS)
McLain, A. A.; Ho, C. K.
2001-12-01
Diffusion of hydrocarbon vapors in porous media can play an important role in our ability to characterize subsurface contaminants such as trichloroethylene (TCE). For example, traditional monitoring methods often rely on direct sampling of contaminated soils or vapor. These samples may be influenced by the diffusion of vapors away from the contaminant source term, such as non-aqueous-phase TCE liquid. In addition, diffusion of hydrocarbon vapors can also impact the migration and dispersion of the contaminant in the subsurface. Therefore, understanding the diffusion rates and vapor transport processes of hydrocarbons in variably-saturated, heterogeneous porous media will assist in the characterization and detection of these subsurface contaminants. The purpose of this study was to investigate the impact of soil heterogeneity and water-moisture content on the diffusion processes for TCE. A one-dimensional column experiment was used to monitor the rates of vapor diffusion through sand. Experiments were performed with different average water-moisture contents and different grain sizes. On one end of the column, a reservoir cap is used to encase the TCE, providing a constant vapor boundary condition while sealing the end. The other end of the column contains a novel microchemical sensor. The sensor employs a polymer-absorption resistor (chemiresistor) that reversibly swells and increases in resistance when exposed to hydrocarbons. Once calibrated, the chemiresistors can be used to passively monitor vapor concentrations. This unique method allows the detection of in-situ vapor concentrations without disturbing the local environment. Results are presented in the form of vapor-concentration breakthrough curves as detected by the sensor. The shape of the breakthrough curve is dependent on several key parameters, including the length of the column and parameters (e.g., water-moisture content and grain-size) that affect the effective diffusion coefficient of TCE in air. Comparisons are made between theoretical and observed breakthrough curves to evaluate the diffusion of TCE and other relevant physical processes (e.g., air-water partitioning of TCE). The relative impact of water-moisture content and grain size on the diffusion of TCE vapor in porous media is also addressed. The authors thank Bob Hughes, who developed the chemiresistor sensors, and Chad Davis, who assisted with the calibrations. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
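To illustrate the comparison between theoretical and observed breakthrough curves mentioned above, the sketch below evaluates the semi-infinite 1-D diffusion solution C(x,t)/C0 = erfc(x / (2*sqrt(D_eff*t))) with an effective diffusivity from the Millington-Quirk tortuosity model; the model choice and all parameter values are assumptions for illustration, not the study's data.

# Semi-infinite 1-D diffusion from a constant-concentration source into an
# initially clean column; finite-length and air-water partitioning effects
# are ignored in this sketch.
import math

def effective_diffusivity(d_air, air_filled_porosity, total_porosity):
    """Millington-Quirk estimate of the effective gas diffusivity (m^2/s)."""
    return d_air * air_filled_porosity**(10.0 / 3.0) / total_porosity**2

def breakthrough(x_m, t_s, d_eff):
    """Relative vapor concentration C/C0 at distance x and time t."""
    return math.erfc(x_m / (2.0 * math.sqrt(d_eff * t_s)))

# assumed free-air diffusivity of TCE and assumed soil porosities
d_eff = effective_diffusivity(d_air=8.3e-6, air_filled_porosity=0.30, total_porosity=0.38)
for hours in (6, 24, 72):
    print("t = %3d h  C/C0 = %.3f" % (hours, breakthrough(0.30, hours * 3600.0, d_eff)))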
Analog integrated circuits design for processing physiological signals.
Li, Yan; Poon, Carmen C Y; Zhang, Yuan-Ting
2010-01-01
Analog integrated circuits (ICs) designed for processing physiological signals are important building blocks of wearable and implantable medical devices used for health monitoring or restoring lost body functions. Due to the nature of physiological signals and the corresponding application scenarios, the ICs designed for these applications should have low power consumption, low cutoff frequency, and low input-referred noise. In this paper, techniques for designing the analog front-end circuits with these three characteristics will be reviewed, including subthreshold circuits, bulk-driven MOSFETs, floating gate MOSFETs, and log-domain circuits to reduce power consumption; methods for designing fully integrated low cutoff frequency circuits; as well as chopper stabilization (CHS) and other techniques that can be used to achieve a high signal-to-noise performance. Novel applications using these techniques will also be discussed.
AGR-1 Irradiation Test Final As-Run Report, Rev. 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collin, Blaise P.
2015-01-01
This document presents the as-run analysis of the AGR-1 irradiation experiment. AGR-1 is the first of eight planned irradiations for the Advanced Gas Reactor (AGR) Fuel Development and Qualification Program. Funding for this program is provided by the US Department of Energy (DOE) as part of the Next-Generation Nuclear Plant (NGNP) project. The objectives of the AGR-1 experiment are: 1. To gain experience with multi-capsule test train design, fabrication, and operation with the intent to reduce the probability of capsule or test train failure in subsequent irradiation tests. 2. To irradiate fuel produced in conjunction with the AGR fuel process development effort. 3. To provide data that will support the development of an understanding of the relationship between fuel fabrication processes, fuel product properties, and irradiation performance. In order to achieve the test objectives, the AGR-1 experiment was irradiated in the B-10 position of the Advanced Test Reactor (ATR) at the Idaho National Laboratory (INL) for a total duration of 620 effective full power days. Irradiation began on December 24, 2006 and ended on November 6, 2009, spanning 13 ATR cycles and approximately three calendar years. The test contained six independently controlled and monitored capsules. Each capsule contained 12 compacts of a single type, or variant, of the AGR coated fuel. No fuel particles failed during the AGR-1 irradiation. Final burnup values on a per-compact basis ranged from 11.5 to 19.6 %FIMA, while fast fluence values ranged from 2.21 to 4.39 × 10^25 n/m^2 (E > 0.18 MeV). Thermocouples performed well, failing at a lower rate than expected. At the end of the irradiation, nine of the originally planned 19 thermocouples (TCs) were considered functional. Fission product release-to-birth (R/B) ratios were quite low. In most capsules, R/B values at the end of the irradiation were at or below 10^-7, with only one capsule significantly exceeding this value. A maximum R/B of around 2 × 10^-7 was reached at the end of the irradiation in Capsule 5. Several shakedown issues were encountered and resolved during the first three cycles. These included the repair of minor gas line leaks; the repair of faulty gas line valves; the need to position moisture monitors in regions of low radiation fields for proper functioning; the enforcement of proper on-line data storage and backup; the need to monitor thermocouple performance; correcting for detector spectral gain shift; and a change in the mass flow rate range of the neon flow controllers.
An ultra-sensitive wearable accelerometer for continuous heart and lung sound monitoring.
Hu, Yating; Xu, Yong
2012-01-01
This paper presents a chest-worn accelerometer with high sensitivity for continuous cardio-respiratory sound monitoring. The accelerometer is based on an asymmetrical gapped cantilever composed of a bottom mechanical layer and a top piezoelectric layer separated by a gap. This novel structure increases the sensitivity by orders of magnitude compared with conventional cantilever-based accelerometers. A prototype with a resonant frequency of 1100 Hz and a total weight of 5 grams was designed, constructed, and characterized. The size of the prototype sensor is 35 mm × 18 mm × 7.8 mm (l × w × t). A built-in charge amplifier is used to amplify the output voltage of the sensor. A sensitivity of 86 V/g and a noise floor of 40 ng/√Hz were obtained. Preliminary tests for recording both cardiac and respiratory signals were carried out on the human body, and the new sensor exhibited better performance than a high-end electronic stethoscope.
Mazur, Lukasz M; Mosaly, Prithima R; Hoyle, Lesley M; Jones, Ellen L; Marks, Lawrence B
2013-01-01
To quantify and compare workload for several common physician-based treatment planning tasks using objective and subjective measures of workload, and to assess the relationship between workload and performance in order to define workload levels at which performance could be expected to decline. Nine physicians performed the same 3 tasks on each of 2 cases ("easy" vs "hard"). Workload was assessed objectively throughout the tasks (via monitoring of pupil size and blink rate) and subjectively at the end of each case (via the National Aeronautics and Space Administration Task Load Index; NASA-TLX). NASA-TLX assesses 6 dimensions (mental, physical, and temporal demands, frustration, effort, and performance); scores greater than approximately 50 are associated with reduced performance in other industries. Performance was measured using participants' stated willingness to approve the treatment plan. Differences in subjective and objective workload between cases, tasks, and experience levels were assessed using analysis of variance (ANOVA). The correlation between subjective and objective workload measures was assessed via the Pearson correlation test. The relationships between workload and performance measures were assessed using the t test. Eighteen case-wise and 54 task-wise assessments were obtained. Subjective NASA-TLX scores (P < .001), but not time-weighted averages of objective scores (P > .1), were significantly lower for the easy vs hard case. Most correlations between the subjective and objective measures were not significant, except between average blink rate and NASA-TLX scores (r = -0.34, P = .02) for task-wise assessments. Performance appeared to decline at NASA-TLX scores of ≥55. The NASA-TLX may provide a reasonable method to quantify subjective workload for broad activities, and objective physiologic eye-based measures may be useful to monitor workload for more granular tasks within activities. The subjective and objective measures, as quantified here, do not necessarily track each other, and more work is needed to assess their utility. From a series of controlled experiments, we found that performance appears to decline at subjective workload levels ≥55 (as measured via NASA-TLX), which is consistent with findings from other industries. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
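As an illustration of the statistical comparison described above, the short Python sketch below correlates a hypothetical set of task-wise blink rates with NASA-TLX scores using a Pearson test and flags assessments at or above the ≈55 level where performance appeared to decline; the data values are illustrative assumptions, not the study's actual analysis code.

```python
# Illustrative sketch (not the study's analysis code): correlate an objective
# workload proxy (mean blink rate per task) with subjective NASA-TLX scores
# and flag assessments above the performance-decline level reported here.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical task-wise data: blink rate (blinks/min) and NASA-TLX (0-100).
blink_rate = np.array([18.2, 14.5, 12.1, 20.3, 9.8, 11.4, 16.0, 8.7])
nasa_tlx = np.array([42.0, 55.5, 61.0, 38.5, 70.2, 63.4, 48.9, 74.1])

r, p = pearsonr(blink_rate, nasa_tlx)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # expect a negative r, as in the paper

# Flag assessments at or above the NASA-TLX level (~55) where performance declined.
THRESHOLD = 55.0
flagged = np.where(nasa_tlx >= THRESHOLD)[0]
print("High-workload assessments (indices):", flagged.tolist())
```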
Cook and Chill: Effect of Temperature on the Performance of Nonequilibrated Blood Glucose Meters.
Deakin, Sherine; Steele, Dominic; Clarke, Sarah; Gribben, Cathryn; Bexley, Anne-Marie; Laan, Remmert; Kerr, David
2015-08-20
Exposure to extreme temperatures can affect the performance of blood glucose monitoring systems. The aim was to determine the non-equilibrated performance of these systems at the extreme high and low temperatures that can occur in daily life. The performance of 5 test systems, (1) Abbott FreeStyle Freedom Lite, (2) Roche AccuChek Aviva, (3) Bayer Contour, (4) LifeScan OneTouch Verio, and (5) Sanofi BG Star, was compared after "cooking" (50°C for 1 hour) or "chilling" (-5°C for 1 hour) with room-temperature controls (23°C), using whole blood with glucose concentrations of 50, 100, and 200 mg/dl. The equilibration period (the time from the end of incubation to when the test system is operational) was between 1 and 8 minutes, and each test system took between 15 and 30 minutes after incubation to obtain stable measurements at room temperature. Incubating the strips at -5°C or 50°C had little effect on the glucose measurement, whereas incubating the meters introduced bias in performance between 0 and 15 minutes, but not subsequently, compared to room-temperature controls and at all 3 glucose levels. The compensating technologies embedded within the blood glucose monitoring systems studied here perform well at extreme temperatures. People with diabetes need to be alerted to this feature to avoid perceptions of malperformance of their devices and the possible inability to obtain blood glucose readings on short notice (eg, during times of suspected rapid change or before an unplanned meal). © 2015 Diabetes Technology Society.
A configurable and low-power mixed signal SoC for portable ECG monitoring applications.
Kim, Hyejung; Kim, Sunyoung; Van Helleputte, Nick; Artes, Antonio; Konijnenburg, Mario; Huisken, Jos; Van Hoof, Chris; Yazicioglu, Refet Firat
2014-04-01
This paper describes a mixed-signal ECG system-on-chip (SoC) capable of implementing configurable functionality with low power consumption for portable ECG monitoring applications. A low-voltage, high-performance analog front-end extracts 3-channel ECG signals and a single-channel electrode-tissue impedance (ETI) measurement with high signal quality, which can be used to evaluate the quality of the ECG measurement and to filter motion artifacts. A custom digital signal processor consisting of a 4-way SIMD processor provides configurability and advanced functionality such as motion artifact removal and R-peak detection. A built-in 12-bit analog-to-digital converter (ADC) is capable of adaptive sampling, achieving a compression ratio of up to 7, and loop buffer integration reduces the power consumption of on-chip memory access. The SoC is implemented in a 0.18 μm CMOS process, consumes 32 μW from a 1.2 V supply while the heartbeat detection application is running, and is integrated into a wireless ECG monitoring system using the Bluetooth protocol. Thanks to the ECG SoC, the overall system power consumption can be reduced significantly.
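For readers unfamiliar with adaptive sampling, the Python sketch below shows one common threshold-based variant: a sample is retained only when it deviates from the last retained sample by more than a fixed threshold, which compresses slowly varying baseline segments while preserving sharp features such as R peaks. The threshold, the synthetic test signal, and the overall scheme are illustrative assumptions and do not represent the SoC's actual on-chip algorithm.

```python
# Minimal sketch of threshold-based adaptive sampling, one common way to trade
# resolution for compression; this is NOT the SoC's proprietary algorithm.
import numpy as np

def adaptive_sample(signal, threshold):
    """Keep a sample only when it differs from the last kept sample by more
    than `threshold`; return kept indices and the achieved compression ratio."""
    kept = [0]
    for i in range(1, len(signal)):
        if abs(signal[i] - signal[kept[-1]]) > threshold:
            kept.append(i)
    return np.array(kept), len(signal) / len(kept)

# Hypothetical ECG-like test signal: mostly flat baseline with sharp R-peak-like spikes.
t = np.linspace(0, 2.0, 500)
ecg = 0.05 * np.sin(2 * np.pi * 1.0 * t) + (np.abs(((t * 1.2) % 1.0) - 0.5) < 0.01) * 1.0

idx, ratio = adaptive_sample(ecg, threshold=0.1)
print(f"kept {len(idx)} of {len(ecg)} samples, compression ratio ~ {ratio:.0f}")
# Expect a high ratio on this mostly flat test signal; real ECG yields lower ratios.
```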
Bornhoft, J M; Strabala, K W; Wortman, T D; Lehman, A C; Oleynikov, D; Farritor, S M
2011-01-01
The objective of this research is to study the effectiveness of using a stereoscopic visualization system for performing remote surgery. The use of stereoscopic vision has become common with the advent of the da Vinci® system (Intuitive, Sunnyvale, CA). This system creates a virtual environment consisting of a 3-D display for visual feedback and haptic (tactile) feedback, together providing an intuitive environment for remote surgical applications. This study uses simple in vivo robotic surgical devices and compares the performance of surgeons using the stereoscopic interfacing system to the performance of surgeons using conventional two-dimensional monitors. The stereoscopic viewing system consists of two cameras, two monitors, and four mirrors. The cameras are mounted to a multi-functional miniature in vivo robot and mimic the depth perception of human eyes; this is done by placing the cameras at a calculated angle and distance apart. Live video streams from the left and right cameras are displayed on the left and right monitors, respectively. A system of angled mirrors allows the left and right eyes to see the video stream from the left and right monitor, respectively, creating the illusion of depth. The haptic interface consists of two PHANTOM Omni® (SensAble, Woburn, MA) controllers. These controllers measure the position and orientation of a pen-like end effector with three degrees of freedom. As the surgeon uses this interface, they see a 3-D image and feel force feedback for collisions and workspace limits. The stereoscopic viewing system has been used in several surgical training tests and shows a potential improvement in depth perception and 3-D vision. The haptic system accurately provides force feedback that aids in surgery. Both have been used in non-survival animal surgeries and have successfully been used in suturing and gallbladder removal. Bench-top experiments using the interfacing system have also been conducted. A group of participants completed two different surgical training tasks using both a two-dimensional visual system and the stereoscopic visual system. Results suggest that the stereoscopic visual system decreased the amount of time taken to complete the tasks. All participants also reported that the stereoscopic system was easier to use than the two-dimensional system. Haptic controllers combined with stereoscopic vision provide a more intuitive virtual environment. This system provides the surgeon with 3-D vision, depth perception, and the ability to receive feedback through forces applied by the haptic controller while performing surgery. These capabilities potentially enable the performance of more complex surgeries with a higher level of precision.
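The depth cue exploited by the two-camera arrangement can be summarized with the standard stereo-geometry relation Z = f·B/d, where B is the camera separation, f the focal length in pixels, and d the horizontal disparity of a feature between the two images. The Python sketch below illustrates this relation with assumed values; it is not the calibration used for the in vivo robot's camera pair.

```python
# Sketch of the standard stereo-geometry relation behind two-camera depth
# perception: Z = f * B / d. The focal length, baseline, and disparities below
# are hypothetical values, not the robot camera pair's actual calibration.

def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Return depth in millimetres for a given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

if __name__ == "__main__":
    f_px = 800.0   # focal length in pixels (assumed)
    b_mm = 6.0     # small camera separation plausible for a miniature in vivo robot (assumed)
    for d in (40.0, 20.0, 10.0):
        print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(f_px, b_mm, d):6.1f} mm")
```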
Chernobyl NPP: Completion of LRW Treatment Plant and LRW Management on Site - 12568
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fedorov, Denis; Adamovich, Dmitry; Klimenko, I.
2012-07-01
Since the beginning of ChNPP operation, and after the tragedy of 1986, a few thousand m³ of LRW have been collected in storage tanks. In 2004, ChNPP started a new project to build an LRW treatment plant (LRWTP), financed from an EBRD fund, but it was stopped in 2008 because of financial and contract problems. In 2010, SIA RADON, jointly with Ukrainian partners, won a tender for the completion of the LRWTP, in particular its I&C system. The purpose of the LRTP is to process liquid radwaste from the SSE 'Chernobyl NPP' site and the liquids stored in the LRWS and SLRWS tanks, as well as future wastes from the decommissioning of ChNPP Power Units 1, 2 and 3. The LRTP design lifetime is 20 years. Currently, the LRTP is being prepared to perform the following activities: 1. retrieval of waste from tanks stored at the ChNPP LWS using a waste retrieval system involving existing equipment; 2. transfer of retrieved waste into LRTP reception tanks with partial use of existing transfer pipelines; 3. laboratory chemical and radiochemical analysis of the reception tank contents to define the full spectrum of characteristics before processing, to confirm the necessity of preliminary processing, and to select the end product recipe; 4. preliminary processing of the waste to meet the requirements of further stages of the process; 5. shrinkage (concentrating) of the preliminarily processed waste; 6. solidification of the preliminarily processed waste with concrete to make a solid end product and loading of the concrete compound into 200-l drums; 7. curing of end product drums in the LRTP curing hall; 8. radiologic monitoring of end product drums and their loading into special overpacks; 9. overpack radiological monitoring; 10. sending for disposal (ICSRM Lot 3). The current technical decisions allow process media and supporting-system outputs to be controlled and returned to ChNPP until they satisfy the following quality norms: salt content: < 100 g/l; pH: 1 - 11; anionic surface-active agent: < 25 mg/l; oil dissipated in the liquid: < 2 mg/l; overall gamma activity: < 3.7 × 10⁵ Bq/l. (authors)
Steinhubl, Steven R; Mehta, Rajesh R; Ebner, Gail S; Ballesteros, Marissa M; Waalen, Jill; Steinberg, Gregory; Van Crocker, Percy; Felicione, Elise; Carter, Chureen T; Edmonds, Shawn; Honcz, Joseph P; Miralles, Gines Diego; Talantov, Dimitri; Sarich, Troy C; Topol, Eric J
2016-05-01
Efficient methods for screening populations for undiagnosed atrial fibrillation (AF) are needed to reduce its associated mortality, morbidity, and costs. The use of digital technologies, including wearable sensors and large health record data sets allowing targeted outreach toward individuals at increased risk for AF, might offer unprecedented opportunities for effective, economical screening. The trial's primary objective is to determine, in a real-world setting, whether using wearable sensors in a risk-targeted screening population can diagnose asymptomatic AF more effectively than routine care. Additional key objectives include (1) exploring 2 rhythm-monitoring strategies, electrocardiogram-based and exploratory pulse wave-based, for detection of new AF, and (2) comparing long-term clinical and resource outcomes among groups. In all, 2,100 Aetna members will be randomized 1:1 to either immediate or delayed monitoring, in which a wearable patch will capture a single-lead electrocardiogram during the first and last 2 weeks of a 4-month period beginning immediately or 4 months after enrollment, respectively. An observational, risk-factor-matched control group (n = 4,000) will be developed from members who did not receive an invitation to participate. The primary end point is the incidence of new AF in the immediate- vs delayed-monitoring arms at the end of the 4-month monitoring period. Additional efficacy and safety end points will be captured at 1 and 3 years. The results of this digital medicine trial might benefit a substantial proportion of the population by helping identify and refine screening methods for undiagnosed AF. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
A Modular IoT Platform for Real-Time Indoor Air Quality Monitoring
Abdaoui, Abderrazak; Ahmad, Sabbir H.M.; Touati, Farid; Kadri, Abdullah
2018-01-01
The impact of air quality on health and on life comfort is well established. In many societies, vulnerable elderly and young populations spend most of their time indoors, so indoor air quality monitoring (IAQM) is of great importance to human health. Engineers and researchers are increasingly focusing their efforts on the design of real-time IAQM systems using wireless sensor networks. This paper presents an end-to-end IAQM system enabling measurement of CO2, CO, SO2, NO2, O3, Cl2, ambient temperature, and relative humidity. In IAQM systems, remote users usually rely on a local gateway to connect wireless sensor nodes at a given monitoring site to the external world for ubiquitous access to data. In this work, the role of the gateway in processing the collected air quality data and in reliably disseminating it to end-users through a web server is emphasized. A mechanism for the backup and restoration of the collected data in the case of an Internet outage is presented. The system is adapted to an open-source Internet-of-Things (IoT) web-server platform, called Emoncms, for live monitoring and long-term storage of the collected IAQM data. A modular IAQM architecture is adopted, resulting in a smart, scalable system that allows seamless integration of various sensing technologies, wireless sensor networks (WSNs), and smart mobile standards. The paper gives full hardware and software details of the proposed solution. Sample IAQM results collected in various locations are also presented to demonstrate the capabilities of the system. PMID:29443893
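As a concrete illustration of the gateway-to-server path, the Python sketch below pushes one batch of IAQM readings to an Emoncms-style HTTP input endpoint. The server URL, node name, API key, and the exact endpoint parameters (/input/post with node, fulljson, and apikey) are assumptions that should be checked against the deployed Emoncms version.

```python
# Minimal sketch of a gateway pushing one set of IAQM readings to an
# Emoncms-style HTTP input API. The endpoint path, node id, and API key are
# placeholders; verify the parameter names against your server version.
import json
import requests

EMONCMS_URL = "http://emoncms.example.org/input/post"  # hypothetical server
API_KEY = "YOUR_WRITE_API_KEY"                         # placeholder

def post_readings(node: str, readings: dict) -> bool:
    """Send one batch of sensor readings; return True on HTTP success."""
    params = {
        "node": node,
        "fulljson": json.dumps(readings),
        "apikey": API_KEY,
    }
    try:
        resp = requests.get(EMONCMS_URL, params=params, timeout=5)
        return resp.ok
    except requests.RequestException:
        # On an Internet outage the gateway would buffer locally and retry later,
        # mirroring the backup/restore mechanism described in the paper.
        return False

if __name__ == "__main__":
    sample = {"co2_ppm": 612, "co_ppm": 0.4, "no2_ppb": 21, "temp_c": 24.3, "rh_pct": 47}
    print("posted:", post_readings("lab-node-1", sample))
```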
Besseling, Rut; Damen, Michiel; Tran, Thanh; Nguyen, Thanh; van den Dries, Kaspar; Oostra, Wim; Gerich, Ad
2015-10-10
Dry powder mixing is a widespread unit operation in the pharmaceutical industry. With the advent of in-line near-infrared (NIR) spectroscopy and Quality by Design principles, the application of Process Analytical Technology to monitor blend uniformity (BU) is taking a more prominent role. Yet routine use of NIR for monitoring, let alone control, of blending processes is not common in the industry, despite the improved process understanding and (cost) efficiency that it may offer. Method maintenance, robustness, and translation to regulatory requirements have been important barriers to implementing the method. This paper presents a qualitative NIR-BU method offering a convenient and compliant approach to applying BU control in routine operation and process understanding, without extensive calibration and method maintenance requirements. The method employs a moving F-test to detect the steady state of measured spectral variances and hence the endpoint of mixing. The fundamentals and performance characteristics of the method are first presented, followed by a description of the link to regulatory BU criteria, the method sensitivity, and practical considerations. Applications in upscaling, tech transfer, and commercial production are described, along with an evaluation of the method performance by comparison with results from quantitative calibration models. A full application, in which end-point detection via the F-test controls the blending process of a low-dose product, was successfully filed in Europe and Australia, implemented in commercial production, and routinely used for about five years and more than 100 batches. Copyright © 2015 Elsevier B.V. All rights reserved.
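A simplified reading of the moving F-test idea is sketched below in Python: the variance of the spectral-variance signal in the current window is compared against the preceding window with an F criterion, and the end-point is declared once the ratio stays below the critical value for several consecutive windows. The window length, significance level, stability count, and synthetic data are illustrative assumptions, not the authors' validated settings.

```python
# Minimal sketch of a moving F-test for blend end-point detection: compare the
# variability in the current window against the previous window and call
# steady state once the variance ratio stays below the F critical value for a
# few consecutive windows.
import numpy as np
from scipy.stats import f as f_dist

def moving_f_endpoint(variances, window=10, alpha=0.05, stable_needed=3):
    """Return the index at which mixing is declared complete, or None."""
    crit = f_dist.ppf(1 - alpha, window - 1, window - 1)
    stable = 0
    for i in range(2 * window, len(variances) + 1):
        prev = variances[i - 2 * window:i - window]
        curr = variances[i - window:i]
        v_prev, v_curr = np.var(prev, ddof=1), np.var(curr, ddof=1)
        ratio = max(v_prev, v_curr) / min(v_prev, v_curr)  # two-sided comparison
        stable = stable + 1 if ratio < crit else 0
        if stable >= stable_needed:
            return i - 1
    return None

# Synthetic example: spectral variance decays toward a steady plateau plus noise.
rng = np.random.default_rng(0)
n = 120
series = 1.0 * np.exp(-np.arange(n) / 15.0) + 0.02 + 0.005 * rng.standard_normal(n)
print("end-point detected at spectrum index:", moving_f_endpoint(series))
```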
Use of Lightweight Cellular Mats to Reduce the Settlement of Structure on Soft Soil
NASA Astrophysics Data System (ADS)
Ganasan, R.; Lim, A. J. M. S.; Wijeyesekera, D. C.
2016-07-01
Construction of structures on soft soils gives rise to difficulties in Malaysia and other countries, especially settlement in both the short and long term. The focus of this research is to minimize differential and non-uniform settlement on peat soil with the use of an innovative cellular mat. The behaviour and performance of the lightweight geo-material (in block form) are critically investigated, in particular its use as fill in an embankment on soft ground. Hemic peat soil, sponge, and the innovative cellular mat are the main materials used in this study. Settlement behaviour in this part of the research is monitored through laboratory testing only. The uneven settlement was uniquely monitored photographically using spot markers. At the end of the research, the innovative cellular mat was seen to reduce excessive and differential settlement by up to 50% compared to flexible and rigid foundations. This improved the stiffness of the soil, while the porous cellular structure helped allow water/moisture to flow in and out, preventing a floating condition.
Cyber Security and Resilient Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert S. Anderson
2009-07-01
The Department of Energy (DOE) Idaho National Laboratory (INL) has become a center of excellence for critical infrastructure protection, particularly in the field of cyber security. It is one of only a few national laboratories that have enhanced the nation's cyber security posture by performing industrial control system (ICS) vendor assessments as well as user on-site assessments. Not only are vulnerabilities discovered, but actions for enhancing security are suggested, both on a system-specific basis and from a general perspective of identifying common weaknesses and their corresponding corrective actions. These cyber security programs have performed over 40 assessments to date, which have led to more robust, secure, and resilient monitoring and control systems for the US electrical grid, oil and gas, chemical, transportation, and many other sectors. In addition to the cyber assessments themselves, the INL has been engaged in outreach to the ICS community through vendor forums, technical conferences, vendor user groups, and other special engagements as requested. Training programs have been created to help educate all levels of management and workers alike, with an emphasis on real, everyday cyber hacking methods and techniques, including typical exploits that are used. The asset owner or end user has many products created from these programs available for its use. One outstanding product is the US Department of Homeland Security (DHS) Cyber Security Procurement Language for Control Systems document, which provides insight to the user when specifying a new monitoring and control system, particularly concerning security requirements. Employing some of the top cyber researchers in the nation, the INL can leverage this talent toward many applications other than critical infrastructure. Monitoring and control systems are used throughout the world to perform tasks ranging from simple ones, such as cooking in a microwave, to complex ones, such as the monitoring and control of next-generation fighter jets or nuclear material safeguards systems in complex nuclear fuel cycle facilities. It is the intent of this paper to describe the cyber security programs that are currently in place, the experiences and successes achieved in industry, including outreach and training, and suggestions about how other sectors and organizations can leverage this national expertise to help their monitoring and control systems become more secure.
NASA Astrophysics Data System (ADS)
Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey
2015-04-01
A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created using dedicated structures, such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact-model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well placed too close to a P-well. Building on this example, we used this method extensively for process monitoring and variability analyses to analyze transistor gates having different shapes. In addition, an analysis of a large area of a high-density standard cell library was done. Another set of monitoring, focused on a high-density SRAM array, is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted using contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data demonstrate the successful in-field implementation of our methodology as a useful process monitoring method.
Bernal-Delgado, Enrique; Estupiñán-Romero, Francisco
2018-01-01
The integration of different administrative data sources from a number of European countries has been shown to be useful in the assessment of unwarranted variations in health care performance. This essay describes the procedures used to set up a data infrastructure (e.g., data access and exchange, definition of the minimum common set of data required, and the development of the relational logic data model) and the methods used to produce trustworthy healthcare performance measurements (e.g., standardisation of ontologies and quality assurance analysis). The paper ends by providing some hints on how to use these lessons in an eventual European infrastructure for public health research and monitoring. Although the relational data infrastructure developed has proven accurate, effective for comparing health system performance across different countries, and efficient enough to deal with hundreds of millions of episodes, the logic data model might not be responsive if the European infrastructure aims to include electronic health records and to carry out multi-cohort, multi-intervention comparative effectiveness research. The deployment of a distributed infrastructure based on semantic interoperability, where individual data remain in-country and open-access scripts for data management and analysis travel around the hubs composing the infrastructure, might be a sensible way forward.
Ait Ouarabi, Mohand; Antonaci, Paola; Boubenider, Fouad; Gliozzi, Antonio S; Scalerandi, Marco
2017-01-07
Alkaline solutions, such as sodium, potassium or lithium silicates, appear to be very promising as healing agents for the development of encapsulated self-healing concretes. However, the evolution of their mechanical and acoustic properties in time has not yet been completely clarified, especially regarding their behavior and related kinetics when they are used in the form of a thin layer in contact with a hardened cement matrix. This study aims to monitor, using linear and nonlinear ultrasonic methods, the evolution of a sodium silicate solution interacting with a cement matrix in the presence of localized cracks. The ultrasonic inspection via linear methods revealed that an almost complete recovery of the elastic and acoustic properties occurred within a few days of healing. The nonlinear ultrasonic measurements contributed to provide further insight into the kinetics of the recovery due to the presence of the healing agent. A good regain of mechanical performance was ascertained through flexural tests at the end of the healing process, confirming the suitability of sodium silicate as a healing agent for self-healing cementitious systems.
Pros and cons of body mass index as a nutritional and risk assessment tool in dialysis patients.
Carrero, Juan Jesús; Avesani, Carla Maria
2015-01-01
Obesity is a problem of serious concern among chronic kidney disease (CKD) patients; it is a risk factor for progression to end-stage renal disease, and its incidence and prevalence in dialysis patients exceed those of the general population. Obesity, typically assessed with the simple metric of body mass index (BMI), is considered a mainstay of nutritional assessment in guidelines on nutrition in CKD. While regular BMI assessment in connection with the dialysis session is a simple and easy-to-use monitoring tool, such ease of access can lead to overuse, as the value of this metric to health care professionals is overestimated. This review examines BMI as a clinical monitoring tool in CKD practice and offers a critical appraisal of what a high or a low BMI may signify in this patient population. Topics discussed include the utility of BMI as a reflection of body size, body composition, and body fat distribution; diagnostic versus prognostic performance; and consideration of temporal trends over single assessments. © 2014 Wiley Periodicals, Inc.
HABs Monitoring and Prediction
Monitoring techniques for harmful algal blooms (HABs) vary across temporal and spatial domains. Remote satellite imagery provides information on water quality at relatively broad spatial and lengthy temporal scales. At the other end of the spectrum, local in-situ monitoring tec...
Remote Arrhythmia Monitoring System Developed
NASA Technical Reports Server (NTRS)
York, David W.; Mackin, Michael A.; Liszka, Kathy J.; Lichter, Michael J.
2004-01-01
Telemedicine is taking a step forward with the efforts of team members from the NASA Glenn Research Center, the MetroHealth campus of Case Western University, and the University of Akron. The Arrhythmia Monitoring System is a completed, working test bed developed at Glenn that collects real-time electrocardiogram (ECG) signals from a mobile or homebound patient, combines these signals with global positioning system (GPS) location data, and transmits them to a remote station for display and monitoring. Approximately 300,000 Americans die every year from sudden heart attacks, which are arrhythmia cases. However, not all patients identified as at risk for arrhythmias can be monitored continuously because of technological and economic limitations. Such patients, who are at moderate risk of arrhythmias, would benefit from technology that permits long-term continuous monitoring of electrical cardiac rhythms outside the hospital environment. Embedded Web Technology, developed at Glenn to remotely command and collect data from embedded systems using Web technology, is the catalyst for this new telemetry system (ref. 1). In the end-to-end system architecture, ECG signals are collected from a patient using an event recorder and are transmitted to a handheld personal digital assistant (PDA) using Bluetooth, a short-range wireless technology. The PDA concurrently tracks the patient's location via a connection to a GPS receiver. A long-distance link is established via a standard Internet connection over a 2.5-generation Global System for Mobile Communications/General Packet Radio Service (GSM/GPRS) cellular wireless infrastructure. The digital signal is then transmitted to a call center for monitoring by medical professionals.
Chen, Po-Yin; Hsieh, Wan-Ling; Wei, Shun-Hwa; Kao, Chung-Lan
2012-10-09
Peripheral vestibular hypofunction is a major cause of dizziness. When complicated by postural imbalance, this condition can lead to an increased incidence of falls. In traditional clinical practice, gaze stabilization exercise is commonly used to rehabilitate patients. In this study, we established a computer-aided vestibular rehabilitation system by coupling infrared LEDs to an infrared receiver. This system enabled the subjects' head-turning actions to be quantified, and the training was performed using vestibular exercise combined with computer games and interactive video games that simulate daily life activities. Three patients with unilateral and one with bilateral vestibular hypofunction volunteered to participate in this study. The participants received 30 minutes of computer-aided vestibular rehabilitation training 2 days per week for 6 weeks. Pre-training and post-training assessments were completed, and a follow-up assessment was completed 1 month after the end of the training period. After 6 weeks of training, significant improvements in balance and dynamic visual acuity (DVA) were observed in the four participants. Self-reports of dizziness, anxiety, and depressed mood all decreased significantly. Significant improvements in self-confidence and physical performance were also observed. The effectiveness of this training was maintained for at least 1 month after the end of the training period. Real-time monitoring of training performance can be achieved using this rehabilitation platform. Patients demonstrated a reduction in dizziness symptoms after 6 weeks of training with this short-term interactive game approach. This treatment paradigm also improved the patients' balance function. This system could provide a convenient, safe, and affordable treatment option for clinical practitioners.
Alinejad, Ali; Istepanian, R S H; Philip, N
2012-01-01
The concept of 4G health will be one of the key focus areas of future m-health research and enterprise activities in the coming years. WiMAX is one of the constituent 4G wireless technologies providing broadband wireless access (BWA). Although WiMAX can provide a high data rate over a relatively large coverage area, the technology has specific limitations, such as coverage, signal attenuation due to shadowing or path loss, and limited available spectrum. The IEEE 802.16j mobile multihop relay (MMR) technology is a pragmatic solution designed to overcome these limitations; its aim is to expand IEEE 802.16e's capabilities with multihop features. In particular, the uplink (UL) and downlink (DL) subframe allocation in a WiMAX network is usually fixed. However, dynamic frame allocation is a useful mechanism to optimize the uplink and downlink subframe sizes dynamically based on traffic conditions through real-time traffic monitoring. This mechanism is important for future WiMAX-based m-health applications as it allows a tradeoff between the UL and DL channels. In this paper, we address the dynamic frame allocation issue in IEEE 802.16j MMR networks for m-health applications. A comparative performance analysis of the proposed approach is validated using the OPNET Modeler®. The simulation results show improved resource allocation and end-to-end delay performance for a typical medical video streaming application.
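The core of the dynamic allocation idea can be sketched as a simple proportional split of a fixed per-frame symbol budget driven by monitored UL and DL backlog, as in the Python fragment below; the symbol counts, bounds, and byte figures are assumptions for illustration and do not reproduce the 802.16j parameters or the OPNET model used in the paper.

```python
# Minimal sketch of the dynamic UL/DL subframe split idea: monitor uplink and
# downlink backlog (e.g., queued bytes) and re-partition a fixed number of
# OFDMA data symbols per frame in proportion to demand, within min/max bounds.

TOTAL_SYMBOLS = 48     # data symbols per frame (assumed)
MIN_SYMBOLS = 8        # guard minimum per direction (assumed)

def allocate_subframes(ul_backlog_bytes: int, dl_backlog_bytes: int) -> tuple[int, int]:
    """Return (dl_symbols, ul_symbols) proportional to monitored traffic."""
    total = ul_backlog_bytes + dl_backlog_bytes
    if total == 0:
        dl = TOTAL_SYMBOLS // 2
    else:
        dl = round(TOTAL_SYMBOLS * dl_backlog_bytes / total)
    dl = max(MIN_SYMBOLS, min(TOTAL_SYMBOLS - MIN_SYMBOLS, dl))
    return dl, TOTAL_SYMBOLS - dl

if __name__ == "__main__":
    # Uplink-heavy case, typical of a patient streaming medical video to a clinician.
    print(allocate_subframes(ul_backlog_bytes=900_000, dl_backlog_bytes=100_000))
    # Balanced case.
    print(allocate_subframes(ul_backlog_bytes=250_000, dl_backlog_bytes=250_000))
```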
VLF Radio Field Strength Measurement of power line carrier system in San Diego, California
NASA Technical Reports Server (NTRS)
Mertel, H. K.
1981-01-01
The radio frequency interference (RFI) potential was evaluated for a Power Line Carrier (PLC) system installed in San Diego that monitors the performance of an electrical power system. The PLC system generated 30 amperes at 5.79 kHz. The RF radiation was measured to be (typically) 120 dBuV/m at the beginning of the 12 kV power line and 60 dBuV/m at the end of the power line. The RF fields varied inversely as the distance squared. Measurements were also performed with a 45 kHz PLC system; its RF fields were of similar amplitude.
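The reported inverse-square behaviour translates into a 40 dB drop per decade of distance, E(d) = E(d0) - 40·log10(d/d0) in decibel units; the small Python sketch below applies this relation to an assumed reference level and distances, which are not measurement data from the survey.

```python
# Small sketch of the reported inverse-square behaviour: if the field strength
# falls as 1/d^2, then in decibel units E(d) = E(d0) - 40*log10(d/d0). The
# reference level and distances below are illustrative assumptions only.
import math

def field_dbuv_per_m(e0_dbuv: float, d0_m: float, d_m: float) -> float:
    """Extrapolate field strength assuming E proportional to 1/d^2."""
    return e0_dbuv - 40.0 * math.log10(d_m / d0_m)

if __name__ == "__main__":
    e0, d0 = 120.0, 10.0   # assumed 120 dBuV/m measured at 10 m from the line
    for d in (10.0, 30.0, 100.0):
        print(f"d = {d:6.1f} m -> {field_dbuv_per_m(e0, d0, d):6.1f} dBuV/m")
```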
A Flexible System for Simulating Aeronautical Telecommunication Network
NASA Technical Reports Server (NTRS)
Maly, Kurt; Overstreet, C. M.; Andey, R.
1998-01-01
At Old Dominion University, we have built an Aeronautical Telecommunication Network (ATN) simulator, with NASA as the funding provider. It provides a means to evaluate the impact of modified router scheduling algorithms on network efficiency, to perform capacity studies on various network topologies, and to monitor and study various aspects of the ATN through a graphical user interface (GUI). In this paper we briefly describe the proposed ATN model and our abstraction of it. We then describe our simulator architecture, highlighting some of the design specifications, scheduling algorithms, and the user interface. Finally, we provide the results of performance studies on this simulator.
Tolerance analysis of null lenses using an end-use system performance criterion
NASA Astrophysics Data System (ADS)
Rodgers, J. Michael
2000-07-01
An effective method of assigning tolerances to a null lens is to determine the effects of null-lens fabrication and alignment errors on the end-use system itself, not simply the null lens. This paper describes a method to assign null-lens tolerances based on their effect on any performance parameter of the end-use system.
USDA-ARS?s Scientific Manuscript database
Wheat kernel texture dictates U.S. wheat market class and culinary end-uses. Of interest to wheat breeders is to identify quantitative trait loci (QTL) for wheat kernel texture, milling performance, or end-use quality because it is imperative for wheat breeders to ascertain the genetic architecture ...
Compact, Miniature MMIC Receiver Modules for an MMIC Array Spectrograph
NASA Technical Reports Server (NTRS)
Kangaslahti, Pekka P.; Gaier, Todd C.; Cooperrider, Joelle T.; Samoska, Lorene A.; Soria, Mary M.; ODwyer, Ian J.; Weinreb, Sander; Custodero, Brian; Owen, Heahter; Grainge, Keith;
2009-01-01
A single-pixel prototype of a W-band detector module with a digital back-end was developed to serve as a building block for large focal-plane arrays of monolithic millimeter-wave integrated circuit (MMIC) detectors. The module uses low-noise amplifiers, diode-based mixers, and a WR10 waveguide input with a coaxial local oscillator. State-of-the-art InP HEMT (high electron mobility transistor) MMIC amplifiers at the front end provide approximately 40 dB of gain. The measured noise temperature of the module, at an ambient temperature of 300 K, was found to be as low as 450 K at 95 GHz. The modules will be used to develop multiple instruments for astrophysics radio telescopes, both on the ground and in space. The prototype is being used by Stanford University to characterize noise performance at cryogenic temperatures. The goal is to achieve a 30-50 K noise temperature around 90 GHz when cooled to a 20 K ambient temperature. Further developments include characterization of the IF in-phase (I) and quadrature (Q) signals as a function of frequency to check amplitude and phase; replacing the InP low-noise amplifiers with state-of-the-art 35-nm-gate-length NGC low-noise amplifiers; interfacing the front-end module with a digital back-end spectrometer; and developing a scheme for local oscillator and IF distribution in a future array. While this MMIC is being developed for use in radio astronomy, it has the potential for use in other industries. Applications include automotive radar (both transmitters and receivers), communication links, radar systems for collision avoidance, production monitors, ground-penetrating sensors, and wireless personal networks.
Zabel, Markus; Müller-Riemenschneider, Falk; Geller, J Christoph; Brachmann, Johannes; Kühlkamp, Volker; Dissmann, Rüdiger; Reinhold, Thomas; Roll, Stephanie; Lüthje, Lars; Bode, Frank; Eckardt, Lars; Willich, Stefan N
2014-10-01
Implantable cardioverter defibrillator (ICD) remote follow-up and ICD remote monitoring (RM) are established means of ICD follow-up. The reduction in the number of in-office visits and in the time to decision is proven, but the true clinical benefit is still unknown. Cost and cost-effectiveness of RM remain leading issues for its dissemination. The MONITOR-ICD study has been designed to assess the costs, cost-effectiveness, and clinical benefits of RM versus standard-care follow-up in a prospective, multicenter, randomized controlled trial. Patients indicated for a single- or dual-chamber ICD are eligible for the study and are implanted with an RM-capable Biotronik ICD (Lumax VR-T or Lumax DR-T; Biotronik SE & Co KG, Berlin, Germany). ICD programming and alert-based clinical responses in the RM group are highly standardized by protocol. As of December 2011, recruitment has been completed and 416 patients have been enrolled. Subjects are followed up for a minimum of 12 months and a maximum of 24 months, ending in January 2013. Disease-specific costs from a societal perspective have been defined as the primary end point and will be compared between the RM and standard-care groups. Secondary end points include ICD shocks (appropriate and inappropriate), cardiovascular hospitalizations, cardiovascular mortality, and additional health economic end points. The MONITOR-ICD study will be an important randomized RM study to report data on a primary economic end point in 2014. Its results on ICD shocks will add to the currently available evidence on the clinical benefit of RM. Copyright © 2014 Mosby, Inc. All rights reserved.
Kutty, Shelby; Li, Ling; Polak, Amanda; Gribben, Paul; Danford, David A
2012-03-15
The systemic right ventricle (RV) in congenital heart disease is susceptible to progressive dilation and dysfunction. A 2-dimensional echocardiographic means for serial monitoring of the RV would be of great value in this clinical setting. We used 2-dimensional echocardiography with knowledge-based reconstruction (2DE-KBR) for evaluation of systemic RV. Patients with d-transposition of great arteries repaired with an atrial switch and without implanted pacemakers were prospectively recruited for same-day 2DE-KBR and cardiac magnetic resonance (CMR) imaging. RV images were acquired in various 2-dimensional imaging planes using a 3-dimensional space-localizing device attached to the imaging transducer and 3-dimensional reconstruction was performed. RV end-diastolic volume, end-systolic volume, and ejection fraction (EF) were calculated and compared to volumetric CMR analysis. Fifteen patients (7 women, 8 men, 24 ± 7 years old, weight 67 ± 12 kg) were studied. There was good agreement of 2DE-KBR and CMR measurements. Mean RV end-diastolic volume was 221 ± 39 ml with 2DE-KBR and 231 ± 35 ml with CMR (r = 0.80); mean end-systolic volume was 129 ± 35 ml with KBR and 132 ± 30 ml with CMR (r = 0.82), and EF was 42 ± 10% with KBR and 43 ± 7% with CMR (r = 0.86). For 2DE-KBR mean interobserver variabilities were 4.6%, 2.6%, and 4.3%; intraobserver variabilities were 3.2%, 3.1%, and 2.3%, respectively, for end-diastolic volume, end-systolic volume, and EF. In conclusion, this study demonstrates the clinical feasibility of quantifying systemic RV volumes and function using 2DE-KBR in adolescents and young adults with repaired d-transposition of great arteries and good agreement of measurements with CMR. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Moghadas, Davood; Jadoon, Khan Zaib; McCabe, Matthew F.
2017-12-01
Monitoring spatiotemporal variations of soil water content (θ) is important across a range of research fields, including agricultural engineering, hydrology, meteorology, and climatology. Low-frequency electromagnetic induction (EMI) systems have proven to be useful tools for mapping soil apparent electrical conductivity (σa) and soil moisture. However, obtaining depth-profile water content is an area that has not been fully explored using EMI. To examine this, we performed time-lapse EMI measurements using a CMD Mini-Explorer sensor along a 10 m transect of a maize field over a 6-day period. Reference data were measured at the end of the profile via an excavated pit using 5TE capacitance sensors. In order to derive a time-lapse, depth-specific subsurface image of electrical conductivity (σ), we applied a probabilistic sampling approach, DREAM(ZS), to the measured EMI data. The inversely estimated σ values were subsequently converted to θ using the Rhoades et al. (1976) petrophysical relationship. The uncertainties in the measured σa, as well as inaccuracies in the inverted data, introduced some discrepancies between the estimated σ and the reference values in time and space. Moreover, the disparity between the measurement footprints of the 5TE and CMD Mini-Explorer sensors also led to differences. The obtained θ permitted accurate monitoring of the spatiotemporal distribution and variation of soil water content due to root water uptake and evaporation. The proposed EMI measurement and modeling technique also allowed temporal root-zone soil moisture variations to be detected. The time-lapse θ monitoring approach developed using DREAM(ZS) thus appears to be a useful technique for understanding spatiotemporal patterns of soil water content and for providing insights into linked soil moisture-vegetation processes and the dynamics of soil moisture/infiltration processes.
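To make the σ-to-θ conversion step concrete, the Python sketch below inverts a Rhoades-type relationship of the form σ = σw(aθ² + bθ) + σs for θ using the quadratic formula. The coefficients a and b, the pore-water conductivity σw, and the surface conductivity σs are placeholder values that would normally be calibrated against the reference probes, and the exact functional form used in the paper may differ.

```python
# Sketch of converting bulk electrical conductivity (sigma) to volumetric water
# content (theta) with a Rhoades-type petrophysical model,
#   sigma = sigma_w * (a*theta**2 + b*theta) + sigma_s,
# solved for theta with the quadratic formula. All coefficients are placeholders
# that would be calibrated against reference probes (here, the 5TE sensors).
import numpy as np

def theta_from_sigma(sigma, sigma_w=1.5, sigma_s=0.02, a=1.3, b=-0.1):
    """Return volumetric water content for bulk conductivity sigma (S/m)."""
    A = sigma_w * a
    B = sigma_w * b
    C = sigma_s - np.asarray(sigma, dtype=float)
    disc = B**2 - 4.0 * A * C
    theta = (-B + np.sqrt(np.maximum(disc, 0.0))) / (2.0 * A)
    return np.clip(theta, 0.0, 1.0)

if __name__ == "__main__":
    inverted_sigma = np.array([0.05, 0.10, 0.20, 0.35])  # hypothetical inverted profile (S/m)
    print(theta_from_sigma(inverted_sigma))
```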
Trends in use of electronic nicotine delivery systems by adolescents.
Camenga, Deepa R; Delmerico, Jennifer; Kong, Grace; Cavallo, Dana; Hyland, Andrew; Cummings, K Michael; Krishnan-Sarin, Suchitra
2014-01-01
Electronic nicotine delivery systems (ENDS) have been gaining in popularity. The few prevalence studies in adults have found that most ENDS users are current or former smokers. The objectives of this study were to estimate the prevalence of ENDS use in adolescents and to examine the correlates of use. Self-administered written surveys assessing tobacco use behaviors were conducted in multiple waves as part of a larger intervention study in two large suburban high schools. The prevalence of past-30-day ENDS use increased from 0.9% in February 2010 to 2.3% in June 2011 (p=0.009). Current cigarette smokers had increased odds of past-30-day ENDS use in all study waves. When adjusted for school, grade, sex, race, and smoking status, students in October 2010 (adjusted OR 2.12; 95% confidence interval (CI): 1.12-4.02) and June 2011 (adjusted OR 2.51; 95% CI: 1.17-4.71) had increased odds of past-30-day ENDS use compared to February 2010. The prevalence of ENDS use doubled in this sample of high school students, and current cigarette smoking is the strongest predictor of current use. Continued monitoring of ENDS is needed to determine whether it increases the likelihood of cigarette smoking initiation and maintenance in youth. © 2013.
Epstein, Richard H; Dexter, Franklin; Hofer, Ira S; Rodriguez, Luis I; Schwenk, Eric S; Maga, Joni M; Hindman, Bradley J
2018-02-01
Perioperative hypothermia may increase the incidences of wound infection, blood loss, transfusion, and cardiac morbidity. US national quality programs for perioperative normothermia specify the presence of at least 1 "body temperature" ≥35.5°C during the interval from 30 minutes before to 15 minutes after the anesthesia end time. Using data from 4 academic hospitals, we evaluated timing and measurement considerations relevant to the current requirements to guide hospitals wishing to report perioperative temperature measures using electronic data sources. Anesthesia information management system databases from 4 hospitals were queried to obtain intraoperative temperatures and intervals to the anesthesia end time from discontinuation of temperature monitoring, end of surgery, and extubation. Inclusion criteria included age >16 years, use of a tracheal tube or supraglottic airway, and case duration ≥60 minutes. The end-of-case temperature was determined as the maximum intraoperative temperature recorded within 30 minutes before the anesthesia end time (ie, the temperature that would be used for reporting purposes). The fractions of cases with intervals >30 minutes between the last intraoperative temperature and the anesthesia end time were determined. Among the hospitals, averages (binned by quarters) of 34.5% to 59.5% of cases had intraoperative temperature monitoring discontinued >30 minutes before the anesthesia end time. Even if temperature measurement had been continued until extubation, averages of 5.9% to 20.8% of cases would have exceeded the allowed 30-minute window. Averages of 8.9% to 21.3% of cases had end-of-case intraoperative temperatures <35.5°C (ie, a quality measure failure). Because of timing considerations, a substantial fraction of cases would have been ineligible to use the end-of-case intraoperative temperature for national quality program reporting. Thus, retrieval of postanesthesia care unit temperatures would have been necessary. A substantive percentage of cases had end-of-case intraoperative temperatures below the 35.5°C threshold, also requiring postoperative measurement to determine whether the quality measure was satisfied. Institutions considering reporting national quality measures for perioperative normothermia should consider the technical and logistical issues identified to achieve a high level of compliance based on the specified regulatory language.
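The end-of-case logic described above can be expressed compactly, as in the Python sketch below: the reportable temperature is the maximum reading in the 30 minutes before the anesthesia end time, a case with no reading in that window would need a postanesthesia care unit temperature, and a reportable temperature below 35.5°C counts as a measure failure. The record structure and example values are hypothetical, not drawn from the four hospitals' databases.

```python
# Sketch of the end-of-case temperature logic: take the maximum intraoperative
# temperature recorded in the 30 minutes before the anesthesia end time, then
# classify the case. Field names and example readings are hypothetical.
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=30)
THRESHOLD_C = 35.5

def end_of_case_status(temps, anesthesia_end):
    """temps: list of (timestamp, temp_C); returns 'pass', 'fail', or 'no_eligible_temp'."""
    in_window = [t for ts, t in temps if anesthesia_end - WINDOW <= ts <= anesthesia_end]
    if not in_window:
        return "no_eligible_temp"   # would need a PACU temperature for reporting
    return "pass" if max(in_window) >= THRESHOLD_C else "fail"

if __name__ == "__main__":
    end = datetime(2017, 3, 1, 14, 45)
    case = [(end - timedelta(minutes=m), temp)
            for m, temp in [(90, 36.1), (60, 35.9), (40, 35.6), (20, 35.4), (10, 35.3)]]
    print(end_of_case_status(case, end))   # 'fail': max within 30 min is 35.4 C
```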
Pagel, Christina; Utley, Martin; Crowe, Sonya; Witter, Thomas; Anderson, David; Samson, Ray; McLean, Andrew; Banks, Victoria; Tsang, Victor; Brown, Katherine
2013-01-01
Objective: To implement routine in-house monitoring of risk-adjusted 30-day mortality following paediatric cardiac surgery. Design: Collaborative monitoring software development and implementation in three specialist centres. Patients and methods: Analyses incorporated 2 years of data routinely audited by the National Institute of Cardiac Outcomes Research (NICOR). Exclusion criteria were patients over 16 or undergoing non-cardiac or catheter-only procedures. We applied the partial risk adjustment in surgery (PRAiS) risk model for death within 30 days following surgery and generated variable life-adjusted display (VLAD) charts for each centre. These were shared with each clinical team and feedback was sought. Results: Participating centres were Great Ormond Street Hospital, Evelina Children's Hospital and The Royal Hospital for Sick Children in Glasgow. Data captured all procedures performed between 1 January 2010 and 31 December 2011. This incorporated 2490 30-day episodes of care, 66 of which were associated with a death within 30 days. The VLAD charts generated for each centre displayed trends in outcomes benchmarked to recent national outcomes. All centres ended the 2-year period within four deaths from what would be expected. The VLAD charts were shared in multidisciplinary meetings and clinical teams reported that they were a useful addition to existing quality assurance initiatives. Each centre is continuing to use the prototype software to monitor their in-house surgical outcomes. Conclusions: Timely and routine monitoring of risk-adjusted mortality following paediatric cardiac surgery is feasible. Close liaison with hospital data managers as well as clinicians was crucial to the success of the project. PMID:23564473
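For readers unfamiliar with VLAD charts, the Python sketch below shows the underlying bookkeeping: each episode adds its model-predicted risk of 30-day death and subtracts 1 if a death occurred, so the running sum tracks cumulative expected-minus-observed deaths. The simulated risks and outcomes are illustrative and are not NICOR data or PRAiS output.

```python
# Sketch of a variable life-adjusted display (VLAD): for each 30-day episode,
# add the model-predicted risk of death and subtract 1 if a death occurred, so
# the running total shows cumulative lives saved (positive) or excess deaths
# (negative) relative to expectation. The data below are made up for illustration.
import numpy as np

def vlad(predicted_risk, died):
    """Return the cumulative expected-minus-observed deaths series."""
    predicted_risk = np.asarray(predicted_risk, dtype=float)
    died = np.asarray(died, dtype=float)
    return np.cumsum(predicted_risk - died)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    risk = rng.uniform(0.005, 0.10, size=200)   # per-episode predicted risk
    outcome = rng.binomial(1, risk)             # simulated 30-day deaths
    chart = vlad(risk, outcome)
    print(f"final VLAD value after {len(chart)} episodes: {chart[-1]:+.2f}")
```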
A New mHealth Communication Framework for Use in Wearable WBANs and Mobile Technologies
Hamida, Sana Tmar-Ben; Hamida, Elyes Ben; Ahmed, Beena
2015-01-01
Driven by the development of biomedical sensors and the availability of high mobile bandwidth, mobile health (mHealth) systems are now offering a wider range of new services. This revolution makes the idea of in-home health monitoring practical and provides the opportunity for assessment in “real-world” environments producing more ecologically valid data. In the field of insomnia diagnosis, for example, it is now possible to offer patients wearable sleep monitoring systems which can be used in the comfort of their homes over long periods of time. The recorded data collected from body sensors can be sent to a remote clinical back-end system for analysis and assessment. Most of the research on sleep reported in the literature mainly looks into how to automate the analysis of the sleep data and does not address the problem of the efficient encoding and secure transmissions of the collected health data. This article reviews the key enabling communication technologies and research challenges for the design of efficient mHealth systems. An end-to-end mHealth system architecture enabling the remote assessment and monitoring of patient's sleep disorders is then proposed and described as a case study. Finally, various mHealth data serialization formats and machine-to-machine (M2M) communication protocols are evaluated and compared under realistic operating conditions. PMID:25654718
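As a small example of the serialization trade-offs evaluated in the paper, the Python sketch below encodes one hypothetical wearable-sensor sample as compact JSON and as CBOR (via the third-party cbor2 package) and compares the resulting payload sizes; the record layout is an assumption, not the framework's actual message schema.

```python
# Compare the encoded size of one wearable-sensor sample in a text format
# (JSON) versus a compact binary format (CBOR). The record layout is
# hypothetical; CBOR encoding uses the third-party `cbor2` package
# (pip install cbor2).
import json
import cbor2

sample = {
    "ts": 1420070400,           # epoch seconds
    "hr": 62,                   # heart rate, bpm
    "spo2": 97,                 # oxygen saturation, %
    "acc": [0.01, -0.02, 0.98]  # body acceleration, g
}

json_bytes = json.dumps(sample, separators=(",", ":")).encode("utf-8")
cbor_bytes = cbor2.dumps(sample)

print(f"JSON: {len(json_bytes)} bytes, CBOR: {len(cbor_bytes)} bytes")
print(f"CBOR/JSON size ratio: {len(cbor_bytes) / len(json_bytes):.2f}")
```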
Joch, Michael; Hegele, Mathias; Maurer, Heiko; Müller, Hermann; Maurer, Lisa Katharina
2017-07-01
The error-related negativity (Ne/ERN) is an event-related potential in the electroencephalogram (EEG) that correlates with error processing. Its appearance before terminal external error information suggests that the Ne/ERN is indicative of predictive processes in the evaluation of errors. The aim of the present study was to examine the Ne/ERN in a complex motor task and, in particular, to rule out explanations of the Ne/ERN other than error prediction processes. To this end, we focused on the dependency of the Ne/ERN on visual monitoring of the action outcome after movement termination but before result feedback (action effect monitoring). Participants performed a semi-virtual throwing task, using a manipulandum to throw a virtual ball displayed on a computer screen at a target object. Visual feedback of the ball flying toward the target was masked to prevent action effect monitoring. Participants received static feedback about the action outcome (850 ms) after each trial. We found a significant negative deflection in the average EEG curves of the error trials peaking at ~250 ms after ball release, i.e., before error feedback. Furthermore, this Ne/ERN signal did not depend on visual ball-flight monitoring after release. We conclude that the Ne/ERN has the potential to indicate error prediction in motor tasks and that it exists even in the absence of action effect monitoring. NEW & NOTEWORTHY In this study, we separate different possible contributors to an electroencephalogram (EEG) error correlate (Ne/ERN) in a throwing task. We tested the influence of action effect monitoring on the Ne/ERN amplitude in the EEG. We used a task that allows us to restrict movement correction and action effect monitoring and to control the onset of result feedback. We ascribe the Ne/ERN to predictive error processing in which a conscious feeling of failure is not a prerequisite. Copyright © 2017 the American Physiological Society.
Wearable dry sensors with bluetooth connection for use in remote patient monitoring systems.
Gargiulo, Gaetano; Bifulco, Paolo; Cesarelli, Mario; Jin, Craig; McEwan, Alistair; van Schaik, Andre
2010-01-01
Cost reduction has become the primary theme of healthcare reforms globally. More providers are moving towards remote patient monitoring, which reduces the length of hospital stays, frees up physicians and nurses for acute cases, and helps providers tackle staff shortages. Physiological sensors are commonly used in many medical specialties; electrocardiogram (ECG) electrodes for monitoring heart signals and electroencephalogram (EEG) electrodes for sensing the electrical activity of the brain are the most well-known applications. Consequently, there is a substantial unmet need for physiological sensors that can be simply and easily applied by the patient or primary carer, are comfortable to wear, can accurately sense parameters over long periods of time, and can be connected to data recording systems using Bluetooth technology. We have developed a small, battery-powered, user-customizable portable monitor. This prototype is capable of recording triaxial body acceleration and skin temperature, and it has up to four analog biosignal front ends. Moreover, it is capable of continuous wireless transmission to any Bluetooth device, including a PDA or a cellular phone. The bio front end can use long-lasting dry electrodes or novel textile electrodes that can be embedded in clothes. The device can be powered by a standard mobile-phone Ni-MH 3.6 V battery and sustains more than seven days of continuous operation when the Bluetooth sniff mode is used to reduce TX power. In this paper, we present some of the evaluation experiments of our wearable personal monitor device, with a focus on ECG applications.
A prototype scintillating fibre beam profile monitor for Ion Therapy beams
NASA Astrophysics Data System (ADS)
Leverington, B. D.; Dziewiecki, M.; Renner, L.; Runze, R.
2018-05-01
A prototype plastic scintillating-fibre beam profile monitor was tested at the Heidelberg Ion Therapy Centre/Heidelberg Ionenstrahl-Therapiezentrum (HIT) in 2016 to determine its beam-property reconstruction performance and the feasibility of developing an expanded system. At HIT, protons and helium, carbon, and oxygen ions are available for therapy and experiments, and the beam can be scanned in two dimensions using fast deflection magnets. A tracking system monitors the beam position and adjusts the scanning-magnet currents online. A new detector system with finer granularity, without the drift-time delay of the current MWPC system, and with a similar amount of material along the beamline would prove valuable in patient treatment. The sensitive components of the tested prototype are double-clad Kuraray SCSF-78MJ scintillating fibres with a diameter of 0.250 mm, wound into a thin multi-layer ribbon. The scintillation light is detected at the end of the ribbon with Hamamatsu S11865-64 photodiode arrays with a pitch of 0.8 mm. Commercial or readily available readout electronics were used to evaluate the feasibility of the system. The results presented in this paper include the linearity with respect to beam intensity, the RMS of the beam intensity as measured by the two planes, the RMS of the mean position, and the RMS of the measured beam width. The signal-to-noise ratio of the current system is also measured as an indicator of potential performance. Additionally, the non-linear light yield of the scintillating fibres, as measured by the photodiode arrays, is compared to two models that describe the light yield as a function of the ion stopping power and Lorentz β.
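An illustrative sketch of the basic beam-property reconstruction named above: the beam centroid and RMS width from a single fibre plane, computed as intensity-weighted moments over the fibre positions. The channel count, pitch value, and test profile are assumptions, not HIT data.

```python
import numpy as np

def profile_moments(counts, pitch_mm=0.25):
    """counts: per-fibre signal after pedestal subtraction; returns (centroid, RMS width) in mm."""
    x = np.arange(counts.size) * pitch_mm                    # fibre centre positions
    w = np.clip(counts, 0.0, None).astype(float)             # suppress negative fluctuations
    mean = np.average(x, weights=w)                          # beam centroid
    rms = np.sqrt(np.average((x - mean) ** 2, weights=w))    # beam width (RMS)
    return mean, rms

# Gaussian-like test profile centred on channel 40 with a sigma of 8 channels
ch = np.arange(80)
counts = 1000.0 * np.exp(-0.5 * ((ch - 40.0) / 8.0) ** 2)
print(profile_moments(counts))   # centroid ~10 mm, width ~2 mm for a 0.25 mm pitch
```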
The monitoring and data quality assessment of the ATLAS liquid argon calorimeter
NASA Astrophysics Data System (ADS)
Simard, Olivier; ATLAS Liquid Argon Calorimeter Group
2015-02-01
The ATLAS experiment is designed to study the proton-proton (pp) collisions produced at the Large Hadron Collider (LHC) at CERN. Liquid argon (LAr) sampling calorimeters are used for all electromagnetic calorimetry in the pseudorapidity region |η| < 3.2, as well as for hadronic calorimetry in the range 1.5 < |η| < 4.9. The electromagnetic calorimeters use lead as the passive material and are characterized by an accordion geometry that allows a fast and uniform response without azimuthal gaps. Copper and tungsten were chosen as the passive material for the hadronic calorimetry; while a classic parallel-plate geometry was adopted at large polar angles, an innovative design based on cylindrical electrodes with thin liquid argon gaps is employed at low angles, where the particle flux is higher. All detectors are housed in three cryostats maintained at about 88.5 K. The 182,468 cells are read out via front-end boards housed in on-detector crates that also contain monitoring, calibration, trigger, and timing boards. In the first three years of LHC operation, approximately 27 fb-1 of pp collision data were collected at centre-of-mass energies of 7-8 TeV. Throughout this period, the calorimeter consistently operated with performance very close to specifications and with high data-taking efficiency. This is in large part due to a sophisticated data-monitoring procedure designed to quickly identify issues that would degrade detector performance and to ensure that only the best-quality data are used for physics analysis. After a description of the detector design, main characteristics, and operating principles, this paper details the data quality assessment procedures developed during the 2011 and 2012 LHC data-taking periods, when more than 98% of the luminosity recorded by ATLAS had high-quality LAr calorimeter data suitable for physics analysis.
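A generic data-quality sketch, not the ATLAS LAr procedure: flag channels whose noise (pedestal RMS) is an outlier with respect to the ensemble, so the affected data can be reviewed before being used for physics. The robust-sigma cut and the noise values are assumptions for illustration.

```python
import numpy as np

def flag_noisy_channels(pedestal_rms, n_sigma=8.0):
    """pedestal_rms: per-channel noise estimates; returns indices of outlier channels."""
    median = np.median(pedestal_rms)
    spread = 1.4826 * np.median(np.abs(pedestal_rms - median))  # robust sigma via the MAD
    return np.flatnonzero(pedestal_rms > median + n_sigma * spread)

rng = np.random.default_rng(1)
rms = rng.normal(10.0, 0.5, 182468)     # nominal channels (arbitrary units)
rms[[123, 4567]] += 20.0                # two artificially noisy channels
print(flag_noisy_channels(rms))         # -> [123, 4567]
```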
Physiological monitoring of team and task stressors
NASA Astrophysics Data System (ADS)
Orasanu, Judith; Tada, Yuri; Kraft, Norbert; Fischer, Ute
2005-05-01
Sending astronauts into space, especially on long-duration missions (e.g., three-year missions to Mars), entails enormous risk. Threats include the physical dangers of radiation, bone loss, and other consequences of weightlessness, as well as those arising from interpersonal problems associated with extended life in a high-risk, isolated, and confined environment. Before undertaking long-duration missions, NASA seeks to develop technologies to monitor indicators of potentially debilitating stress at both the individual and team level, so that countermeasures can be introduced to prevent further deterioration. Doing so requires a better understanding of indicators of team health and performance. To that end, a study of team problem solving in a simulation environment was undertaken to explore the effects of team and task stress. Groups of four males (25-45 yr) engaged in six dynamic computer-based Antarctic search-and-rescue missions over four days. Both task and team stressors were manipulated. Physiological responses (ECG, respiration rate and amplitude, SCL, EMG, and PPG), communications (voice and e-mail), individual personality measures, and subjective ratings of team dynamics were collected and related to task performance. Initial analyses found that physiological measures can be used to identify transient stress, predict performance, and reflect subjective workload. Muscle tension and respiration were the most robust predictors. Not only the level of arousal but also its variability during task engagement is important to consider; in general, less variability was associated with higher levels of performance. Individuals scoring high on specific personality characteristics responded differently to task stress.
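An illustrative sketch of the "variability" idea described above, assuming synthetic data: summarize an arousal signal (e.g., skin conductance level) by its within-task variability and relate that to task performance. The sampling rate, window length, group count, and simulated relationship are all assumptions.

```python
import numpy as np

def arousal_variability(signal, fs=32.0, window_s=60.0):
    """Mean of the standard deviation computed over non-overlapping windows."""
    n = int(fs * window_s)
    trimmed = signal[: (signal.size // n) * n].reshape(-1, n)
    return trimmed.std(axis=1).mean()

rng = np.random.default_rng(2)
performance = rng.uniform(0.4, 0.9, 24)                  # hypothetical scores for 24 groups
signals = [rng.normal(0.0, 1.0 - 0.8 * p, int(32 * 600)) # 10 min of synthetic arousal data;
           for p in performance]                          # steadier signal for better groups
variability = [arousal_variability(s) for s in signals]
r = np.corrcoef(variability, performance)[0, 1]
print(f"r = {r:.2f}")   # strongly negative: less variability with higher performance
```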
Prefrontal Markers and Cognitive Performance Are Dissociated during Progressive Dopamine Lesion
Wilson, Charles R. E.; Vezoli, Julien; Faraut, Maïlys C. M.; Leviel, Vincent; Knoblauch, Kenneth; Procyk, Emmanuel
2016-01-01
Dopamine is thought to directly influence the neurophysiological mechanisms of both performance monitoring and cognitive control—two processes that are critically linked in the production of adapted behaviour. Changing dopamine levels are also thought to induce cognitive changes in several neurological and psychiatric conditions. But the working model of this system as a whole remains untested. Specifically, although many researchers assume that changing dopamine levels modify neurophysiological mechanisms and their markers in frontal cortex, and that this in turn leads to cognitive changes, this causal chain needs to be verified. Using longitudinal recordings of frontal neurophysiological markers over many months during progressive dopaminergic lesion in non-human primates, we provide data that fail to support a simple interaction between dopamine, frontal function, and cognition. Feedback potentials, which are performance-monitoring signals sometimes thought to drive successful control, ceased to differentiate feedback valence at the end of the lesion, just before the clinical motor threshold was reached. In contrast, cognitive control performance and beta oscillatory markers of cognitive control were unimpaired by the lesion. The differing dynamics of these measures throughout a dopamine lesion suggest that they are not all driven by dopamine in the same way. These dynamics also demonstrate that a complex, non-linear set of mechanisms is engaged in the brain in response to a progressive dopamine lesion. These results question the direct causal chain from dopamine to frontal physiology and on to cognition. They imply that biomarkers of cognitive functions are not directly predictive of dopamine loss. PMID:27824858
Suominen, Pertti K; Stayer, Stephen; Wang, Wei; Chang, Anthony C
2007-01-01
We evaluated the accuracy of end-tidal carbon dioxide tension (PETCO2) monitoring and measured the effect of temperature correction of blood gas values in children after cardiac surgery. Data from 49 consecutive mechanically ventilated children in the cardiac intensive care unit after cardiac surgery were prospectively collected; one patient was excluded from the study. Four arterial-to-end-tidal CO2 pairs were obtained from each patient. Both the arterial carbon dioxide tension (PaCO2) values determined at a temperature of 37 °C and the values corrected to body temperature (PatcCO2) were compared with the PETCO2 values. After surgical correction, 28 patients had a biventricular, acyanotic lesion (mean age 2.7 ± 4.8 yr) and 20 patients had a cyanotic lesion (mean age 1.0 ± 1.7 yr). Body temperature ranged from 35.2 °C to 38.9 °C. The Pa-PETCO2 discrepancy was affected both by the type of cardiac lesion and by the temperature correction of the PaCO2 values. The correlation slopes of the Pa-PETCO2 and Patc-PETCO2 discrepancies were significantly different (p = 0.040) when body temperature was higher or lower than 37 °C. In children after cardiac surgery, end-tidal CO2 monitoring provided a clinically acceptable estimate of the arterial CO2 value, which remained stable in repeated measurements. End-tidal CO2 monitoring more accurately reflects temperature-corrected blood gas values.
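A sketch of the comparison described above, using a commonly cited temperature correction for PCO2 (a factor of 10**(0.019*(T-37)), roughly +4.4% per °C); the exact coefficient is an assumption for illustration and is not taken from the paper, and the patient values below are hypothetical.

```python
def temp_corrected_paco2(paco2_37, body_temp_c):
    """Correct a PaCO2 measured at 37 °C to the patient's body temperature (mmHg in, mmHg out)."""
    return paco2_37 * 10 ** (0.019 * (body_temp_c - 37.0))

def gradients(paco2_37, petco2, body_temp_c):
    """Return the (Pa-PETCO2, Patc-PETCO2) discrepancies in the same units as the inputs."""
    patco2 = temp_corrected_paco2(paco2_37, body_temp_c)
    return paco2_37 - petco2, patco2 - petco2

# Hypothetical febrile patient: PaCO2 40 mmHg at 37 °C, PETCO2 36 mmHg, body temperature 38.5 °C
print(gradients(40.0, 36.0, 38.5))   # temperature correction widens the gradient in fever
```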