Sample records for thresholds developing monitoring

  1. Utilizing Objective Drought Thresholds to Improve Drought Monitoring with the SPI

    NASA Astrophysics Data System (ADS)

    Leasor, Z. T.; Quiring, S. M.

    2017-12-01

    Drought is a prominent climatic hazard in the south-central United States. Droughts are frequently monitored using the severity categories determined by the U.S. Drought Monitor (USDM). This study uses the Standardized Precipitation Index (SPI) to conduct a drought frequency analysis across Texas, Oklahoma, and Kansas using PRISM precipitation data from 1900-2015. The SPI is shown to be spatiotemporally variant across the south-central United States. In particular, utilizing the default USDM severity thresholds may underestimate drought severity in arid regions. Objective drought thresholds were implemented by fitting a CDF to each location's SPI distribution. This approach results in a more homogeneous distribution of drought frequencies across each severity category. Results also indicate that it may be beneficial to develop objective drought thresholds for each season and SPI timescale. This research serves as a proof-of-concept and demonstrates how drought thresholds should be objectively developed so that they are appropriate for each climatic region.
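The percentile-matching idea described in this abstract (placing severity cutoffs at fixed frequencies of each location's own SPI distribution) might be sketched as follows. This is an illustrative reconstruction, not the paper's code; the D1–D4 labels and the target frequencies are assumptions chosen for the example.

```python
import numpy as np

def objective_thresholds(spi_values, target_freqs_pct=(20, 10, 5, 2)):
    """Derive site-specific SPI cutoffs from the empirical distribution so
    that each drought-severity class (labeled D1..D4, USDM-style) occurs
    with the same target frequency at every location."""
    return {f"D{i + 1}": float(np.percentile(spi_values, p))
            for i, p in enumerate(target_freqs_pct)}

# Synthetic example: a site whose SPI distribution is shifted dry, so fixed
# default cutoffs (-1.0, -1.5, ...) would misstate category frequencies there.
rng = np.random.default_rng(0)
spi = rng.normal(loc=-0.3, scale=1.0, size=10_000)
thresholds = objective_thresholds(spi)
print(thresholds)
```

Because the cutoffs come from each site's empirical CDF, every severity class is crossed with the same frequency regardless of local climate, which is the homogenization effect the abstract reports.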

  2. Continuous Seismic Threshold Monitoring

    DTIC Science & Technology

    1992-05-31

    Continuous threshold monitoring is a technique for using a seismic network to monitor a geographical area continuously in time. The method provides...area. Two approaches are presented. Site-specific monitoring: By focusing a seismic network on a specific target site, continuous threshold monitoring...recorded events at the site. We define the threshold trace for the network as the continuous time trace of computed upper magnitude limits of seismic

  3. Insights from internet-based remote intrathoracic impedance monitoring as part of a heart failure disease management program.

    PubMed

    Mullens, Wilfried; Oliveira, Leonardo P J; Verga, Tanya; Wilkoff, Bruce L; Tang, Wai Hong Wilson

    2010-01-01

    Changes in intrathoracic impedance (Z) leading to crossing of a derived fluid index (FI) threshold have been associated with heart failure (HF) hospitalization. The authors developed a remote monitoring program as part of HF disease management and prospectively examined the feasibility and resource utilization of monitoring individuals with an implanted device capable of measuring Z. An HF nurse analyzed all transmitted data daily, as they were routinely uploaded as part of quarterly remote device monitoring, and called the patient if the FI crossed the threshold (arbitrarily defined at 60 Ω) to identify clinically relevant events (CREs) that occurred during this period (eg, worsening dyspnea or increase in edema or weight). A total of 400 uploads were completed during the 4-month study period. During this period, 34 patients (18%) had an FI threshold crossing, averaging 0.52 FI threshold crossings per patient-year. Thirty-two of the 34 patients with an FI threshold crossing who were contacted by telephone (94%) had evidence of CREs during this period. However, only 6 (18%) had HF hospitalizations, 19 (56%) had reported changes in HF therapy, and 13 (38%) reported drug and/or dietary plan nonadherence. The average data analysis time required was 30 min daily when focusing on those with FI threshold crossing, averaging 8 uploads for review per working day and 5 telephone follow-ups per week. Our pilot observations suggested that Internet-based remote monitoring of Z trends from existing device interrogation uploads is feasible as part of a daily routine of HF disease management. 2010 Wiley Periodicals, Inc.

  4. Development of water quality thresholds during dredging for the protection of benthic primary producer habitats.

    PubMed

    Sofonia, Jeremy J; Unsworth, Richard K F

    2010-01-01

    Given the potential for adverse effects of ocean dredging on marine organisms, particularly benthic primary producer communities, the management and monitoring of those activities which cause elevated turbidity and sediment loading is critical. In practice, however, this has proven challenging, as the development of water quality threshold values, upon which management responses are based, is subject to a large number of physical and biological parameters that are spatially and temporally specific. As a consequence, monitoring programs to date have taken a wide range of different approaches, most focusing on measures of turbidity reported as nephelometric turbidity units (NTU). This paper presents a potential approach to the determination of water quality thresholds which utilises data gathered through the long-term deployment of in situ water instruments, but suggests a focus on photosynthetically active radiation (PAR) rather than NTU, as it is more biologically relevant and inclusive of other site conditions. A simple mathematical approach to data interpretation is also presented which facilitates threshold value development not as individual concentration values over specific intervals, but as an equation which may be utilised in numerical modelling.

  5. Ecological thresholds as a basis for defining management triggers for National Park Service vital signs: case studies for dryland ecosystems

    USGS Publications Warehouse

    Bowker, Matthew A.; Miller, Mark E.; Belote, R. Travis; Garman, Steven L.

    2013-01-01

    Threshold concepts are used in research and management of ecological systems to describe and interpret abrupt and persistent reorganization of ecosystem properties (Walker and Meyers, 2004; Groffman and others, 2006). Abrupt change, referred to as a threshold crossing, and the progression of reorganization can be triggered by one or more interactive disturbances such as land-use activities and climatic events (Paine and others, 1998). Threshold crossings occur when feedback mechanisms that typically absorb forces of change are replaced with those that promote development of alternative equilibria or states (Suding and others, 2004; Walker and Meyers, 2004; Briske and others, 2008). The alternative states that emerge from a threshold crossing vary and often exhibit reduced ecological integrity and value in terms of management goals relative to the original or reference system. Alternative stable states with some limited residual properties of the original system may develop along the progression after a crossing; an eventual outcome may be the complete loss of pre-threshold properties of the original ecosystem. Reverting to the more desirable reference state through ecological restoration becomes increasingly difficult and expensive along the progression gradient and may eventually become impossible. Ecological threshold concepts have been applied as a heuristic framework and to aid in the management of rangeland (Bestelmeyer, 2006; Briske and others, 2006, 2008), aquatic (Scheffer and others, 1993; Rapport and Whitford, 1999), riparian (Stringham and others, 2001; Scott and others, 2005), and forested ecosystems (Allen and others, 2002; Digiovinazzo and others, 2010). These concepts are also topical in ecological restoration (Hobbs and Norton, 1996; Whisenant, 1999; Suding and others, 2004; King and Hobbs, 2006) and ecosystem sustainability (Herrick, 2000; Chapin and others, 1996; Davenport and others, 1998).
Achieving conservation management goals requires the protection of resources within the range of desired conditions (Cook and others, 2010). The goal of conservation management for natural resources in the U.S. National Park System is to maintain native species and habitat unimpaired for the enjoyment of future generations. Achieving this goal requires, in part, early detection of system change and timely implementation of remediation. The recent National Park Service Inventory and Monitoring program (NPS I&M) was established to provide early warning of declining ecosystem conditions relative to a desired native or reference system (Fancy and others, 2009). To be an effective tool for resource protection, monitoring must be designed to alert managers of impending thresholds so that preventive actions can be taken. This requires an understanding of the ecosystem attributes and processes associated with threshold-type behavior; how these attributes and processes become degraded; and how risks of degradation vary among ecosystems and in relation to environmental factors such as soil properties, climatic conditions, and exposure to stressors. In general, the utility of the threshold concept for long-term monitoring depends on the ability of scientists and managers to detect, predict, and prevent the occurrence of threshold crossings associated with persistent, undesirable shifts among ecosystem states (Briske and others, 2006). Because of the scientific challenges associated with understanding these factors, the application of threshold concepts to monitoring designs has been very limited to date (Groffman and others, 2006). As a case in point, the monitoring efforts across the 32 NPS I&M networks were largely designed with the knowledge that they would not be used to their full potential until the development of a systematic method for understanding threshold dynamics and methods for estimating key attributes of threshold crossings. 
This report describes and demonstrates a generalized approach that we implemented to formalize the understanding and estimation of threshold dynamics for terrestrial dryland ecosystems in national parks of the Colorado Plateau. We provide a structured approach to identify and describe degradation processes associated with threshold behavior and to estimate indicator levels that characterize the point at which a threshold crossing has occurred or is imminent (tipping points) or points where investigative or preventive management action should be triggered (assessment points). We illustrate this method for several case studies in national parks included in the Northern and Southern Colorado Plateau NPS I&M networks, where historical livestock grazing, climatic change, and invasive species are key agents of change. The approaches developed in these case studies are intended to enhance the design, effectiveness, and management-relevance of monitoring efforts in support of conservation management in dryland systems. They specifically enhance National Park Service (NPS) capacity for protecting park resources on the Colorado Plateau but have applicability to monitoring and conservation management of dryland ecosystems worldwide.

  6. First Results of a Detection Sensor for the Monitoring of Laying Hens Reared in a Commercial Organic Egg Production Farm Based on the Use of Infrared Technology.

    PubMed

    Zaninelli, Mauro; Redaelli, Veronica; Tirloni, Erica; Bernardi, Cristian; Dell'Orto, Vittorio; Savoini, Giovanni

    2016-10-21

    The development of a monitoring system to identify the presence of laying hens in a closed room of a free-range commercial organic egg production farm was the aim of this study. This monitoring system was based on infrared (IR) technology and had, as its final target, a possible reduction of atmospheric ammonia levels and bacterial load. Tests were carried out for three weeks and involved 7 ISA (Institut de Sélection Animale) brown laying hens. The first 5 days were used to set up the detection sensor, while the other 15 days were used to evaluate the accuracy of the resulting monitoring system in terms of sensitivity and specificity. The setup procedure included the evaluation of different color background (CB) thresholds used to discriminate the information content of the thermographic images. At the end of this procedure, a CB threshold equal to an increase of 3 °C from the floor temperature was chosen, and a cutoff level of 196 colored pixels was identified as the threshold for classifying a positive case. The results of field tests showed that the developed monitoring system achieved high detection accuracy (sensitivity = 97.9% and specificity = 94.9%), and IR technology proved to be a possible solution for the development of the detection sensor needed to achieve the aim of this study.
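The classification rule described above, counting pixels more than 3 °C warmer than the floor and declaring a positive case above 196 such pixels, can be sketched as below. The 3 °C offset and 196-pixel cutoff are from the abstract; the array shapes and temperatures are illustrative.

```python
import numpy as np

def hen_present(thermal_frame_c, floor_temp_c, cb_offset_c=3.0, pixel_cutoff=196):
    """Classify a thermographic frame as a positive case (hen present) when
    the count of pixels warmer than floor temperature + CB offset exceeds
    the cutoff level."""
    colored = int(np.count_nonzero(thermal_frame_c > floor_temp_c + cb_offset_c))
    return colored > pixel_cutoff

# Synthetic frame: 20 °C floor with a 300-pixel warm region at 28 °C
frame = np.full((120, 160), 20.0)
frame[:15, :20] = 28.0
print(hen_present(frame, floor_temp_c=20.0))
```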

  8. Rotor Smoothing and Vibration Monitoring Results for the US Army VMEP

    DTIC Science & Technology

    2009-06-01

    individual component CI detection thresholds, and development of models for diagnostics, prognostics, and anomaly detection. Figure 16 VMEP Server...and prognostics are of current interest. Development of those systems requires large amounts of data (collection, monitoring, manipulation) to capture...development of automated systems and for continuous updating of algorithms to improve detection, classification, and prognostic performance. A test

  9. Earth resources data acquisition sensor study

    NASA Technical Reports Server (NTRS)

    Grohse, E. W.

    1975-01-01

    The minimum data collection and data processing requirements are investigated for the development of water monitoring systems, which disregard redundant and irrelevant data and process only those data predictive of the onset of significant pollution events. Two approaches are immediately suggested: (1) adaptation of a presently available ambient air monitoring system developed by TVA, and (2) consideration of an air, water, and radiological monitoring system developed by the Georgia Tech Experiment Station. In order to apply monitoring systems, threshold values and maximum allowable rates of change of critical parameters such as dissolved oxygen and temperature are required.

  10. Integrating real-time subsurface hydrologic monitoring with empirical rainfall thresholds to improve landslide early warning

    USGS Publications Warehouse

    Mirus, Benjamin B.; Becker, Rachel E.; Baum, Rex L.; Smith, Joel B.

    2018-01-01

    Early warning for rainfall-induced shallow landsliding can help reduce fatalities and economic losses. Although these commonly occurring landslides are typically triggered by subsurface hydrological processes, most early warning criteria rely exclusively on empirical rainfall thresholds and other indirect proxies for subsurface wetness. We explore the utility of explicitly accounting for antecedent wetness by integrating real-time subsurface hydrologic measurements into landslide early warning criteria. Our efforts build on previous progress with rainfall thresholds, monitoring, and numerical modeling along the landslide-prone railway corridor between Everett and Seattle, Washington, USA. We propose a modification to previously established recent versus antecedent (RA) cumulative rainfall thresholds by replacing the antecedent 15-day rainfall component with the average saturation observed over the same timeframe. We calculate this antecedent saturation with real-time telemetered measurements from five volumetric water content probes installed in the shallow subsurface within a steep vegetated hillslope. Our hybrid rainfall versus saturation (RS) threshold still relies on the same recent 3-day rainfall component as the existing RA thresholds, to facilitate ready integration with quantitative precipitation forecasts. During the 2015–2017 monitoring period, this RS hybrid approach yielded more true positives and fewer false positives and false negatives than the previous RA rainfall-only thresholds. We also demonstrate that alternative hybrid threshold formats could be even more accurate, which suggests that further development and testing during future landslide seasons are needed. The positive results confirm that accounting for antecedent wetness conditions with direct subsurface hydrologic measurements can improve thresholds for alert systems and early warning of rainfall-induced shallow landsliding.
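One way to read the hybrid RS criterion is as a line in recent-rainfall versus antecedent-saturation space, by analogy with the RA cumulative-rainfall thresholds it modifies. The sketch below is schematic: the coefficients are placeholders, not the published threshold values.

```python
def rs_alert(rain_3day_mm, sat_15day_avg, a=90.0, b=-100.0):
    """Hybrid recent-rainfall vs antecedent-saturation (RS) criterion:
    alert when the recent 3-day rainfall total exceeds a threshold line
    that decreases as the 15-day average saturation (0-1) rises.
    Coefficients a and b are illustrative placeholders."""
    threshold_mm = a + b * sat_15day_avg
    return rain_3day_mm > threshold_mm

# Wetter antecedent conditions lower the rainfall needed to trigger an alert
print(rs_alert(rain_3day_mm=60.0, sat_15day_avg=0.5))  # threshold drops to 40 mm
print(rs_alert(rain_3day_mm=60.0, sat_15day_avg=0.2))  # threshold stays at 70 mm
```

Keeping the recent 3-day rainfall term on one axis is what lets the criterion be evaluated against quantitative precipitation forecasts, as the abstract notes.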

  11. Monitoring Start of Season in Alaska

    NASA Astrophysics Data System (ADS)

    Robin, J.; Dubayah, R.; Sparrow, E.; Levine, E.

    2006-12-01

    In biomes that have distinct winter seasons, start of spring phenological events, specifically timing of budburst and green-up of leaves, coincides with transpiration. Seasons leave annual signatures that reflect the dynamic nature of the hydrologic cycle and link the different spheres of the Earth system. This paper evaluates whether continuity between AVHRR and MODIS normalized difference vegetation index (NDVI) is achievable for monitoring land surface phenology, specifically start of season (SOS), in Alaska. Additionally, two thresholds, one based on NDVI and the other on accumulated growing degree-days (GDD), are compared to determine which most accurately predicts SOS for Fairbanks. Ratio of maximum greenness at SOS was computed from biweekly AVHRR and MODIS composites for 2001 through 2004 for Anchorage and Fairbanks regions. SOS dates were determined from annual green-up observations made by GLOBE students. Results showed that different processing as well as spectral characteristics of each sensor restrict continuity between the two datasets. MODIS values were consistently higher and had less inter-annual variability during the height of the growing season than corresponding AVHRR values. Furthermore, a threshold of 131-175 accumulated GDD was a better predictor of SOS for Fairbanks than a NDVI threshold applied to AVHRR and MODIS datasets. The NDVI threshold was developed from biweekly AVHRR composites from 1982 through 2004 and corresponding annual green-up observations at University of Alaska-Fairbanks (UAF). The GDD threshold was developed from 20+ years of historic daily mean air temperature data and the same green-up observations. SOS dates computed with the GDD threshold most closely resembled actual green-up dates observed by GLOBE students and UAF researchers. Overall, biweekly composites and effects of clouds, snow, and conifers limit the ability of NDVI to monitor phenological changes in Alaska.
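The accumulated growing degree-day rule for start of season can be sketched as below. The 131 GDD value is the low end of the range reported above; the 0 °C base temperature and the toy warming curve are assumptions for the example.

```python
def start_of_season(daily_mean_temps_c, gdd_threshold=131.0, base_temp_c=0.0):
    """Return the index of the first day on which accumulated growing
    degree-days (daily mean temperature above a base temperature) reach
    the threshold, or None if it is never reached."""
    gdd = 0.0
    for day, t in enumerate(daily_mean_temps_c):
        gdd += max(t - base_temp_c, 0.0)
        if gdd >= gdd_threshold:
            return day
    return None

# Toy spring warming curve: daily means ramping from -5 C upward
temps = [-5 + 0.33 * d for d in range(120)]
print(start_of_season(temps))  # day index at which the GDD threshold is first reached
```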

  12. A novel threshold criterion in transcranial motor evoked potentials during surgery for gliomas close to the motor pathway.

    PubMed

    Abboud, Tammam; Schaper, Miriam; Dührsen, Lasse; Schwarz, Cindy; Schmidt, Nils Ole; Westphal, Manfred; Martens, Tobias

    2016-10-01

    OBJECTIVE Warning criteria for monitoring of motor evoked potentials (MEP) after direct cortical stimulation during surgery for supratentorial tumors have been well described. However, little is known about the value of MEP after transcranial electrical stimulation (TES) in predicting postoperative motor deficit when monitoring threshold level. The authors aimed to evaluate the feasibility and value of this method in glioma surgery by using a new approach for interpreting changes in threshold level involving contra- and ipsilateral MEP. METHODS Between November 2013 and December 2014, 93 patients underwent TES-MEP monitoring during resection of gliomas located close to central motor pathways but not involving the primary motor cortex. The MEP were elicited by transcranial repetitive anodal train stimulation. Bilateral MEP were continuously evaluated to assess percentage increase of threshold level (minimum voltage needed to evoke a stable motor response from each of the muscles being monitored) from the baseline set before dural opening. An increase in threshold level on the contralateral side (facial, arm, or leg muscles contralateral to the affected hemisphere) of more than 20% beyond the percentage increase on the ipsilateral side (facial, arm, or leg muscles ipsilateral to the affected hemisphere) was considered a significant alteration. Recorded alterations were subsequently correlated with postoperative neurological deterioration and MRI findings. RESULTS TES-MEP could be elicited in all patients, including those with recurrent glioma (31 patients) and preoperative paresis (20 patients). Five of 73 patients without preoperative paresis showed a significant increase in threshold level, and all of them developed new paresis postoperatively (transient in 4 patients and permanent in 1 patient). 
Eight of 20 patients with preoperative paresis showed a significant increase in threshold level, and all of them developed postoperative neurological deterioration (transient in 4 patients and permanent in 4 patients). In 80 patients no significant change in threshold level was detected, and none of them showed postoperative neurological deterioration. The specificity and sensitivity in this series were estimated at 100%. Postoperative MRI revealed gross-total tumor resection in 56 of 82 patients (68%) in whom complete tumor resection was attainable; territorial ischemia was detected in 4 patients. CONCLUSIONS The novel threshold criterion has made TES-MEP a useful method for predicting postoperative motor deficit in patients who undergo glioma surgery, and has been feasible in patients with preoperative paresis as well as in patients with recurrent glioma. Including contra- and ipsilateral changes in threshold level has led to a high sensitivity and specificity.
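The warning criterion described above, a contralateral threshold-level increase more than 20 percentage points beyond the ipsilateral increase, reduces to a simple comparison. The voltages in the usage example are hypothetical, not values from the study.

```python
def pct_increase(baseline_v, current_v):
    """Percentage increase of the stimulation threshold voltage from the
    baseline set before dural opening."""
    return 100.0 * (current_v - baseline_v) / baseline_v

def significant_alteration(contra_increase_pct, ipsi_increase_pct, margin_pct=20.0):
    """Flag a warning when the contralateral threshold-level increase exceeds
    the ipsilateral increase by more than the 20-point margin from the abstract."""
    return contra_increase_pct - ipsi_increase_pct > margin_pct

# Hypothetical intraoperative readings (volts)
contra = pct_increase(baseline_v=40.0, current_v=55.0)  # 37.5% increase
ipsi = pct_increase(baseline_v=40.0, current_v=44.0)    # 10.0% increase
print(significant_alteration(contra, ipsi))
```

Referencing the ipsilateral side subtracts systemic drift (anesthesia depth, temperature) that raises both sides' thresholds, which is why the criterion uses the difference rather than the contralateral increase alone.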

  13. The validity of activity monitors for measuring sleep in elite athletes.

    PubMed

    Sargent, Charli; Lastella, Michele; Halson, Shona L; Roach, Gregory D

    2016-10-01

    There is a growing interest in monitoring the sleep of elite athletes. Polysomnography is considered the gold standard for measuring sleep; however, this technique is impractical if the aim is to collect data simultaneously from multiple athletes over consecutive nights. Activity monitors may be a suitable alternative for monitoring sleep, but these devices have not been validated against polysomnography in a population of elite athletes. Participants (n=16) were endurance-trained cyclists participating in a 6-week training camp. A total of 122 nights of sleep were recorded with polysomnography and activity monitors simultaneously. Agreement, sensitivity, and specificity were calculated from epoch-for-epoch comparisons of polysomnography and activity monitor data. Sleep variables derived from polysomnography and activity monitors were compared using paired t-tests. Activity monitor data were analysed using low, medium, and high sleep-wake thresholds. Epoch-for-epoch comparisons showed good agreement between activity monitors and polysomnography for each sleep-wake threshold (81-90%). Activity monitors were sensitive to sleep (81-92%), but specificity differed depending on the threshold applied (67-82%). Activity monitors underestimated sleep duration (18-90 min) and overestimated wake duration (4-77 min) depending on the threshold applied. Applying the correct sleep-wake threshold is important when using activity monitors to measure the sleep of elite athletes. For example, the default sleep-wake threshold (>40 activity counts = wake) underestimates sleep duration by ∼50 min and overestimates wake duration by ∼40 min. In contrast, sleep-wake thresholds that have a high sensitivity to sleep (>80 activity counts = wake) yield the best combination of agreement, sensitivity, and specificity. Copyright © 2015 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
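Epoch-for-epoch agreement, sensitivity, and specificity as used above can be computed from paired per-epoch sleep/wake scores. This minimal sketch treats sleep as the positive class, matching the abstract's definitions (sensitivity = ability to detect sleep, specificity = ability to detect wake); the toy score sequences are illustrative.

```python
def epoch_metrics(psg, act):
    """Epoch-for-epoch validation of actigraphy against polysomnography.
    Both inputs are sequences of 1 (sleep) / 0 (wake), one value per epoch."""
    pairs = list(zip(psg, act))
    tp = sum(1 for p, a in pairs if p == 1 and a == 1)  # sleep scored as sleep
    tn = sum(1 for p, a in pairs if p == 0 and a == 0)  # wake scored as wake
    fp = sum(1 for p, a in pairs if p == 0 and a == 1)
    fn = sum(1 for p, a in pairs if p == 1 and a == 0)
    agreement = (tp + tn) / len(pairs)
    sensitivity = tp / (tp + fn)   # proportion of PSG sleep epochs detected
    specificity = tn / (tn + fp)   # proportion of PSG wake epochs detected
    return agreement, sensitivity, specificity

psg = [1, 1, 1, 1, 0, 0, 1, 1, 0, 1]
act = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
print(epoch_metrics(psg, act))
```

Raising the activity-count threshold for scoring wake makes the monitor call more epochs sleep, which is why the abstract's higher-threshold setting trades a little specificity for higher sensitivity.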

  14. Exploring the utility of real-time hydrologic data for landslide early warning

    NASA Astrophysics Data System (ADS)

    Mirus, B. B.; Smith, J. B.; Becker, R.; Baum, R. L.; Koss, E.

    2017-12-01

    Early warning systems can provide critical information for operations managers, emergency planners, and the public to help reduce fatalities, injuries, and economic losses due to landsliding. For shallow, rainfall-triggered landslides, early warning systems typically use empirical rainfall thresholds, whereas the actual triggering mechanism involves the non-linear hydrological processes of infiltration, evapotranspiration, and hillslope drainage that are more difficult to quantify. Because hydrologic monitoring has demonstrated that shallow landslides are often preceded by a rise in soil moisture and pore-water pressures, some researchers have developed early warning criteria that attempt to account for these antecedent wetness conditions through relatively simplistic storage metrics or soil-water balance modeling. Here we explore the potential for directly incorporating antecedent wetness into landslide early warning criteria using recent landslide inventories and in-situ hydrologic monitoring near Seattle, WA, and Portland, OR. We use continuous, near-real-time telemetered soil moisture and pore-water pressure data measured within a few landslide-prone hillslopes in combination with measured and forecasted rainfall totals to inform easy-to-interpret landslide initiation thresholds. Objective evaluation using somewhat limited landslide inventories suggests that our new thresholds based on subsurface hydrologic monitoring and rainfall data compare favorably to the capabilities of existing rainfall-only thresholds for the Seattle area, whereas there are no established rainfall thresholds for the Portland area. This preliminary investigation provides a proof-of-concept for the utility of developing landslide early warning criteria in two different geologic settings using real-time subsurface hydrologic measurements from in-situ instrumentation.

  15. Developments in seismic monitoring for risk reduction

    USGS Publications Warehouse

    Celebi, M.

    2007-01-01

    This paper presents recent state-of-the-art developments to obtain displacements and drift ratios for seismic monitoring and damage assessment of buildings. In most cases, decisions on safety of buildings following seismic events are based on visual inspections of the structures. Real-time instrumental measurements using GPS or double integration of accelerations, however, offer a viable alternative. Relevant parameters, such as the type of connections and structural characteristics (including storey geometry), can be estimated to compute drifts corresponding to several pre-selected threshold stages of damage. Drift ratios determined from real-time monitoring can then be compared to these thresholds in order to estimate the damage condition. This approach is demonstrated in three steel frame buildings in San Francisco, California. Recently recorded data of strong shaking from these buildings indicate that the monitoring system can be a useful tool in rapid assessment of buildings and other structures following an earthquake. Such systems can also be used for risk monitoring, as a method to assess performance-based design and analysis procedures, for long-term assessment of structural characteristics of a building, and as a possible long-term damage detection tool.
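A minimal sketch of comparing drift ratios against pre-selected damage-threshold stages follows. The threshold values, story heights, and displacements are illustrative assumptions, not values from the paper.

```python
def drift_ratios(floor_displacements_m, story_heights_m):
    """Inter-story drift ratios: relative displacement between consecutive
    floors divided by the story height."""
    drifts = []
    for i, h in enumerate(story_heights_m):
        drifts.append(abs(floor_displacements_m[i + 1] - floor_displacements_m[i]) / h)
    return drifts

def damage_stage(drift, thresholds=(0.005, 0.010, 0.020)):
    """Count how many pre-selected damage-threshold stages a drift ratio
    exceeds (0 = below all stages; threshold values are illustrative)."""
    return sum(drift > t for t in thresholds)

# Peak displacements (m) at ground, 1st, and 2nd floors during shaking
drifts = drift_ratios([0.0, 0.02, 0.05], story_heights_m=[4.0, 4.0])
stages = [damage_stage(d) for d in drifts]
print(drifts, stages)
```

In a live system the displacements would come from GPS or double-integrated accelerations, so the same comparison can run continuously during and after shaking.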

  16. Real-time seismic monitoring and functionality assessment of a building

    USGS Publications Warehouse

    Celebi, M.; ,

    2005-01-01

    This paper presents recent developments and approaches (using GPS technology and real-time double-integration) to obtain displacements and, in turn, drift ratios, in real-time or near real-time to meet the needs of the engineering and user community in seismic monitoring and assessing the functionality and damage condition of structures. Drift ratios computed in near real-time allow technical assessment of the damage condition of a building. Relevant parameters, such as the type of connections and story structural characteristics (including geometry) are used in computing drifts corresponding to several pre-selected threshold stages of damage. Thus, drift ratios determined from real-time monitoring can be compared to pre-computed threshold drift ratios. The approaches described herein can be used for performance evaluation of structures and can be considered as building health-monitoring applications.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabrera-Palmer, Belkis; Reyna, David; Gerling, Mark D.

    This project aims to develop advanced low-threshold Ge detection technology and to prove its applicability to reactor monitoring via the as-yet undetected coherent neutrino-nucleus scattering (CNNS) process.

  18. Detection of impact damage on thermal protection systems using thin-film piezoelectric sensors for integrated structural health monitoring

    NASA Astrophysics Data System (ADS)

    Na, Jeong K.; Kuhr, Samuel J.; Jata, Kumar V.

    2008-03-01

    Thermal Protection Systems (TPS) can be subjected to impact damage during flight and/or during ground maintenance and repair. AFRL/RXLP is developing a reliable and robust on-board sensing/monitoring capability for next-generation thermal protection systems to detect and assess impact damage. This study focused on two classes of metallic thermal protection tiles to determine the threshold for impact damage and to develop a capability for sensing the impacts. Sensors made of PVDF piezoelectric film were employed and tested to evaluate the detectability of impact signals and assess the onset, or threshold, of impact damage. Testing was performed over a range of impact energy levels, with the sensors adhered to the back of the specimens. The PVDF signal levels were analyzed and compared to assess damage, with digital microscopy, visual inspection, and white light interferometry used for damage verification. Based on the impact test results, an assessment of the impact damage threshold for each type of metallic TPS system was made.

  19. An assessment of nutrients and sedimentation in the St. Thomas East End Reserves, US Virgin Islands.

    PubMed

    Pait, Anthony S; Galdo, Francis R; Ian Hartwell, S; Apeti, Dennis A; Mason, Andrew L

    2018-04-09

    Nutrients and sedimentation were monitored for approximately 2 years at six sites in the St. Thomas East End Reserves (STEER), St. Thomas, USVI, as part of a NOAA project to develop an integrated environmental assessment. Concentrations of ammonium (NH₄⁺) and dissolved inorganic nitrogen (DIN) were higher in Mangrove Lagoon and Benner Bay in the western portion of STEER than in the other sites further east (i.e., Cowpet Bay, Rotto Cay, St. James, and Little St. James). There was no correlation between rainfall and nutrient concentrations. Using a set of suggested nutrient thresholds that have been developed to indicate the potential for the overgrowth of algae on reefs, approximately 60% of the samples collected in STEER were above the threshold for orthophosphate (HPO₄²⁻), while 55% of samples were above the DIN threshold. Benner Bay had the highest sedimentation rate of any site monitored in STEER, including Mangrove Lagoon. There was also an east to west and a north to south gradient in sedimentation, indicative of higher sedimentation rates in the western, more populated areas surrounding STEER, and sites closer to the shore of the main island of St. Thomas. Although none of the sites had a mean or average sedimentation rate above a suggested sedimentation threshold, the mean sedimentation rate in Benner Bay was just below the threshold.

  20. Situational Awareness: A Feasibility Investigation of Near-Threshold Skills Development

    DTIC Science & Technology

    1994-01-01

    management and analysis. TS 1 consisted of a medium resolution (640-pixel x 240-line) 13-in. Magnavox RGB Monitor 80 (Model CM 8562 color video monitor), a...timed relaxation period between each training block (20 trials) and a 10- to 15-min rest period between each training protocol. Refreshments and snacks

  1. Summary of small group discussions: Monitoring objectives and thresholds

    Treesearch

    Patricia Manley

    2013-01-01

    Workshop participants were asked to address sets of questions in small group discussions, which were subsequently brought to the entire group for discussion. The second set of questions was directed at identifying a set of degradation activities that could be a primary focus for developing or refining methods and techniques for monitoring:

  2. Risk management in air protection in the Republic of Croatia.

    PubMed

    Peternel, Renata; Toth, Ivan; Hercog, Predrag

    2014-03-01

    In the Republic of Croatia, under the Air Protection Act, air pollution assessment is obligatory across the whole State territory. A network for permanent air quality monitoring has been established for individual regions and populated areas. The State network consists of stations measuring background pollution and regional and cross-border long-range transport (including measurements made as part of international government obligations), stations measuring air quality in areas of cultural and natural heritage, and stations measuring air pollution in towns and industrial zones. Exceedances of alert and information threshold levels of air pollutants are related to emissions from industrial plants and to accidents; each exceedance represents a threat to human health in the case of short-term exposure. Alert and information threshold levels are monitored at stations of the State and local networks for permanent air quality monitoring, according to the air quality measurement programs of those networks. The State network has a developed automatic system for reporting on alert and information threshold levels, whereas many local networks under the competence of regional and local self-governments still lack fully installed systems of this type. In case of accidents, prompt action at all responsibility levels is necessary to prevent crises, and this requires developed and coordinated units of the State Administration as well as of self-government units. Continuous effort is also needed to improve the implementation of legislative regulations concerning crises related to critical and alert levels of air pollutants, especially at the local level.

  3. Corona-vacuum failure mechanism test facilities

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Mueller, L. A.; Koutnik, E. A.

    1975-01-01

    A nondestructive corona-vacuum test facility for testing high-voltage power system components has been developed using commercially available hardware. The facility simulates operating temperature and vacuum while monitoring corona discharges in residual gases. Corona threshold voltages obtained from statorette tests with various gas-solid dielectric systems, and comparison with calculated data, support the following conclusions: (1) air gives the highest corona threshold voltage and helium the lowest, with argon and helium-xenon mixtures intermediate; (2) corona threshold voltage increases with gas pressure; (3) the corona threshold voltage for an armature winding can be accurately calculated by using Paschen curves for a uniform field; and (4) Paschen curves for argon can be used to calculate the corona threshold voltage in He-Xe mixtures, for which Paschen curves are unavailable.
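
    Conclusion (3) above rests on Paschen's law for a uniform field. The sketch below is illustrative only, not the paper's model: the constants A, B, and the secondary-emission coefficient gamma are commonly quoted textbook values for air, assumed here for demonstration.

```python
import math

# Paschen-law sketch: breakdown/corona threshold voltage for a uniform field
# as a function of the pressure-distance product pd. The constants below are
# commonly quoted textbook values for air (assumptions, not the paper's data).
A = 15.0      # ionization saturation constant, 1/(Torr*cm), air
B = 365.0     # V/(Torr*cm), air
GAMMA = 0.01  # secondary electron emission coefficient (assumed)

def paschen_voltage(pd_torr_cm: float) -> float:
    """Breakdown voltage (V) for a given pd product (Torr*cm)."""
    denom = math.log(A * pd_torr_cm) - math.log(math.log(1.0 + 1.0 / GAMMA))
    if denom <= 0:
        return float("inf")  # left of the Paschen-minimum asymptote
    return B * pd_torr_cm / denom

# On the right branch the threshold voltage rises with pd, consistent with
# the reported increase of corona threshold voltage with gas pressure.
for pd in (1.0, 5.0, 20.0):
    print(f"pd = {pd:5.1f} Torr*cm -> V = {paschen_voltage(pd):8.1f} V")
```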

  4. Cognitive Abilities, Monitoring Confidence, and Control Thresholds Explain Individual Differences in Heuristics and Biases

    PubMed Central

    Jackson, Simon A.; Kleitman, Sabina; Howie, Pauline; Stankov, Lazar

    2016-01-01

    In this paper, we investigate whether individual differences in performance on heuristic and biases tasks can be explained by cognitive abilities, monitoring confidence, and control thresholds. Current theories explain individual differences in these tasks by the ability to detect errors and override automatic but biased judgments, and deliberative cognitive abilities that help to construct the correct response. Here we retain cognitive abilities but disentangle error detection, proposing that lower monitoring confidence and higher control thresholds promote error checking. Participants (N = 250) completed tasks assessing their fluid reasoning abilities, stable monitoring confidence levels, and the control threshold they impose on their decisions. They also completed seven typical heuristic and biases tasks such as the cognitive reflection test and Resistance to Framing. Using structural equation modeling, we found that individuals with higher reasoning abilities, lower monitoring confidence, and higher control threshold performed significantly and, at times, substantially better on the heuristic and biases tasks. Individuals with higher control thresholds also showed lower preferences for risky alternatives in a gambling task. Furthermore, residual correlations among the heuristic and biases tasks were reduced to null, indicating that cognitive abilities, monitoring confidence, and control thresholds accounted for their shared variance. Implications include the proposal that the capacity to detect errors does not differ between individuals. Rather, individuals might adopt varied strategies that promote error checking to different degrees, regardless of whether they have made a mistake or not. The results support growing evidence that decision-making involves cognitive abilities that construct actions and monitoring and control processes that manage their initiation. PMID:27790170

  5. Cognitive Abilities, Monitoring Confidence, and Control Thresholds Explain Individual Differences in Heuristics and Biases.

    PubMed

    Jackson, Simon A; Kleitman, Sabina; Howie, Pauline; Stankov, Lazar

    2016-01-01

    In this paper, we investigate whether individual differences in performance on heuristic and biases tasks can be explained by cognitive abilities, monitoring confidence, and control thresholds. Current theories explain individual differences in these tasks by the ability to detect errors and override automatic but biased judgments, and deliberative cognitive abilities that help to construct the correct response. Here we retain cognitive abilities but disentangle error detection, proposing that lower monitoring confidence and higher control thresholds promote error checking. Participants (N = 250) completed tasks assessing their fluid reasoning abilities, stable monitoring confidence levels, and the control threshold they impose on their decisions. They also completed seven typical heuristic and biases tasks such as the cognitive reflection test and Resistance to Framing. Using structural equation modeling, we found that individuals with higher reasoning abilities, lower monitoring confidence, and higher control threshold performed significantly and, at times, substantially better on the heuristic and biases tasks. Individuals with higher control thresholds also showed lower preferences for risky alternatives in a gambling task. Furthermore, residual correlations among the heuristic and biases tasks were reduced to null, indicating that cognitive abilities, monitoring confidence, and control thresholds accounted for their shared variance. Implications include the proposal that the capacity to detect errors does not differ between individuals. Rather, individuals might adopt varied strategies that promote error checking to different degrees, regardless of whether they have made a mistake or not. The results support growing evidence that decision-making involves cognitive abilities that construct actions and monitoring and control processes that manage their initiation.

  6. Monitoring subsurface hydrologic response for precipitation-induced shallow landsliding in the San Francisco Bay area, California, USA

    USGS Publications Warehouse

    Collins, Brian D.; Stock, Jonathan; Weber, Lisa C.; Whitman, K.; Knepprath, N.

    2012-01-01

    Intense winter storms in the San Francisco Bay area (SFBA) of California, USA often trigger shallow landslides. Some of these landslides mobilize into potentially hazardous debris flows. A growing body of research indicates that rainfall intensity-duration thresholds are insufficient for accurate prediction of landslide occurrence. In response, we have begun long-term monitoring of the hydrologic response of landslide-prone hillslopes to rainfall in several areas of the SFBA. Each monitoring site is equipped with sensors for measuring soil moisture content and piezometric pressure at several soil depths, along with a rain gauge connected to a cell phone or satellite telemetered data logger. The data are transmitted in near-real-time, providing the ability to monitor hydrologic conditions before, during, and after storms. Results are guiding the establishment of both antecedent and storm-specific rainfall and moisture content thresholds which must be achieved before landslide-causative positive pore water pressures are generated. Although widespread shallow landsliding has not yet occurred since the deployment of the monitoring sites, several isolated landslides have been observed in the area of monitoring. The landslides occurred during a period when positive pore water pressures were measured as a result of intense rainfall that followed higher-than-average seasonal precipitation totals. Continued monitoring and analysis will further guide the establishment of more generalized thresholds for different regions of the SFBA and contribute to the development and calibration of physically based predictive models.
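
    The intensity-duration thresholds this abstract argues are insufficient on their own have a standard power-law form, I = a·D^(−b). A minimal check against such a threshold can be sketched as follows; the coefficients are Caine's (1980) widely cited global values, used purely as an example and not taken from this study.

```python
# Illustrative rainfall intensity-duration (I-D) threshold check of the form
# I = a * D**(-b). Coefficients are Caine's (1980) global values (an example
# only); the study above argues that such thresholds need to be supplemented
# with subsurface moisture and pore-pressure observations.
A_COEF, B_EXP = 14.82, 0.39  # I in mm/h, D in hours

def exceeds_id_threshold(intensity_mm_h: float, duration_h: float) -> bool:
    """True if a storm plots above the I-D landslide threshold."""
    return intensity_mm_h > A_COEF * duration_h ** (-B_EXP)

print(exceeds_id_threshold(10.0, 6.0))  # moderate 6-hour storm
print(exceeds_id_threshold(2.0, 6.0))   # light 6-hour storm
```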

  7. Estimating population extinction thresholds with categorical classification trees for Louisiana black bears

    USGS Publications Warehouse

    Laufenberg, Jared S.; Clark, Joseph D.; Chandler, Richard B.

    2018-01-01

    Monitoring vulnerable species is critical for their conservation. Thresholds or tipping points are commonly used to indicate when populations become vulnerable to extinction and to trigger changes in conservation actions. However, quantitative methods to determine such thresholds have not been well explored. The Louisiana black bear (Ursus americanus luteolus) was removed from the list of threatened and endangered species under the U.S. Endangered Species Act in 2016, and our objectives were to determine the most appropriate parameters and thresholds for monitoring and management action. Capture-mark-recapture (CMR) data from 2006 to 2012 were used to estimate population parameters and variances. We used stochastic population simulations and conditional classification trees to identify demographic rates for monitoring that would be most indicative of heightened extinction risk. We then identified thresholds that would be reliable predictors of population viability. Conditional classification trees indicated that annual apparent survival rates for adult females averaged over 5 years () was the best predictor of population persistence. Specifically, population persistence was estimated to be ≥95% over 100 years when , suggesting that this statistic can be used as a threshold to trigger management intervention. Our evaluation produced monitoring protocols that reliably predicted population persistence and were cost-effective. We conclude that population projections and conditional classification trees can be valuable tools for identifying extinction thresholds used in monitoring programs.

  8. Estimating population extinction thresholds with categorical classification trees for Louisiana black bears.

    PubMed

    Laufenberg, Jared S; Clark, Joseph D; Chandler, Richard B

    2018-01-01

    Monitoring vulnerable species is critical for their conservation. Thresholds or tipping points are commonly used to indicate when populations become vulnerable to extinction and to trigger changes in conservation actions. However, quantitative methods to determine such thresholds have not been well explored. The Louisiana black bear (Ursus americanus luteolus) was removed from the list of threatened and endangered species under the U.S. Endangered Species Act in 2016, and our objectives were to determine the most appropriate parameters and thresholds for monitoring and management action. Capture-mark-recapture (CMR) data from 2006 to 2012 were used to estimate population parameters and variances. We used stochastic population simulations and conditional classification trees to identify demographic rates for monitoring that would be most indicative of heightened extinction risk. We then identified thresholds that would be reliable predictors of population viability. Conditional classification trees indicated that annual apparent survival rates for adult females averaged over 5 years ([Formula: see text]) was the best predictor of population persistence. Specifically, population persistence was estimated to be ≥95% over 100 years when [Formula: see text], suggesting that this statistic can be used as a threshold to trigger management intervention. Our evaluation produced monitoring protocols that reliably predicted population persistence and were cost-effective. We conclude that population projections and conditional classification trees can be valuable tools for identifying extinction thresholds used in monitoring programs.
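
    The core idea in these two records, sweeping a demographic rate through stochastic population projections until persistence probability crosses 95%, can be sketched in a few lines. This is not the authors' model: the recruitment rate, starting population, and carrying-capacity cap below are invented for illustration.

```python
import random

# Minimal sketch (not the authors' model) of using stochastic population
# projections to locate a persistence threshold on adult female survival.
# The recruitment rate, starting size, and cap are invented for illustration.
def persistence_prob(survival, recruit=0.35, n0=40, years=80, sims=100, seed=1):
    """Fraction of simulated populations still extant after `years` years."""
    rng = random.Random(seed)
    persisted = 0
    for _ in range(sims):
        n = n0
        for _ in range(years):
            survivors = sum(rng.random() < survival for _ in range(n))
            recruits = sum(rng.random() < recruit for _ in range(n))
            n = min(survivors + recruits, 150)  # crude carrying-capacity cap
            if n == 0:
                break
        persisted += n > 0
    return persisted / sims

# Sweep survival rates; the smallest rate giving >=95% persistence plays the
# role of the survival threshold that triggers management intervention.
for s in (0.50, 0.75, 0.90):
    print(s, persistence_prob(s))
```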

  9. Recent advances to obtain real-time displacements for engineering applications

    USGS Publications Warehouse

    Celebi, M.

    2005-01-01

    This paper presents recent developments and approaches (using GPS technology and real-time double integration) to obtain displacements and, in turn, drift ratios, in real time or near-real time, to meet the needs of the engineering and user community in seismic monitoring and in assessing the functionality and damage condition of structures. Drift ratios computed in near-real time allow technical assessment of the damage condition of a building. Relevant parameters, such as the type of connections and story structural characteristics (including geometry), are used in computing drifts corresponding to several pre-selected threshold stages of damage. Thus, drift ratios determined from real-time monitoring can be compared to pre-computed threshold drift ratios. The approaches described herein can be used for performance evaluation of structures and can be considered building health-monitoring applications.
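
    The drift-ratio comparison described above reduces to dividing relative floor displacement by story height and checking it against pre-computed limits. A minimal sketch, in which the displacements and the threshold values are invented placeholders rather than the paper's numbers:

```python
# Sketch of the drift-ratio check described above. The displacement values
# and damage-state thresholds are invented placeholders, not the paper's.
DAMAGE_THRESHOLDS = {"minor": 0.002, "moderate": 0.007, "severe": 0.015}

def drift_ratio(disp_upper_m, disp_lower_m, story_height_m):
    """Interstory drift ratio from displacements of two adjacent floors."""
    return abs(disp_upper_m - disp_lower_m) / story_height_m

def damage_state(ratio):
    """Highest pre-selected threshold stage reached by the drift ratio."""
    state = "none"
    for name, limit in DAMAGE_THRESHOLDS.items():  # ascending limits
        if ratio >= limit:
            state = name
    return state

# Displacements as obtained from GPS or double-integrated accelerations.
r = drift_ratio(0.045, 0.012, 3.5)
print(round(r, 4), damage_state(r))
```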

  10. A new temperature threshold detector - Application to missile monitoring

    NASA Astrophysics Data System (ADS)

    Coston, C. J.; Higgins, E. V.

    Comprehensive thermal surveys within the case of solid propellant ballistic missile flight motors are highly desirable. For example, a problem involving motor failures due to insulator cracking at motor ignition, which took several years to solve, could have been identified immediately on the basis of a suitable thermal survey. Using conventional point measurements, such as those utilizing typical thermocouples, for such a survey on a full-scale motor is not feasible because of the great number of sensors and measurements required. An alternate approach recognizes that temperatures below a threshold (which depends on the material being monitored) are acceptable, but higher temperatures exceed design margins. In this case hot spots can be located by a grid of wire-like sensors which are sensitive to temperature above the threshold anywhere along the sensor. A new type of temperature threshold detector is being developed for flight missile use. The device consists of KNO3 separating copper and Constantan metals. Above the melting point of KNO3, galvanic action provides a voltage output of a few tenths of a volt.

  11. Thresholds for conservation and management: structured decision making as a conceptual framework

    USGS Publications Warehouse

    Nichols, James D.; Eaton, Mitchell J.; Martin, Julien; Edited by Guntenspergen, Glenn R.

    2014-01-01

    Ecological thresholds are values of system state variables at which small changes produce substantial changes in system dynamics. They are frequently incorporated into ecological models used to project system responses to management actions. Utility thresholds are components of management objectives and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. Decision thresholds are derived from the other components of the decision process. We advocate a structured decision making (SDM) approach within which the following components are identified: objectives (possibly including utility thresholds), potential actions, models (possibly including ecological thresholds), monitoring program, and a solution algorithm (which produces decision thresholds). Adaptive resource management (ARM) is described as a special case of SDM developed for recurrent decision problems that are characterized by uncertainty. We believe that SDM, in general, and ARM, in particular, provide good approaches to conservation and management. Use of SDM and ARM also clarifies the distinct roles of ecological thresholds, utility thresholds, and decision thresholds in informed decision processes.

  12. Estimating the susceptibility of surface water in Texas to nonpoint-source contamination by use of logistic regression modeling

    USGS Publications Warehouse

    Battaglin, William A.; Ulery, Randy L.; Winterstein, Thomas; Welborn, Toby

    2003-01-01

    In the State of Texas, surface water (streams, canals, and reservoirs) and ground water are used as sources of public water supply. Surface-water sources of public water supply are susceptible to contamination from point and nonpoint sources. To help protect sources of drinking water and to aid water managers in designing protective yet cost-effective and risk-mitigated monitoring strategies, the Texas Commission on Environmental Quality and the U.S. Geological Survey developed procedures to assess the susceptibility of public water-supply source waters in Texas to the occurrence of 227 contaminants. One component of the assessments is the determination of susceptibility of surface-water sources to nonpoint-source contamination. To accomplish this, water-quality data at 323 monitoring sites were matched with geographic information system-derived watershed-characteristic data for the watersheds upstream from the sites. Logistic regression models then were developed to estimate the probability that a particular contaminant will exceed a threshold concentration specified by the Texas Commission on Environmental Quality. Logistic regression models were developed for 63 of the 227 contaminants. Of the remaining contaminants, 106 were not modeled because monitoring data were available at less than 10 percent of the monitoring sites; 29 were not modeled because there were less than 15 percent detections of the contaminant in the monitoring data; 27 were not modeled because of the lack of any monitoring data; and 2 were not modeled because threshold values were not specified.
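
    The modeling step described above, logistic regression for the probability that a contaminant exceeds its threshold concentration, can be sketched as below. The data are synthetic, and the single standardized predictor stands in for a watershed characteristic such as percent agricultural land use; none of this is taken from the actual study.

```python
import math, random

# Sketch of the approach described above: logistic regression estimating the
# probability that a contaminant exceeds its threshold concentration, given
# a watershed characteristic. Data are synthetic; the standardized predictor
# z stands in for something like percent agricultural land use.
random.seed(0)
n = 300
X = [(random.uniform(0, 100) - 50.0) / 29.0 for _ in range(n)]  # standardized
y = [1 if random.random() < 1 / (1 + math.exp(-2.3 * z)) else 0 for z in X]

# Fit intercept and slope by plain gradient ascent on the log-likelihood.
b0 = b1 = 0.0
lr = 1e-3
for _ in range(3000):
    g0 = g1 = 0.0
    for z, yi in zip(X, y):
        p = 1 / (1 + math.exp(-(b0 + b1 * z)))
        g0 += yi - p
        g1 += (yi - p) * z
    b0 += lr * g0
    b1 += lr * g1

def p_exceed(z):
    """Estimated probability that the threshold concentration is exceeded."""
    return 1 / (1 + math.exp(-(b0 + b1 * z)))

print(f"slope = {b1:.2f}; P(exceed) at z=+1.5: {p_exceed(1.5):.2f}, "
      f"at z=-1.5: {p_exceed(-1.5):.2f}")
```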

  13. Organic Scintillation Detectors for Spectroscopic Radiation Portal Monitors

    NASA Astrophysics Data System (ADS)

    Paff, Marc Gerrit

    Thousands of radiation portal monitors have been deployed worldwide to detect and deter the smuggling of nuclear and radiological materials that could be used in nefarious acts. Radiation portal monitors are often installed at bottlenecks where large numbers of people or goods must pass. Examples of use include scanning cargo containers at shipping ports, vehicles at border crossings, and people at high-profile functions and events. Traditional radiation portal monitors contain separate detectors for passively measuring neutron and gamma-ray count rates. 3He tubes embedded in polyethylene and slabs of plastic scintillator are the most common detector materials used in radiation portal monitors. The radiation portal monitor alarm mechanism relies on measuring radiation count rates above user-defined alarm thresholds. These alarm thresholds are set above natural background count rates; setting them requires balancing false alarms caused by natural background against sensitivity to weakly emitting threat sources. Current radiation portal monitor designs suffer from frequent nuisance radiation alarms. These nuisance alarms are most frequently caused by shipments of cargo containing large quantities of naturally occurring radioactive material, such as kitty litter, and by people who have recently undergone a nuclear medicine procedure, particularly with 99mTc. Current radiation portal monitors typically lack spectroscopic capabilities, so nuisance alarms must be screened out in time-intensive secondary inspections with handheld radiation detectors. Radiation portal monitors using organic liquid scintillation detectors were designed, built, and tested. A number of algorithms were developed to perform on-the-fly radionuclide identification of single and combined radiation sources moving past the portal monitor at speeds up to 2.2 m/s. The portal monitor designs were tested extensively with a variety of shielded and unshielded radiation sources, including special nuclear material, at the European Commission Joint Research Centre in Ispra, Italy. Common medical isotopes were measured at the C.S. Mott Children's Hospital and added to the radionuclide identification algorithms.
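
    The count-rate alarm logic that the abstract contrasts with spectroscopic identification can be sketched simply: under Poisson counting statistics, the threshold is set k standard deviations above the expected background counts for a dwell interval. This is a generic sketch, not the authors' algorithm, and the background rate below is an invented placeholder.

```python
import math

# Generic k-sigma gross-count alarm sketch (not the authors' spectroscopic
# algorithms): under Poisson statistics the background count variance equals
# its mean, so the threshold sits k standard deviations above the mean.
# Raising k lowers the false alarm rate but reduces sensitivity.
def alarm_threshold(background_cps: float, dwell_s: float, k: float = 4.0) -> float:
    """Gross-count alarm threshold for one dwell interval."""
    mean_counts = background_cps * dwell_s
    return mean_counts + k * math.sqrt(mean_counts)

def alarms(measured_counts: float, background_cps: float, dwell_s: float) -> bool:
    return measured_counts > alarm_threshold(background_cps, dwell_s)

bg, dwell = 250.0, 0.5  # invented background rate (cps) and dwell time (s)
print(alarm_threshold(bg, dwell))            # about 170 counts
print(alarms(200, bg, dwell), alarms(140, bg, dwell))
```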

  14. Estimating Alarm Thresholds for Process Monitoring Data under Different Assumptions about the Data Generating Mechanism

    DOE PAGES

    Burr, Tom; Hamada, Michael S.; Howell, John; ...

    2013-01-01

    Process monitoring (PM) for nuclear safeguards sometimes requires estimation of thresholds corresponding to small false alarm rates. Threshold estimation dates to the 1920s with the Shewhart control chart; however, because possible new roles for PM are being evaluated in nuclear safeguards, it is timely to consider modern model selection options in the context of threshold estimation. One of the possible new PM roles involves PM residuals, where a residual is defined as residual = data − prediction. This paper reviews alarm threshold estimation, introduces model selection options, and considers a range of assumptions regarding the data-generating mechanism for PM residuals. Two PM examples from nuclear safeguards are included to motivate the need for alarm threshold estimation. The first example involves mixtures of probability distributions that arise in solution monitoring, which is a common type of PM. The second example involves periodic partial cleanout of in-process inventory, leading to challenging structure in the time series of PM residuals.
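
    One distribution-free way to tie a threshold to a target false alarm rate, relevant when the residual distribution is a mixture or otherwise non-standard, is to take an empirical quantile of historical residuals. A minimal sketch with synthetic residuals (this is one generic option, not the specific estimators compared in the paper):

```python
import random

# Distribution-free alarm-threshold sketch for PM residuals
# (residual = data - prediction): place the threshold at the empirical
# quantile matching the desired false alarm rate. Residuals are synthetic.
random.seed(42)
residuals = [random.gauss(0.0, 1.0) for _ in range(5000)]

def empirical_threshold(res, false_alarm_rate):
    """Approximate value exceeded by `false_alarm_rate` of the residuals."""
    ordered = sorted(res)
    idx = min(int(round((1.0 - false_alarm_rate) * len(ordered))),
              len(ordered) - 1)
    return ordered[idx]

t = empirical_threshold(residuals, 0.001)  # target: one alarm per 1000 samples
rate = sum(r > t for r in residuals) / len(residuals)
print(round(t, 2), rate)
```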

  15. Comparison of alternatives to amplitude thresholding for onset detection of acoustic emission signals

    NASA Astrophysics Data System (ADS)

    Bai, F.; Gagar, D.; Foote, P.; Zhao, Y.

    2017-02-01

    Acoustic Emission (AE) monitoring can be used to detect the presence of damage as well as determine its location in Structural Health Monitoring (SHM) applications. Information on the time difference of the signal generated by the damage event arriving at different sensors in an array is essential in performing localisation. Currently, this is determined using a fixed threshold, which is particularly prone to errors when not set to optimal values. This paper presents three new methods for determining the onset of AE signals without the need for a predetermined threshold. The performance of the techniques is evaluated using AE signals generated during fatigue crack growth and compared to the established Akaike Information Criterion (AIC) and fixed-threshold methods. It was found that the 1D location accuracy of the new methods was within the range of <1% to 7.1% of the monitored region, compared to 2.7% for the AIC method and a range of 1.8-9.4% for the conventional fixed-threshold method at different threshold levels.
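
    The AIC baseline mentioned above has a standard form: the onset is the sample index that minimises a two-segment variance criterion. A minimal sketch on a synthetic trace (the burst model below is invented for illustration):

```python
import math, random

# Sketch of the standard AIC onset picker used as a baseline in the paper:
# the onset is the index k minimising
#   AIC(k) = k*ln(var(x[:k])) + (N-k-1)*ln(var(x[k:])).
def _var(seg):
    m = sum(seg) / len(seg)
    return max(sum((s - m) ** 2 for s in seg) / len(seg), 1e-12)  # guard log(0)

def aic_onset(x):
    n = len(x)
    best_k, best_aic = 2, float("inf")
    for k in range(2, n - 2):
        aic = k * math.log(_var(x[:k])) + (n - k - 1) * math.log(_var(x[k:]))
        if aic < best_aic:
            best_k, best_aic = k, aic
    return best_k

# Synthetic AE trace: sensor noise, then a decaying burst from sample 300.
random.seed(7)
sig = [random.gauss(0.0, 0.01) for _ in range(1000)]
for i in range(300, 1000):
    sig[i] += math.exp(-(i - 300) / 150.0) * math.sin(0.3 * (i - 300))

print(aic_onset(sig))  # picks the onset near sample 300
```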

  16. Threshold Values for Identification of Contamination Predicted by Reduced-Order Models

    DOE PAGES

    Last, George V.; Murray, Christopher J.; Bott, Yi-Ju; ...

    2014-12-31

    The U.S. Department of Energy’s (DOE’s) National Risk Assessment Partnership (NRAP) Project is developing reduced-order models to evaluate potential impacts on underground sources of drinking water (USDWs) if CO2 or brine leaks from deep CO2 storage reservoirs. Threshold values, below which there would be no predicted impacts, were determined for portions of two aquifer systems. These threshold values were calculated using an interwell approach for determining background groundwater concentrations that is an adaptation of methods described in the U.S. Environmental Protection Agency’s Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities.
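
    The background-threshold idea can be illustrated with a simplified stand-in for the interwell procedure: pool background measurements and set the threshold at an upper prediction limit. The sketch below uses a normal approximation and synthetic data; it is not the EPA Unified Guidance calculation itself.

```python
import random
from statistics import NormalDist, mean, stdev

# Simplified interwell-style background threshold (a stand-in for the EPA
# Unified Guidance procedures cited above, not a reproduction of them):
# pooled background data, threshold at an upper prediction limit computed
# with a normal approximation. Data are synthetic.
random.seed(3)
background = [random.gauss(12.0, 2.0) for _ in range(60)]  # e.g. mg/L, invented

def upper_prediction_limit(obs, confidence=0.99):
    m, s, n = mean(obs), stdev(obs), len(obs)
    z = NormalDist().inv_cdf(confidence)  # large-n approximation to the t limit
    return m + z * s * (1.0 + 1.0 / n) ** 0.5

threshold = upper_prediction_limit(background)
print(round(threshold, 2))  # new measurements above this flag potential impact
```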

  17. Why Does Threshold Level Change in Transcranial Motor-evoked Potentials During Surgery for Supratentorial Lesions?

    PubMed

    Abboud, Tammam; Huckhagel, Torge; Stork, Jan-Henrich; Hamel, Wolfgang; Schwarz, Cindy; Vettorazzi, Eik; Westphal, Manfred; Martens, Tobias

    2017-10-01

    A rising threshold level during monitoring of motor-evoked potentials (MEP) using transcranial electrical stimulation (TES) has been described without damage to the motor pathway in cranial surgery, suggesting the need for monitoring of both the affected and unaffected hemispheres. We aimed to determine the factors that lead to a change in threshold level and to establish reliable criteria for adjusting stimulation intensity during surgery for supratentorial lesions. Between October 2014 and October 2015, TES-MEP were performed in 143 patients during surgery for unilateral supratentorial lesions in motor-eloquent brain areas. All procedures were performed under general anesthesia using a strict protocol to maintain stable blood pressure. MEP were evaluated bilaterally to assess the percentage increase in threshold level, which was considered significant if it exceeded 20% on the contralateral side beyond the percentage increase on the ipsilateral side. Patients who developed a postoperative motor deficit were excluded. The volume of subdural air was measured on postoperative magnetic resonance imaging. Logistic regression was performed to identify factors associated with the intraoperatively recorded changes in threshold level. A total of 123 patients were included in the study. On the affected side, 82 patients (66.7%) showed an increase in threshold level, ranging from 2% to 48%, and 41 patients (33.3%) did not show any change. The difference from the unaffected side was under 20% in all patients. The recorded range of changes in systolic and mean pressure did not exceed 20 mm Hg in any of the patients. Pneumocephalus was detected on postoperative magnetic resonance imaging scans in 87 patients (70.7%), and 81 of them (93.1%) had an intraoperative increase in threshold level on either side. Pneumocephalus was the only factor associated with an increase in threshold level on the affected side (P<0.001), while pneumocephalus and length of the procedure each correlated with a change in threshold level on the unaffected side (P<0.001 and P=0.032, respectively). Pneumocephalus was the only factor associated with an increase in threshold level during MEP monitoring without damage to the motor pathway. The threshold level on the affected side can rise by up to 48% without being predictive of postoperative paresis, as long as the difference between the increased thresholds of the affected and unaffected sides is within 20%. Changes in systolic or mean blood pressure within a range of 20 mm Hg do not seem to influence intraoperative MEP.

  18. Intensity Thresholds on Raw Acceleration Data: Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) Approaches

    PubMed Central

    Bakrania, Kishan; Yates, Thomas; Rowlands, Alex V.; Esliger, Dale W.; Bunnewell, Sarah; Sanders, James; Davies, Melanie; Khunti, Kamlesh; Edwardson, Charlotte L.

    2016-01-01

    Objectives (1) To develop and internally validate Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) thresholds for separating sedentary behaviours from common light-intensity physical activities using raw acceleration data collected from both hip- and wrist-worn tri-axial accelerometers; and (2) to compare and evaluate the performances of the ENMO and MAD metrics. Methods Thirty-three adults [mean age (standard deviation (SD)) = 27.4 (5.9) years; mean BMI (SD) = 23.9 (3.7) kg/m2; 20 females (60.6%)] wore four accelerometers: an ActiGraph GT3X+ and a GENEActiv on the right hip, and an ActiGraph GT3X+ and a GENEActiv on the non-dominant wrist. Under laboratory conditions, participants performed 16 different activities (11 sedentary behaviours and 5 light-intensity physical activities) for 5 minutes each. ENMO and MAD were computed from the raw acceleration data, and logistic regression and receiver-operating-characteristic (ROC) analyses were implemented to derive thresholds for activity discrimination. Areas under ROC curves (AUROC) were calculated to summarise performances, and thresholds were assessed via leave-one-out cross-validations. Results For both hip and wrist monitor placements, in comparison to the ActiGraph GT3X+ monitors, the ENMO and MAD values derived from the GENEActiv devices were observed to be slightly higher, particularly for the lower-intensity activities. Monitor-specific hip and wrist ENMO and MAD thresholds showed excellent ability for separating sedentary behaviours from motion-based light-intensity physical activities (in general, AUROCs >0.95), with validation indicating robustness. However, poor classification was experienced when attempting to isolate standing still from sedentary behaviours (in general, AUROCs <0.65). The ENMO and MAD metrics tended to perform similarly across activities and accelerometer brands. Conclusions Researchers can utilise these robust monitor-specific hip and wrist ENMO and MAD thresholds to accurately separate sedentary behaviours from common motion-based light-intensity physical activities. However, caution should be taken if isolating sedentary behaviours from standing is of particular interest. PMID:27706241

  19. Intensity Thresholds on Raw Acceleration Data: Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) Approaches.

    PubMed

    Bakrania, Kishan; Yates, Thomas; Rowlands, Alex V; Esliger, Dale W; Bunnewell, Sarah; Sanders, James; Davies, Melanie; Khunti, Kamlesh; Edwardson, Charlotte L

    2016-01-01

    (1) To develop and internally-validate Euclidean Norm Minus One (ENMO) and Mean Amplitude Deviation (MAD) thresholds for separating sedentary behaviours from common light-intensity physical activities using raw acceleration data collected from both hip- and wrist-worn tri-axial accelerometers; and (2) to compare and evaluate the performances between the ENMO and MAD metrics. Thirty-three adults [mean age (standard deviation (SD)) = 27.4 (5.9) years; mean BMI (SD) = 23.9 (3.7) kg/m2; 20 females (60.6%)] wore four accelerometers; an ActiGraph GT3X+ and a GENEActiv on the right hip; and an ActiGraph GT3X+ and a GENEActiv on the non-dominant wrist. Under laboratory-conditions, participants performed 16 different activities (11 sedentary behaviours and 5 light-intensity physical activities) for 5 minutes each. ENMO and MAD were computed from the raw acceleration data, and logistic regression and receiver-operating-characteristic (ROC) analyses were implemented to derive thresholds for activity discrimination. Areas under ROC curves (AUROC) were calculated to summarise performances and thresholds were assessed via executing leave-one-out-cross-validations. For both hip and wrist monitor placements, in comparison to the ActiGraph GT3X+ monitors, the ENMO and MAD values derived from the GENEActiv devices were observed to be slightly higher, particularly for the lower-intensity activities. Monitor-specific hip and wrist ENMO and MAD thresholds showed excellent ability for separating sedentary behaviours from motion-based light-intensity physical activities (in general, AUROCs >0.95), with validation indicating robustness. However, poor classification was experienced when attempting to isolate standing still from sedentary behaviours (in general, AUROCs <0.65). The ENMO and MAD metrics tended to perform similarly across activities and accelerometer brands. 
Researchers can use these robust monitor-specific hip and wrist ENMO and MAD thresholds to accurately separate sedentary behaviours from common motion-based light-intensity physical activities. However, caution should be taken if isolating sedentary behaviours from standing is of particular interest.
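    The threshold-derivation step described above (logistic regression/ROC analysis on a one-dimensional acceleration metric) can be sketched as follows. The synthetic ENMO values, sample sizes, and the use of Youden's J as the cut-point criterion are illustrative assumptions, not details taken from the study.

```python
import numpy as np

def roc_threshold(values, labels):
    """Pick the cut-point on a 1-D metric (e.g. ENMO or MAD) that best
    separates two activity classes by maximising Youden's J = TPR - FPR.
    labels: 1 = light-intensity activity, 0 = sedentary behaviour."""
    pos = labels == 1
    neg = ~pos
    best_t, best_j = None, -1.0
    for t in np.unique(values):
        pred = values >= t                      # classify as "light activity"
        tpr = np.mean(pred[pos]) if pos.any() else 0.0
        fpr = np.mean(pred[neg]) if neg.any() else 0.0
        j = tpr - fpr
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Synthetic data: sedentary ~ low ENMO (mg), light activity ~ higher ENMO
rng = np.random.default_rng(0)
enmo = np.concatenate([rng.normal(10, 3, 200), rng.normal(40, 8, 200)])
labels = np.concatenate([np.zeros(200, dtype=int), np.ones(200, dtype=int)])
threshold, j = roc_threshold(enmo, labels)
```

With well-separated classes the chosen cut-point falls between the two distributions and J approaches 1; leave-one-out validation, as in the study, would repeat this fit with each participant held out in turn.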

  20. SU-F-R-11: Designing Quality and Safety Informatics Through Implementation of a CT Radiation Dose Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC

    2016-06-15

    Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically-deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistency and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015 totaling 27880, 28502, 30631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%,2.28%], [0.76%,1.8%], [0.94%,1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency.
Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient-specific protocols, balancing image quality trade-offs that dose-reduction strategies promise, and improving the performance and consistency of a clinical operation. Data plots that capture patient demographics and scanner performance demonstrate that value.
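    The percentile-based alert scheme described above (95% and 99% historical intervals as yellow and amber thresholds) can be sketched as follows. The synthetic dose data and function names are illustrative assumptions; in practice thresholds would be computed per protocol, scanner, and patient-size bin.

```python
import numpy as np

def alert_thresholds(dose_values, yellow_q=95, amber_q=99):
    """Parameterised alert levels from historical dose data: doses above
    the 95th percentile trigger a yellow review, above the 99th an amber
    review."""
    yellow = np.percentile(dose_values, yellow_q)
    amber = np.percentile(dose_values, amber_q)
    return yellow, amber

# Synthetic, roughly lognormal CTDIvol-like history for one protocol/scanner
rng = np.random.default_rng(1)
ctdi = rng.lognormal(mean=2.0, sigma=0.3, size=5000)
yellow, amber = alert_thresholds(ctdi)

# Fraction of historical exams that would have been flagged amber (~1%)
outlier_rate = np.mean(ctdi > amber)
```

By construction, roughly 5% of historical exams exceed the yellow level and 1% the amber level, which matches the low amber outlier percentages (about 1-2%) reported for the three hospitals.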

  1. Development of a new approach to cumulative effects assessment: a northern river ecosystem example.

    PubMed

    Dubé, Monique; Johnson, Brian; Dunn, Gary; Culp, Joseph; Cash, Kevin; Munkittrick, Kelly; Wong, Isaac; Hedley, Kathlene; Booty, William; Lam, David; Resler, Oskar; Storey, Alex

    2006-02-01

    If sustainable development of Canadian waters is to be achieved, a realistic and manageable framework is required for assessing cumulative effects. The objective of this paper is to describe an approach for aquatic cumulative effects assessment that was developed under the Northern Rivers Ecosystem Initiative. The approach is based on a review of existing monitoring practices in Canada and of existing thresholds for aquatic ecosystem health assessments. It suggests that a sustainable framework for cumulative effects assessment of Canadian waters is possible, one that would integrate national indicators of aquatic health, integrate national initiatives (e.g., water quality index, environmental effects monitoring), and provide an avenue for long-term monitoring programs to be integrated with the baseline and follow-up monitoring conducted under the environmental assessment process.

  2. Development and validation of a dual sensing scheme to improve accuracy of bradycardia and pause detection in an insertable cardiac monitor.

    PubMed

    Passman, Rod S; Rogers, John D; Sarkar, Shantanu; Reiland, Jerry; Reisfeld, Erin; Koehler, Jodi; Mittal, Suneet

    2017-07-01

    Undersensing of premature ventricular beats and low-amplitude R waves are primary causes for inappropriate bradycardia and pause detections in insertable cardiac monitors (ICMs). The purpose of this study was to develop and validate an enhanced algorithm to reduce inappropriately detected bradycardia and pause episodes. Independent data sets to develop and validate the enhanced algorithm were derived from a database of ICM-detected bradycardia and pause episodes in de-identified patients monitored for unexplained syncope. The original algorithm uses an auto-adjusting sensitivity threshold for R-wave sensing to detect tachycardia and avoid T-wave oversensing. In the enhanced algorithm, a second sensing threshold is used with a long blanking and fixed lower sensitivity threshold, looking for evidence of undersensed signals. Data reported includes percent change in appropriate and inappropriate bradycardia and pause detections as well as changes in episode detection sensitivity and positive predictive value with the enhanced algorithm. The validation data set, from 663 consecutive patients, consisted of 4904 (161 patients) bradycardia and 2582 (133 patients) pause episodes, of which 2976 (61%) and 996 (39%) were appropriately detected bradycardia and pause episodes. The enhanced algorithm reduced inappropriate bradycardia and pause episodes by 95% and 47%, respectively, with 1.7% and 0.6% reduction in appropriate episodes, respectively. The average episode positive predictive value improved by 62% (P < .001) for bradycardia detection and by 26% (P < .001) for pause detection, with an average relative sensitivity of 95% (P < .001) and 99% (P = .5), respectively. The enhanced dual sense algorithm for bradycardia and pause detection in ICMs substantially reduced inappropriate episode detection with a minimal reduction in true episode detection. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Considerations for the Development of Work Zone Mobility Performance Measures and Thresholds for Virginia Freeways.

    DOT National Transportation Integrated Search

    2014-02-01

    "The Federal Highway Administration has been encouraging states to improve their monitoring and tracking of the : mobility impacts of work zones. The use of mobility performance measures will enable agencies to assess better the : contribution of wor...

  4. Optimizing Environmental Monitoring Networks with Direction-Dependent Distance Thresholds.

    ERIC Educational Resources Information Center

    Hudak, Paul F.

    1993-01-01

    In the direction-dependent approach to location modeling developed herein, the distance within which a point of demand can find service from a facility depends on direction of measurement. The utility of the approach is illustrated through an application to groundwater remediation. (Author/MDH)

  5. Toward development of mobile application for hand arthritis screening.

    PubMed

    Akhbardeh, Farhad; Vasefi, Fartash; Tavakolian, Kouhyar; Bradley, David; Fazel-Rezai, Reza

    2015-01-01

    Arthritis is one of the most common health problems affecting people throughout the world. The goal of the work presented in this paper is to provide individuals, who may be developing or have developed arthritis, with a mobile application to assess and monitor the progress of their disease using their smartphone. The image processing algorithm includes a finger border detection algorithm to monitor joint thickness and angular deviation abnormalities, which are common symptoms of arthritis. In this work, we analyzed and compared gradient, thresholding, and Canny algorithms for border detection. The effect of image spatial resolution (down-sampling) is also investigated. The results, calculated based on 36 joint measurements, show that the mean errors for the gradient, thresholding, and Canny methods are 0.20, 2.13, and 2.03 mm, respectively. In addition, the average error for different image resolutions is analyzed and the minimum required resolution is determined for each method. The results confirm that recent smartphone imaging capabilities can provide enough accuracy for hand border detection and finger joint analysis based on the gradient method.
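    The gradient method that performed best in the comparison above can be illustrated in one dimension: the border is taken as the position of maximum gradient magnitude along an intensity profile crossing the finger. The synthetic pixel values are an assumption for illustration; the paper's 2-D implementation is not published here.

```python
import numpy as np

def gradient_edge(profile):
    """Locate a border along a 1-D intensity profile (e.g. a row of pixels
    across a finger edge) as the index of maximum gradient magnitude."""
    grad = np.gradient(profile.astype(float))   # central differences
    return int(np.argmax(np.abs(grad)))

# Synthetic profile: dark background (~20) then brighter finger tissue (~200)
profile = np.array([20, 20, 22, 21, 120, 200, 198, 199])
edge = gradient_edge(profile)
```

Applied row by row over a hand image, this yields the finger border from which joint thickness and angular deviation could then be measured.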

  6. Establishing seasonal and alert influenza thresholds in Cambodia using the WHO method: implications for effective utilization of influenza surveillance in the tropics and subtropics.

    PubMed

    Ly, Sovann; Arashiro, Takeshi; Ieng, Vanra; Tsuyuoka, Reiko; Parry, Amy; Horwood, Paul; Heng, Seng; Hamid, Sarah; Vandemaele, Katelijn; Chin, Savuth; Sar, Borann; Arima, Yuzo

    2017-01-01

    To establish seasonal and alert thresholds and transmission intensity categories for influenza to provide timely triggers for preventive measures or upscaling control measures in Cambodia. Using Cambodia's influenza-like illness (ILI) and laboratory-confirmed influenza surveillance data from 2009 to 2015, three parameters were assessed to monitor influenza activity: the proportion of ILI patients among all outpatients, proportion of ILI samples positive for influenza and the product of the two. With these parameters, four threshold levels (seasonal, moderate, high and alert) were established and transmission intensity was categorized based on a World Health Organization alignment method. Parameters were compared against their respective thresholds. Distinct seasonality was observed using the two parameters that incorporated laboratory data. Thresholds established using the composite parameter, combining syndromic and laboratory data, had the least number of false alarms in declaring season onset and were most useful in monitoring intensity. Unlike in temperate regions, the syndromic parameter was less useful in monitoring influenza activity or for setting thresholds. Influenza thresholds based on appropriate parameters have the potential to provide timely triggers for public health measures in a tropical country where monitoring and assessing influenza activity has been challenging. Based on these findings, the Ministry of Health plans to raise general awareness regarding influenza among the medical community and the general public. Our findings have important implications for countries in the tropics/subtropics and in resource-limited settings, and categorized transmission intensity can be used to assess severity of potential pandemic influenza as well as seasonal influenza.
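    The composite parameter found most useful above is the product of the syndromic and laboratory proportions. A minimal sketch follows; the weekly counts and the four threshold values are illustrative placeholders, since the actual Cambodian thresholds come from the historical 2009-2015 data.

```python
def composite_parameter(ili_visits, all_visits, flu_positive, samples_tested):
    """Composite transmission indicator: (proportion of ILI among all
    outpatients) x (proportion of tested samples positive for influenza)."""
    p_ili = ili_visits / all_visits
    p_pos = flu_positive / samples_tested
    return p_ili * p_pos

def intensity_level(value, seasonal, moderate, high, alert):
    """Map a weekly composite value onto the four threshold levels."""
    if value >= alert:
        return "alert"
    if value >= high:
        return "high"
    if value >= moderate:
        return "moderate"
    if value >= seasonal:
        return "seasonal"
    return "below seasonal"

# One illustrative surveillance week
week = composite_parameter(ili_visits=120, all_visits=2000,
                           flu_positive=18, samples_tested=40)
level = intensity_level(week, seasonal=0.01, moderate=0.03,
                        high=0.06, alert=0.10)
```

Because both proportions must rise together for the product to cross a threshold, the composite parameter produces fewer false season-onset alarms than the syndromic proportion alone, which is the behaviour the study reports.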

  7. Research on critical groundwater level under the threshold value of land subsidence in the typical region of Beijing

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Liu, J.-R.; Luo, Y.; Yang, Y.; Tian, F.; Lei, K.-C.

    2015-11-01

    Groundwater in Beijing has been excessively exploited over a long period, causing the groundwater level to decline continuously and land subsidence areas to expand, which has constrained sustainable economic and social development. Many years of study show a good spatiotemporal correspondence between groundwater level and land subsidence. To provide a scientific basis for subsequent land subsidence prevention and treatment, quantitative research on the relationship between groundwater level and settlement is necessary. Multiple linear regression models were set up using long series of monitoring data on the layered water table and settlement at the Tianzhu monitoring station. The results show that layered settlement is closely related to the water table, water level variation, and amplitude, especially the water table. Finally, according to the threshold values in the land subsidence prevention and control plan of China (45, 30, 25 mm), the minimum allowable layered water level in this region when settlement reaches the threshold value is calculated to be between -18.448 and -10.082 m. The results provide a reasonable and operable groundwater level control target for rational adjustment of the groundwater exploitation horizon in the future.
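    The approach above (regress settlement on water level, then invert for the water level at which settlement reaches a control threshold) can be sketched with a single predictor. The synthetic data, coefficients, and the 30 mm threshold value are illustrative assumptions, not the Tianzhu station's fitted model.

```python
import numpy as np

# Synthetic record: settlement (mm) grows as the water table (m, negative
# below surface) drops, plus measurement noise
rng = np.random.default_rng(2)
water_table = rng.uniform(-25, -5, 60)
settlement = -2.0 * water_table - 10 + rng.normal(0, 1, 60)

# Ordinary least squares: settlement = slope * water_table + intercept
A = np.column_stack([water_table, np.ones_like(water_table)])
(slope, intercept), *_ = np.linalg.lstsq(A, settlement, rcond=None)

# Invert the fit: critical (minimum allowable) water level at which
# settlement reaches the 30 mm control-plan threshold
threshold_mm = 30.0
critical_level = (threshold_mm - intercept) / slope
```

With multiple layered water tables as predictors, as in the paper, the same inversion gives a layer-by-layer minimum allowable water level.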

  8. A prototype gas exchange monitor for exercise stress testing aboard NASA Space Station

    NASA Technical Reports Server (NTRS)

    Orr, Joseph A.; Westenskow, Dwayne R.; Bauer, Anne

    1989-01-01

    This paper describes an easy-to-use monitor developed to track the weightlessness deconditioning aboard the NASA Space Station, together with the results of testing of a prototype instrument. The monitor measures the O2 uptake and CO2 production, and calculates the maximum O2 uptake and anaerobic threshold during an exercise stress test. The system uses two flowmeters in series to achieve a completely automatic calibration, and uses breath-by-breath compensation for sample line-transport delay. The monitor was evaluated using two laboratory methods and was shown to be accurate. The system's block diagram and the bench test setup diagram are included.

  9. Technical Note: An operational landslide early warning system at regional scale based on space-time variable rainfall thresholds

    NASA Astrophysics Data System (ADS)

    Segoni, S.; Battistini, A.; Rossi, G.; Rosi, A.; Lagomarsino, D.; Catani, F.; Moretti, S.; Casagli, N.

    2014-10-01

    We set up an early warning system for rainfall-induced landslides in Tuscany (23 000 km2). The system is based on a set of state-of-the-art intensity-duration rainfall thresholds (Segoni et al., 2014b) and makes use of LAMI rainfall forecasts and real-time rainfall data provided by an automated network of more than 300 rain gauges. The system was implemented in a WebGIS to ease operational use in civil protection procedures: it is simple and intuitive to consult and it provides different outputs. Switching among different views, the system is able to focus both on monitoring of real-time data and on forecasting at different lead times up to 48 h. Moreover, the system can switch between a very straightforward view, where a synoptic scenario of the hazard is shown over the whole region, and a more in-depth view where the rainfall path of each rain gauge can be displayed and constantly compared with rainfall thresholds. To better account for the high spatial variability of the physical features, which affects the relationship between rainfall and landslides, the region is subdivided into 25 alert zones, each provided with a specific threshold. The warning system reflects this subdivision: using a network of 332 rain gauges, it allows each alert zone to be monitored separately, and warnings can be issued independently from one alert zone to another. An important feature of the warning system is the use of thresholds that may vary in time, adapting to the conditions of the rainfall path recorded by the rain gauges. Depending on when the starting time of the rainfall event is set, the comparison with the threshold may produce different outcomes. Therefore, a recursive algorithm was developed to check all possible starting times against the thresholds, highlighting the worst scenario and showing in the WebGIS interface at what time and by how much the rainfall path has exceeded or will exceed the most critical threshold.
Besides forecasting and monitoring the hazard scenario over the whole region with hazard levels differentiated for 25 distinct alert zones, the system can be used to gather, analyze, visualize, explore, interpret and store rainfall data, thus representing a potential support to both decision makers and scientists.
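    The "all possible starting times" check described above can be sketched as an exhaustive scan rather than a recursion: for each candidate onset, average intensity over the elapsed duration is compared with a power-law intensity-duration threshold I = a * D**b. The rainfall series and the a, b parameters are illustrative assumptions, not the published Tuscan thresholds.

```python
def worst_exceedance(hourly_rain, a=8.0, b=-0.5):
    """Scan every possible event starting time and return the worst
    (largest) ratio of observed mean intensity to the threshold intensity,
    together with the corresponding duration in hours."""
    worst_ratio, worst_d = 0.0, None
    cumulative = 0.0
    # Iterate from the most recent hour backwards: duration D grows as the
    # assumed event start moves earlier in time.
    for d, r in enumerate(reversed(hourly_rain), start=1):
        cumulative += r
        intensity = cumulative / d          # mm/h averaged over duration d
        threshold = a * d ** b              # mm/h allowed at duration d
        ratio = intensity / threshold
        if ratio > worst_ratio:
            worst_ratio, worst_d = ratio, d
    return worst_ratio, worst_d

# Last six hours of rainfall (mm), most recent last
rain = [0, 0, 2, 10, 12, 6]
ratio, duration = worst_exceedance(rain)
alarm = ratio >= 1.0                        # threshold exceeded
```

Reporting the ratio rather than a yes/no flag lets the WebGIS show by how much the most critical threshold has been (or will be) exceeded, as the system does.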

  10. Device for monitoring cell voltage

    DOEpatents

    Doepke, Matthias [Garbsen, DE; Eisermann, Henning [Edermissen, DE

    2012-08-21

    A device for monitoring a rechargeable battery having a number of electrically connected cells includes at least one current interruption switch for interrupting current flowing through at least one associated cell and a plurality of monitoring units for detecting cell voltage. Each monitoring unit is associated with a single cell and includes a reference voltage unit for producing a defined reference threshold voltage and a voltage comparison unit for comparing the reference threshold voltage with a partial cell voltage of the associated cell. The reference voltage unit is electrically supplied from the cell voltage of the associated cell. The voltage comparison unit is coupled to the at least one current interruption switch for interrupting the current of at least the current flowing through the associated cell, with a defined minimum difference between the reference threshold voltage and the partial cell voltage.

  11. Low volume flow meter

    DOEpatents

    Meixler, Lewis D.

    1993-01-01

    The low flow monitor provides a means for determining if a fluid flow meets a minimum threshold level of flow. The low flow monitor operates with a minimum of intrusion by the flow detection device into the flow. The electrical portion of the monitor is externally located with respect to the fluid stream which allows for repairs to the monitor without disrupting the flow. The electronics provide for the adjustment of the threshold level to meet the required conditions. The apparatus can be modified to provide an upper limit to the flow monitor by providing for a parallel electronic circuit which provides for a bracketing of the desired flow rate.

  12. Trial of real-time locating and messaging system with Bluetooth low energy.

    PubMed

    Arisaka, Naoya; Mamorita, Noritaka; Isonaka, Risa; Kawakami, Tadashi; Takeuchi, Akihiro

    2016-09-14

    Hospital real-time location systems (RTLS) are increasing efficiency and reducing operational costs, but room access tags are necessary. We developed three iPhone 5 applications for an RTLS and communications using Bluetooth low energy (BLE). The applications were: Peripheral device tags, Central beacons, and a Monitor. A Peripheral communicated with a Central using BLE. The Central communicated with a Monitor using sockets on TCP/IP (Transmission Control Protocol/Internet Protocol) via a WLAN (wireless local area network). To determine a BLE threshold level for the received signal strength indicator (RSSI), relationships between signal strength and distance were measured in our laboratory and on the terrace. The BLE RSSI threshold was set at -70 dB, corresponding to about 10 m. While an individual carrying a Peripheral moved around a concrete building, the Peripheral was captured within a few 10-sec units at about 10 m from a Central. The Central and Monitor showed and saved the approach events, location, and the Peripheral's nickname sequentially in real time. Remote Centrals could also communicate interactively with Peripherals through intermediary Monitors that found the nickname in the event database. The trial applications using BLE on iPhones worked well for patient tracking and messaging in indoor environments.
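    The relationship between RSSI and distance underpinning the -70 dB cut-off is commonly modelled with the log-distance path-loss equation RSSI = TxPower - 10*n*log10(d). The TxPower and path-loss exponent below are generic assumptions, not values measured in the study; indoor attenuation (higher n, walls) is what stretches a given RSSI cut-off out toward the ~10 m range the authors report.

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Estimate distance (m) from a BLE RSSI reading using the
    log-distance path-loss model. tx_power_dbm is the calibrated RSSI at
    1 m and n the path-loss exponent (2.0 = free space); both are
    illustrative assumptions here."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def is_near(rssi_dbm, threshold_dbm=-70):
    """Proximity test as used for capture events: RSSI at or above the
    threshold counts as 'near' the Central."""
    return rssi_dbm >= threshold_dbm

# Free-space estimate for the -70 dB threshold (a few metres; real indoor
# conditions push the corresponding range further out)
d = rssi_to_distance(-70)
```

This is why RSSI thresholds are calibrated empirically per environment, as the authors did in their laboratory and terrace measurements, rather than derived from the model alone.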

  13. Fuel cell flooding detection and correction

    DOEpatents

    DiPierno Bosco, Andrew; Fronk, Matthew Howard

    2000-08-15

    Method and apparatus for monitoring H2-O2 PEM fuel cells to detect and correct flooding. The pressure drop across a given H2 or O2 flow field is monitored and compared to predetermined thresholds of unacceptability. If the pressure drop exceeds a threshold of unacceptability, corrective measures are automatically initiated.

  14. Influenza surveillance in Europe: establishing epidemic thresholds by the Moving Epidemic Method

    PubMed Central

    Vega, Tomás; Lozano, Jose Eugenio; Meerhoff, Tamara; Snacken, René; Mott, Joshua; Ortiz de Lejarazu, Raul; Nunes, Baltazar

    2012-01-01

    Please cite this paper as: Vega et al. (2012) Influenza surveillance in Europe: establishing epidemic thresholds by the moving epidemic method. Influenza and Other Respiratory Viruses 7(4), 546–558. Background  Timely influenza surveillance is important to monitor influenza epidemics. Objectives  (i) To calculate the epidemic threshold for influenza‐like illness (ILI) and acute respiratory infections (ARI) in 19 countries, as well as the thresholds for different levels of intensity. (ii) To evaluate the performance of these thresholds. Methods  The moving epidemic method (MEM) has been developed to determine the baseline influenza activity and an epidemic threshold. False alerts, detection lags and timeliness of the detection of epidemics were calculated. The performance was evaluated using a cross‐validation procedure. Results  The overall sensitivity of the MEM threshold was 71·8% and the specificity was 95·5%. The median of the timeliness was 1 week (range: 0–4·5). Conclusions  The method produced a robust and specific signal to detect influenza epidemics. The good balance between the sensitivity and specificity of the epidemic threshold to detect seasonal epidemics and avoid false alerts has advantages for public health purposes. This method may serve as standard to define the start of the annual influenza epidemic in countries in Europe. PMID:22897919
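    The core of the MEM threshold can be sketched as an upper one-sided confidence limit of the geometric mean of the highest pre-epidemic weekly rates from past seasons. This is only a simplified, MEM-inspired sketch with illustrative rates; the published method includes further steps (per-season delimitation of the epidemic period and selection of the n highest values across seasons) that are omitted here.

```python
import math

def mem_like_threshold(pre_epidemic_peaks, z=1.645):
    """Simplified, MEM-inspired epidemic threshold: upper one-sided 95%
    confidence limit of the geometric mean of the highest pre-epidemic
    weekly rates (computed on the log scale, then back-transformed)."""
    logs = [math.log(x) for x in pre_epidemic_peaks]
    n = len(logs)
    mean = sum(logs) / n
    var = sum((v - mean) ** 2 for v in logs) / (n - 1)
    return math.exp(mean + z * math.sqrt(var / n))

# Illustrative highest pre-epidemic ILI rates from past seasons
peaks = [35, 42, 38, 50, 44, 40, 37, 46, 41, 39]
threshold = mem_like_threshold(peaks)
```

A weekly rate crossing this level signals the start of the epidemic period; the cross-validation in the paper measures how often such a signal is a false alarm and how quickly it detects true epidemics.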

  15. High-resolution audiometry: an automated method for hearing threshold acquisition with quality control.

    PubMed

    Bian, Lin

    2012-01-01

    In clinical practice, hearing thresholds are measured at only five to six frequencies at octave intervals. Thus, the audiometric configuration cannot closely reflect the actual status of the auditory structures. In addition, differential diagnosis requires quantitative comparison of behavioral thresholds with physiological measures, such as otoacoustic emissions (OAEs) that are usually measured in higher resolution. The purpose of this research was to develop a method to improve the frequency resolution of the audiogram. A repeated-measure design was used in the study to evaluate the reliability of the threshold measurements. A total of 16 participants with clinically normal hearing and mild hearing loss were recruited from a population of university students. No intervention was involved in the study. Custom developed system and software were used for threshold acquisition with quality control (QC). With real-ear calibration and monitoring of test signals, the system provided accurate and individualized measure of hearing thresholds that were determined by an analysis based on signal detection theory (SDT). The reliability of the threshold measure was assessed by correlation and differences between the repeated measures. The audiometric configurations were diverse and unique to each individual ear. The accuracy, within-subject reliability, and between-test repeatability are relatively high. With QC, the high-resolution audiograms can be reliably and accurately measured. Hearing thresholds measured as ear canal sound pressures with higher frequency resolution can provide more customized hearing-aid fitting. The test system may be integrated with other physiological measures, such as OAEs, into a comprehensive evaluative tool. American Academy of Audiology.

  16. Water quality real-time monitoring system via biological detection based on video analysis

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Fei, Yuan

    2017-11-01

    With the development of society, water pollution has become the most serious problem in China. Therefore, real-time water quality monitoring is an important part of human activities and water pollution prevention. In this paper, the behavior of zebrafish was monitored by computer vision. First, the moving target was extracted by saliency detection and tracked by fitting an ellipse model. Then the motion parameters were extracted by an optical flow method, and the data were monitored in real time by means of Hinkley warnings and threshold warnings. We achieved classified warnings across a number of dimensions using a comprehensive toxicity index. The experimental results show that the system can achieve more accurate real-time monitoring.
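    The "Hinkley warning" mentioned above is commonly implemented as the Page-Hinkley change-detection test, which flags a sustained upward shift in a monitored motion parameter. This is a generic sketch with illustrative parameters, not the paper's implementation.

```python
def page_hinkley(series, delta=0.1, lam=5.0):
    """Page-Hinkley test for an upward mean shift: accumulate deviations
    of each sample from the running mean (minus a tolerance delta) and
    raise a warning when the accumulated value rises more than lam above
    its historical minimum. Returns the 1-based warning index, or None."""
    mean = 0.0
    cum = 0.0
    min_cum = 0.0
    for t, x in enumerate(series, start=1):
        mean += (x - mean) / t          # running mean
        cum += x - mean - delta
        min_cum = min(min_cum, cum)
        if cum - min_cum > lam:
            return t
    return None

# Healthy zebrafish motion parameter ~1.0, then toxicity shifts it upward
data = [1.0] * 30 + [3.0] * 10
alarm_at = page_hinkley(data)
```

Compared with a fixed threshold on the raw value, the test tolerates brief spikes and reacts only to persistent shifts, which suits noisy behavioural data.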

  17. Large-scale, dynamic transformations in fuel moisture drive wildfire activity across southeastern Australia

    NASA Astrophysics Data System (ADS)

    Nolan, R. H.; Boer, M. M.; Resco de Dios, V.; Caccamo, G.; Bradstock, R. A.

    2016-05-01

    The occurrence of large, high-intensity wildfires requires plant biomass, or fuel, that is sufficiently dry to burn. This poses the question, what is "sufficiently dry"? Until recently, the ability to address this question has been constrained by the spatiotemporal scale of available methods to monitor the moisture contents of both dead and live fuels. Here we take advantage of recent developments in macroscale monitoring of fuel moisture through a combination of remote sensing and climatic modeling. We show there are clear thresholds of fuel moisture content associated with the occurrence of wildfires in forests and woodlands. Furthermore, we show that transformations in fuel moisture conditions across these thresholds can occur rapidly, within a month. Both the approach presented here, and our findings, can be immediately applied and may greatly improve fire risk assessments in forests and woodlands globally.

  18. Summer temperature metrics for predicting brook trout (Salvelinus fontinalis) distribution in streams

    USGS Publications Warehouse

    Parrish, Donna; Butryn, Ryan S.; Rizzo, Donna M.

    2012-01-01

    We developed a methodology to predict brook trout (Salvelinus fontinalis) distribution using summer temperature metrics as predictor variables. Our analysis used long-term fish and hourly water temperature data from the Dog River, Vermont (USA). Commonly used metrics (e.g., mean, maximum, maximum 7-day maximum) tend to smooth the data so information on temperature variation is lost. Therefore, we developed a new set of metrics (called event metrics) to capture temperature variation by describing the frequency, area, duration, and magnitude of events that exceeded a user-defined temperature threshold. We used 16, 18, 20, and 22°C. We built linear discriminant models and tested and compared the event metrics against the commonly used metrics. Correct classification of the observations was 66% with event metrics and 87% with commonly used metrics. However, combined event and commonly used metrics correctly classified 92%. Of the four individual temperature thresholds, it was difficult to assess which threshold had the “best” accuracy. The 16°C threshold had slightly fewer misclassifications; however, the 20°C threshold had the fewest extreme misclassifications. Our method leveraged the volumes of existing long-term data and provided a simple, systematic, and adaptable framework for monitoring changes in fish distribution, specifically in the case of irregular, extreme temperature events.
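    The event metrics described above (frequency, duration, area, and magnitude of threshold exceedances) can be sketched from an hourly temperature series as follows. The exact definitions in the paper may differ; the series and the 20°C threshold here are illustrative.

```python
def event_metrics(hourly_temps, threshold=20.0):
    """Compute event metrics for exceedances of a temperature threshold:
    frequency (number of events), total duration (hours above threshold),
    area (degree-hours above threshold), and magnitude (peak exceedance)."""
    events, current = [], []
    for t in hourly_temps:
        if t > threshold:
            current.append(t)
        elif current:                   # an event just ended
            events.append(current)
            current = []
    if current:                         # event still open at series end
        events.append(current)
    frequency = len(events)
    duration = sum(len(e) for e in events)
    area = sum(sum(x - threshold for x in e) for e in events)
    magnitude = max((max(e) - threshold for e in events), default=0.0)
    return frequency, duration, area, magnitude

# Ten hours of synthetic stream temperatures (degrees C)
temps = [18, 19, 21, 23, 22, 19, 18, 20.5, 21.5, 19]
freq, dur, area, mag = event_metrics(temps, threshold=20.0)
```

Unlike a seasonal mean or maximum, these metrics preserve how often, how long, and how severely the threshold was exceeded, which is the variation the commonly used metrics smooth away.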

  19. Prediction of spatially explicit rainfall intensity–duration thresholds for post-fire debris-flow generation in the western United States

    USGS Publications Warehouse

    Staley, Dennis M.; Negri, Jacquelyn; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2017-01-01

    Early warning of post-fire debris-flow occurrence during intense rainfall has traditionally relied upon a library of regionally specific empirical rainfall intensity–duration thresholds. Development of this library and the calculation of rainfall intensity-duration thresholds often require several years of monitoring local rainfall and hydrologic response to rainstorms, a time-consuming approach where results are often only applicable to the specific region where data were collected. Here, we present a new, fully predictive approach that utilizes rainfall, hydrologic response, and readily available geospatial data to predict rainfall intensity–duration thresholds for debris-flow generation in recently burned locations in the western United States. Unlike the traditional approach to defining regional thresholds from historical data, the proposed methodology permits the direct calculation of rainfall intensity–duration thresholds for areas where no such data exist. The thresholds calculated by this method are demonstrated to provide predictions that are of similar accuracy, and in some cases outperform, previously published regional intensity–duration thresholds. The method also provides improved predictions of debris-flow likelihood, which can be incorporated into existing approaches for post-fire debris-flow hazard assessment. Our results also provide guidance for the operational expansion of post-fire debris-flow early warning systems in areas where empirically defined regional rainfall intensity–duration thresholds do not currently exist.

  20. Prediction of spatially explicit rainfall intensity-duration thresholds for post-fire debris-flow generation in the western United States

    NASA Astrophysics Data System (ADS)

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2017-02-01

    Early warning of post-fire debris-flow occurrence during intense rainfall has traditionally relied upon a library of regionally specific empirical rainfall intensity-duration thresholds. Development of this library and the calculation of rainfall intensity-duration thresholds often require several years of monitoring local rainfall and hydrologic response to rainstorms, a time-consuming approach where results are often only applicable to the specific region where data were collected. Here, we present a new, fully predictive approach that utilizes rainfall, hydrologic response, and readily available geospatial data to predict rainfall intensity-duration thresholds for debris-flow generation in recently burned locations in the western United States. Unlike the traditional approach to defining regional thresholds from historical data, the proposed methodology permits the direct calculation of rainfall intensity-duration thresholds for areas where no such data exist. The thresholds calculated by this method are demonstrated to provide predictions that are of similar accuracy, and in some cases outperform, previously published regional intensity-duration thresholds. The method also provides improved predictions of debris-flow likelihood, which can be incorporated into existing approaches for post-fire debris-flow hazard assessment. Our results also provide guidance for the operational expansion of post-fire debris-flow early warning systems in areas where empirically defined regional rainfall intensity-duration thresholds do not currently exist.

  1. Development of a Fault Monitoring Technique for Wind Turbines Using a Hidden Markov Model.

    PubMed

    Shin, Sung-Hwan; Kim, SangRyul; Seo, Yun-Ho

    2018-06-02

    Regular inspection for the maintenance of wind turbines is difficult because of their remote locations. For this reason, condition monitoring systems (CMSs) are typically installed to monitor their health condition. The purpose of this study is to propose a fault detection algorithm for the mechanical parts of the wind turbine. To this end, long-term vibration data were collected over two years by a CMS installed on a 3 MW wind turbine. The vibration distribution at a specific rotating speed of the main shaft is approximated by the Weibull distribution, and its cumulative distribution function is utilized for determining the threshold levels that indicate impending failure of mechanical parts. A hidden Markov model (HMM) is employed to build a statistical fault detection algorithm in the time domain, and the method for extracting the HMM input sequence, which considers the threshold levels and the correlation between the signals, is also introduced. Finally, it was demonstrated that the proposed HMM algorithm achieved a greater than 95% detection success rate on the long-term signals.
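    Once a Weibull distribution has been fitted to the healthy vibration data, the threshold for a given exceedance probability follows from inverting its CDF, F(x) = 1 - exp(-(x/lam)**k). The shape and scale values below are illustrative, not parameters fitted from the turbine data.

```python
import math

def weibull_threshold(shape_k, scale_lam, p):
    """Vibration level below which a fraction p of healthy observations
    fall, from the inverse Weibull CDF:
        x_p = lam * (-ln(1 - p)) ** (1 / k).
    An alarm fires when monitored vibration exceeds this level."""
    return scale_lam * (-math.log(1.0 - p)) ** (1.0 / shape_k)

# e.g. 99th-percentile alarm level for an assumed fit Weibull(k=2.0, lam=1.5)
t99 = weibull_threshold(2.0, 1.5, 0.99)
```

Discretising the vibration signal against a ladder of such levels (say the 90th, 95th, and 99th percentiles) is one natural way to form the symbol sequence an HMM consumes, in the spirit of the input-sequence extraction the study describes.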

  2. Use of video-based education and tele-health home monitoring after liver transplantation: Results of a novel pilot study.

    PubMed

    Ertel, Audrey E; Kaiser, Tiffany E; Abbott, Daniel E; Shah, Shimul A

    2016-10-01

    In this observational study, we analyzed the feasibility and early results of a perioperative, video-based educational program and tele-health home monitoring model on postoperative care management and readmissions for patients undergoing liver transplantation. Twenty consecutive liver transplantation recipients were provided with tele-health home monitoring and an educational video program during the perioperative period. Vital statistics were tracked and monitored daily with emphasis placed on readings outside of the normal range (threshold violations). Additionally, responses to effectiveness questionnaires were collected retrospectively for analysis. In the study, 19 of the 20 patients responded to the effectiveness questionnaire, with 95% reporting having watched all 10 videos, 68% watching some more than once, and 100% finding them effective in improving their preparedness for understanding their postoperative care. Among these 20 patients, there was an observed 19% threshold violation rate for systolic blood pressure, 6% threshold violation rate for mean blood glucose concentrations, and 8% threshold violation rate for mean weights. This subset of patients had a 90-day readmission rate of 30%. This observational study demonstrates that tele-health home monitoring and video-based educational programs are feasible in liver transplantation recipients and seem to be effective in enhancing the monitoring of vital statistics postoperatively. These data suggest that smart technology is effective in creating a greater awareness and understanding of how to manage postoperative care after liver transplantation.

  3. Development and Interlaboratory Validation of a Simple Screening Method for Genetically Modified Maize Using a ΔΔC(q)-Based Multiplex Real-Time PCR Assay.

    PubMed

    Noguchi, Akio; Nakamura, Kosuke; Sakata, Kozue; Sato-Fukuda, Nozomi; Ishigaki, Takumi; Mano, Junichi; Takabatake, Reona; Kitta, Kazumi; Teshima, Reiko; Kondo, Kazunari; Nishimaki-Mogami, Tomoko

    2016-04-19

    A number of genetically modified (GM) maize events have been developed and approved worldwide for commercial cultivation. A screening method is needed to monitor GM maize approved for commercialization in countries that mandate the labeling of foods containing a specified threshold level of GM crops. In Japan, a screening method has been implemented to monitor approved GM maize since 2001. However, the screening method currently used in Japan is time-consuming and requires generation of a calibration curve and an experimental conversion factor (C(f)) value. We developed a simple screening method that avoids the need for a calibration curve and C(f) value. In this method, ΔC(q) values between the target sequences and the endogenous gene are calculated using multiplex real-time PCR, and the ΔΔC(q) value between the analytical and control samples is used as the criterion for identifying analytical samples in which the GM organism content is below the threshold level for labeling of GM crops. An interlaboratory study indicated that the method is applicable independent of instrument platform, at least for the two models of PCR instruments used in this study.
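The ΔΔCq arithmetic can be sketched in a few lines: compute ΔCq = Cq(target) − Cq(endogenous gene) for the analytical sample and for a control at the labeling threshold level, then take the difference. The Cq values and the sign convention of the decision rule below are illustrative, not the validated protocol:

```python
# Hedged sketch of a ddCq screening criterion. A sample whose GM target
# amplifies later than the threshold-level control (positive ddCq under
# this illustrative convention) has GM content below the labeling level.

def delta_cq(cq_target, cq_endogenous):
    return cq_target - cq_endogenous

def delta_delta_cq(dcq_sample, dcq_control):
    return dcq_sample - dcq_control

dcq_sample = delta_cq(cq_target=31.0, cq_endogenous=22.0)   # analytical sample
dcq_control = delta_cq(cq_target=28.0, cq_endogenous=22.0)  # threshold-level control
ddcq = delta_delta_cq(dcq_sample, dcq_control)
print(ddcq > 0)  # True: sample's GM target amplifies later than the control
```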

  4. AREA RADIATION MONITOR

    DOEpatents

    Manning, F.W.; Groothuis, S.E.; Lykins, J.H.; Papke, D.M.

    1962-06-12

    An improved area radiation dose monitor is designed that continuously compensates for background radiation below a threshold dose rate and gives warning when the dose integral of an above-threshold radiation excursion exceeds a selected value. This is accomplished by providing means for continuously charging an ionization chamber. The chamber provides a first current proportional to the incident radiation dose rate. Means are provided for generating a second current, including means for nulling out the first current with the second current at all values of the first current corresponding to dose rates below a selected threshold dose rate. The second current has a maximum value corresponding to that of the first current at the threshold dose rate. The excess of the first current over the second current, which occurs above the threshold, is integrated, and an alarm is given at a selected integrated value of the excess corresponding to a selected radiation dose. (AEC)

  5. Improved detection of radioactive material using a series of measurements

    NASA Astrophysics Data System (ADS)

    Mann, Jenelle

    The goal of this project is to develop improved algorithms for detecting radioactive sources that have low signal compared to background. The detection of low-signal sources is of interest in national security applications where the source may have weak ionizing radiation emissions, is heavily shielded, or the counting time is short (such as portal monitoring). Traditionally, to distinguish signal from background, the decision threshold (y*) is calculated by taking a long background count and limiting the false-positive (alpha) error to 5%. This method has several problems: the background is constantly changing due to natural environmental fluctuations, and the large amounts of data taken as the detector continuously scans are not utilized. Rather than looking at a single measurement, this work investigates a series of N measurements and develops an appropriate decision threshold for exceeding the decision threshold n times in a series of N. This methodology is investigated for rectangular, triangular, sinusoidal, Poisson, and Gaussian distributions.
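The n-of-N idea can be sketched with a binomial argument: with N repeated counts, the per-measurement exceedance probability can be relaxed while the overall false-alarm rate (at least n of N background counts exceeding y*) stays at 5%. Background is modeled as Poisson; the numbers are illustrative, not the project's results:

```python
# Find the smallest integer count threshold y* such that the probability of
# at least n of N independent background (Poisson) measurements exceeding
# y* is at most 5%. Values are illustrative.
from scipy.stats import binom, poisson

def series_false_alarm(p_single, n, N):
    """P(at least n of N independent background measurements exceed y*)."""
    return binom.sf(n - 1, N, p_single)

mean_bg = 100.0          # long-count background estimate (counts)
n, N = 3, 10             # declare detection if >= 3 of 10 counts exceed y*

y_star = int(mean_bg)
while series_false_alarm(poisson.sf(y_star, mean_bg), n, N) > 0.05:
    y_star += 1
print(f"decision threshold y* = {y_star}")
```

Note that y* here sits well below the single-measurement 95th-percentile threshold scaled for N looks, which is the efficiency gain of using the whole series.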

  6. Pure-tone audiometry outside a sound booth using earphone attentuation, integrated noise monitoring, and automation.

    PubMed

    Swanepoel, De Wet; Matthysen, Cornelia; Eikelboom, Robert H; Clark, Jackie L; Hall, James W

    2015-01-01

    Accessibility of audiometry is hindered by the cost of sound booths and a shortage of hearing health personnel. This study investigated the validity of an automated mobile diagnostic audiometer with increased attenuation and real-time noise monitoring for clinical testing outside a sound booth. Attenuation characteristics and reference ambient noise levels for the computer-based audiometer (KUDUwave) were evaluated alongside the validity of environmental noise monitoring. Clinical validity was determined by comparing air- and bone-conduction thresholds obtained inside and outside the sound booth in 23 normal-hearing subjects (age range, 20-75 years; average age, 35.5 years); a subgroup of 11 subjects was retested to establish test-retest reliability. Improved passive attenuation and valid environmental noise monitoring were demonstrated. Clinically, air-conduction thresholds inside and outside the sound booth corresponded within 5 dB in more than 90% of instances (mean absolute difference, 3.3 dB ± 3.2 SD). Bone-conduction thresholds corresponded within 5 dB in 80% of comparisons between test environments, with a mean absolute difference of 4.6 dB (± 3.7 SD). Threshold differences were not statistically significant. Mean absolute test-retest differences outside the sound booth were similar to those in the booth. Diagnostic pure-tone audiometry outside a sound booth, using automated testing, improved passive attenuation, and real-time environmental noise monitoring, demonstrated reliable hearing assessments.

  7. Anaerobic Threshold: Its Concept and Role in Endurance Sport

    PubMed Central

    Ghosh, Asok Kumar

    2004-01-01

    The aerobic-to-anaerobic transition intensity is one of the most significant physiological variables in endurance sports. Scientists have described it in various ways, such as Lactate Threshold, Ventilatory Anaerobic Threshold, Onset of Blood Lactate Accumulation, Onset of Plasma Lactate Accumulation, Heart Rate Deflection Point, and Maximum Lactate Steady State. All of these play a major role both in monitoring training schedules and in determining sports performance. Individuals endowed with the capacity for high oxygen uptake must complement it with a rigorous training program in order to achieve maximal performance. If they engage in endurance events, they must also develop the ability to sustain a high fractional utilization of their maximal oxygen uptake (%VO2 max) and become physiologically efficient in performing their activity. The anaerobic threshold is more highly correlated with distance-running performance than is maximum aerobic capacity (VO2 max), because sustaining a high fractional utilization of VO2 max for a long time delays metabolic acidosis. Training at or slightly above the anaerobic threshold intensity improves both aerobic capacity and the anaerobic threshold level. The anaerobic threshold can also be determined from the speed-heart rate relationship in the field, without recourse to sophisticated laboratory techniques. However, controversies also exist among scientists regarding its role in high-performance sports. PMID:22977357

  8. Anaerobic threshold: its concept and role in endurance sport.

    PubMed

    Ghosh, Asok Kumar

    2004-01-01

    The aerobic-to-anaerobic transition intensity is one of the most significant physiological variables in endurance sports. Scientists have described it in various ways, such as Lactate Threshold, Ventilatory Anaerobic Threshold, Onset of Blood Lactate Accumulation, Onset of Plasma Lactate Accumulation, Heart Rate Deflection Point, and Maximum Lactate Steady State. All of these play a major role both in monitoring training schedules and in determining sports performance. Individuals endowed with the capacity for high oxygen uptake must complement it with a rigorous training program in order to achieve maximal performance. If they engage in endurance events, they must also develop the ability to sustain a high fractional utilization of their maximal oxygen uptake (%VO2 max) and become physiologically efficient in performing their activity. The anaerobic threshold is more highly correlated with distance-running performance than is maximum aerobic capacity (VO2 max), because sustaining a high fractional utilization of VO2 max for a long time delays metabolic acidosis. Training at or slightly above the anaerobic threshold intensity improves both aerobic capacity and the anaerobic threshold level. The anaerobic threshold can also be determined from the speed-heart rate relationship in the field, without recourse to sophisticated laboratory techniques. However, controversies also exist among scientists regarding its role in high-performance sports.

  9. 47 CFR 15.323 - Specific requirements for devices operating in the 1920-1930 MHz sub-band.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... monitoring threshold must not be more than 30 dB above the thermal noise power for a bandwidth equivalent to... windows with the lowest power level below a monitoring threshold of 50 dB above the thermal noise power... of 20 °C. For equipment that is capable only of operating from a battery, the frequency stability...
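The monitoring thresholds in the rule excerpted above are defined relative to thermal noise power kTB at 20 °C for a bandwidth equivalent to the emission bandwidth. A quick illustrative calculation (the bandwidth value below is an example, not part of the rule):

```python
# Compute thermal noise power kTB in dBm at 20 degC for a given bandwidth,
# then the +30 dB and +50 dB monitoring thresholds referenced in the rule.
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def thermal_noise_dbm(bandwidth_hz, temp_c=20.0):
    """Thermal noise power kTB expressed in dBm."""
    watts = K_BOLTZMANN * (temp_c + 273.15) * bandwidth_hz
    return 10.0 * math.log10(watts / 1e-3)

bw = 1.25e6  # example emission bandwidth, Hz
noise = thermal_noise_dbm(bw)
print(f"kTB = {noise:.1f} dBm")
print(f"+30 dB monitoring threshold = {noise + 30.0:.1f} dBm")
print(f"+50 dB monitoring threshold = {noise + 50.0:.1f} dBm")
```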

  10. From theoretical fixed return period events to real flooding impacts: a new approach to set flooding scenarios, thresholds and alerts

    NASA Astrophysics Data System (ADS)

    Parravicini, Paola; Cislaghi, Matteo; Condemi, Leonardo

    2017-04-01

    ARPA Lombardia is the Environmental Protection Agency of Lombardy, a wide region in the north of Italy. ARPA is in charge of river monitoring for both civil protection and water balance purposes. It cooperates with the Civil Protection Agency of Lombardy (RL-PC) in flood forecasting and early warning. The early warning system is based on rainfall and discharge thresholds: when a threshold exceedance is expected, RL-PC disseminates an alert from yellow to red. Conventional threshold evaluation is based on events at a fixed return period. However, the impacts of events with the same return period may differ along the river course owing to the specific characteristics of the affected areas. A new approach is introduced. It defines different scenarios corresponding to different flood impacts. A discharge threshold is then associated with each scenario, and the return period of the scenario is computed backwards. Flood scenarios are defined in accordance with National Civil Protection guidelines, which describe the expected flood impact and associate a colour with each scenario, from green (no relevant effects) to red (major floods). A range of discharges is associated with each scenario since they cause the same flood impact; the threshold is set as the discharge corresponding to the transition between two scenarios. A wide range of event-based information is used to estimate the thresholds. As a first guess, the thresholds are estimated from hydraulic model outputs and the people or infrastructure flooded according to the simulations. Finally, the model estimates are validated against real-event knowledge: local Civil Protection Emergency Plans usually contain very detailed local impact descriptions at known river levels or discharges, RL-PC collects flooding information reported by the population, newspapers often report flood events on the web, and data from the river monitoring network provide evaluations of actually observed levels and discharges.
The methodology assigns a return period to each scenario. The return period may vary along the river course according to the discharges associated with the scenario. The return-period values highlight the areas characterized by higher risk and can be an important basis for civil protection emergency planning and river monitoring. For example, considering the Lambro River, the red scenario (major flood) shows a return period of 50 years in the northern rural part of the catchment. Where the river crosses the city of Milan, the return period drops to 4 years. It then rises to more than 100 years where the river flows through the agricultural areas in the southern part of the catchment. In addition, the knowledge gained through event-based analysis allows evaluation of the compliance of the monitoring network with early warning requirements and represents the starting point for further development of the network itself.

  11. Improved Controller Design of Grid Friendly™ Appliances for Primary Frequency Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lian, Jianming; Sun, Yannan; Marinovici, Laurentiu D.

    2015-09-01

    The Grid Friendly™ Appliance (GFA) controller, developed at Pacific Northwest National Laboratory, can autonomously switch off appliances upon detecting under-frequency events. In this paper, the impact of the curtailing frequency threshold on the performance of frequency-responsive GFAs is first analyzed carefully. The current method of selecting curtailing frequency thresholds for GFAs is found to be insufficient to guarantee the desired performance, especially when the frequency deviation is shallow. In addition, the power reduction of online GFAs can be so excessive that it impacts the system response negatively. As a remedy to this deficiency in the current controller design, a different way of selecting curtailing frequency thresholds is proposed to ensure the effectiveness of GFAs in frequency protection. Moreover, it is also proposed to introduce a supervisor at each distribution feeder to monitor the curtailing frequency thresholds of online GFAs and take corrective actions if necessary.
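One threshold-selection idea consistent with the abstract is to stagger curtailing thresholds across the GFA population so load sheds gradually with the depth of an under-frequency event, rather than all devices tripping at once on a shallow deviation. The band and device count below are illustrative assumptions, not the paper's design values:

```python
# Stagger curtailing frequency thresholds across a GFA population so the
# curtailed fraction scales with the depth of the under-frequency event.
import numpy as np

n_gfa = 1000
# Uniformly staggered curtailing thresholds between 59.95 Hz and 59.80 Hz.
thresholds = np.linspace(59.95, 59.80, n_gfa)

def fraction_curtailed(frequency_hz):
    """Fraction of GFAs switched off at a given measured frequency."""
    return np.mean(frequency_hz < thresholds)

print(fraction_curtailed(59.99))  # shallow event: no curtailment
print(fraction_curtailed(59.90))  # moderate event: partial curtailment
print(fraction_curtailed(59.70))  # deep event: full curtailment
```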

  12. Multi-thresholds for fault isolation in the presence of uncertainties.

    PubMed

    Touati, Youcef; Mellal, Mohamed Arezki; Benazzouz, Djamel

    2016-05-01

    Monitoring of faults is an important task in mechatronics. It involves the detection and isolation of faults, which are performed using residuals. These residuals are numerical values that define certain intervals called thresholds: a fault is detected if the residuals exceed the thresholds. In addition, each considered fault must activate a unique set of residuals to be isolated. However, in the presence of uncertainties, false decisions can occur due to the low sensitivity of certain residuals to faults. In this paper, an efficient approach for making fault-isolation decisions in the presence of uncertainties is proposed. Based on the bond graph tool, the approach systematically generates the relations between residuals and faults. The generated relations allow estimation of the minimum detectable and isolable fault values, which are used to calculate the isolation thresholds for each residual.
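The isolation requirement in the abstract, that each fault activate a unique set of residuals, is usually captured by a binary fault signature matrix. A minimal sketch, with an illustrative signature matrix rather than one derived from a bond-graph model:

```python
# Residual-based fault isolation: threshold each residual, then match the
# resulting activation pattern against per-fault binary signatures.
# A fault is isolable only if its signature is unique.

# keys: faults; values: residual activation signature (1 = activated)
SIGNATURES = {
    "actuator": (1, 0, 1),
    "sensor":   (0, 1, 1),
    "leakage":  (1, 1, 0),
}

def isolate(residuals, thresholds):
    """Threshold the residuals and match the activation pattern."""
    pattern = tuple(int(abs(r) > t) for r, t in zip(residuals, thresholds))
    matches = [f for f, sig in SIGNATURES.items() if sig == pattern]
    return matches[0] if len(matches) == 1 else None

# Residuals 1 and 3 exceed their thresholds: matches the actuator signature.
print(isolate(residuals=(0.9, 0.1, 1.4), thresholds=(0.5, 0.5, 0.5)))  # actuator
```

The paper's contribution addresses the choice of the `thresholds` values: setting them from minimum detectable and isolable fault magnitudes so that uncertainty does not corrupt the activation pattern.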

  13. Rainfall Threshold Assessment Corresponding to the Maximum Allowable Turbidity for Source Water.

    PubMed

    Fan, Shu-Kai S; Kuan, Wen-Hui; Fan, Chihhao; Chen, Chiu-Yang

    2016-12-01

    This study aims to assess the upstream rainfall thresholds corresponding to the maximum allowable turbidity of source water, using monitoring data and artificial neural network computation. The Taipei Water Source Domain was selected as the study area, and the upstream rainfall records were collected for statistical analysis. Using analysis of variance (ANOVA), the cumulative rainfall records of one-day Ping-lin, two-day Ping-lin, two-day Tong-hou, one-day Guie-shan, and one-day Tai-ping (rainfall in the previous 24 or 48 hours at the named weather stations) were found to be the five most significant parameters for downstream turbidity development. An artificial neural network model was constructed to predict the downstream turbidity in the area investigated. The observed and model-calculated turbidity data were applied to assess the rainfall thresholds in the studied area. By setting preselected turbidity criteria, the upstream rainfall thresholds for these statistically determined rain gauge stations were calculated.
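The threshold-assessment step amounts to inverting a fitted rainfall-to-turbidity model: scan cumulative rainfall upward until predicted turbidity exceeds the allowable maximum. The power-law stand-in model and the 250 NTU criterion below are placeholders for the paper's trained ANN, used only to show the inversion:

```python
# Invert a rainfall->turbidity model to find the rainfall threshold that
# first exceeds a maximum allowable turbidity. Model is a placeholder.

def predicted_turbidity(rain_mm):
    """Placeholder for the trained model (turbidity in NTU)."""
    return 2.0 * rain_mm ** 1.3

def rainfall_threshold(max_turbidity_ntu, step_mm=1.0, max_rain_mm=500.0):
    """Smallest cumulative rainfall whose predicted turbidity exceeds the limit."""
    rain = 0.0
    while rain <= max_rain_mm:
        if predicted_turbidity(rain) > max_turbidity_ntu:
            return rain
        rain += step_mm
    return None  # criterion never exceeded within the scanned range

print(rainfall_threshold(250.0))  # mm of cumulative upstream rainfall
```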

  14. Keeping it simple: Monitoring flood extent in large data-poor wetlands using MODIS SWIR data

    NASA Astrophysics Data System (ADS)

    Wolski, Piotr; Murray-Hudson, Mike; Thito, Kgalalelo; Cassidy, Lin

    2017-05-01

    Characterising inundation conditions for flood-pulsed wetlands is a critical first step towards assessing flood risk and towards understanding the hydrological dynamics that underlie their ecology and functioning. In this paper, we develop a series of inundation maps for the Okavango Delta, Botswana, based on thresholding of the SWIR band (b7) of the MODIS MCD43A4 product. We show that in the Okavango Delta, SWIR is superior to other spectral bands or derived indices, and illustrate an innovative way of defining the spectral threshold used to separate inundated areas from dry land. The threshold is determined dynamically for each scene based on the reflectances of training areas capturing the end-members of the inundation spectrum. The method provides very good accuracy and is suitable for automated processing.
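A minimal version of the dynamic-threshold idea: for each scene, estimate wet and dry end-member reflectances from training areas and place the water/dry-land cut between them. Placing the cut at the midpoint is an assumption for illustration, and the reflectance values are synthetic:

```python
# Scene-specific SWIR thresholding: derive the cut from wet and dry
# end-member training areas, then classify the scene. Water absorbs
# strongly in SWIR, so inundated pixels fall below the threshold.
import numpy as np

def dynamic_threshold(wet_training, dry_training):
    """Scene-specific SWIR threshold between the two end-member means."""
    return 0.5 * (np.mean(wet_training) + np.mean(dry_training))

def map_inundation(swir_band, threshold):
    """True where SWIR reflectance is below the threshold."""
    return swir_band < threshold

wet = np.array([0.04, 0.05, 0.06])    # permanently flooded training area
dry = np.array([0.30, 0.28, 0.33])    # dryland training area
scene = np.array([[0.05, 0.25], [0.12, 0.31]])

t = dynamic_threshold(wet, dry)
print(map_inundation(scene, t))
```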

  15. New Textile Sensors for In Situ Structural Health Monitoring of Textile Reinforced Thermoplastic Composites Based on the Conductive Poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) Polymer Complex

    PubMed Central

    Jerkovic, Ivona; Koncar, Vladan; Grancaric, Ana Marija

    2017-01-01

    Many metallic structural and non-structural parts used in the transportation industry can be replaced by textile-reinforced composites. Composites made from a polymeric matrix and fibrous reinforcement have been increasingly studied during the last decade. On the other hand, the fast development of smart textile structures seems to be a very promising solution for in situ structural health monitoring of composite parts. In order to optimize composites’ quality and their lifetime, all the production steps have to be monitored in real time. Textile sensors embedded in the composite reinforcement and having the same mechanical properties as the yarns used to make the reinforcement exhibit actuating and sensing capabilities. This paper presents a new generation of textile fibrous sensors based on the conductive polymer complex poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) developed by an original roll-to-roll coating method. The conductive coating for yarn treatment was defined according to a preliminary study of the percolation threshold of this polymer complex. The percolation threshold was determined from the electrical properties of conductive dry films, in order to develop highly sensitive sensors. Novel laboratory equipment was designed and produced for yarn coating to ensure effective and evenly distributed coating of the electroconductive polymer without distortion of textile properties. The electromechanical properties of the textile fibrous sensors confirmed their suitability for real-time in situ detection of structural damage in textile-reinforced thermoplastic composites. PMID:28994733

  16. Development of a real-time bridge structural monitoring and warning system: a case study in Thailand

    NASA Astrophysics Data System (ADS)

    Khemapech, I.; Sansrimahachai, W.; Toachoodee, M.

    2017-04-01

    Engineering structures are a physical foundation of societal and civil development and are required to support the growth of a nation. They also affect the quality of life and safety of the public. In addition to dead load (self-weight) and live load, structural members are significantly affected by disasters and the environment. Proper inspection and detection are thus crucial both during regular operation and during unsafe events. An Enhanced Structural Health Monitoring System Using Stream Processing and Artificial Neural Network Techniques (SPANNeT) has been developed and is described in this paper. SPANNeT applies a wireless sensor network, real-time data stream processing, and an artificial neural network based upon the measured bending strains. Major contributions include effective, accurate, and energy-aware data communication and damage detection for engineering structures. Strain thresholds have been defined according to computer simulation results and the AASHTO (American Association of State Highway and Transportation Officials) LRFD (Load and Resistance Factor Design) Bridge Design Specifications for launching several warning levels. SPANNeT has been tested and evaluated at both computer-simulation and on-site levels. According to the measurements, the observed maximum values are 25 to 30 microstrains during normal operation. The given protocol provided at least 90% data communication reliability. SPANNeT is capable of efficient real-time data reporting, monitoring, and warning conforming to the predefined thresholds, which can be adjusted according to users' requirements and structural engineering characteristics.
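The multi-level warning logic described above can be sketched as a mapping from measured bending strain to the most severe threshold exceeded. The microstrain values and level names below are illustrative assumptions, not the SPANNeT/AASHTO design values (the abstract only reports 25-30 microstrains in normal operation):

```python
# Map a measured bending strain (microstrain) to a warning level against
# predefined thresholds, checked from most severe down. Values are
# illustrative placeholders.

WARNING_LEVELS = [
    (100.0, "red"),
    (60.0, "orange"),
    (35.0, "yellow"),
]

def warning_level(microstrain):
    """Return the most severe warning level whose threshold is exceeded."""
    for threshold, level in WARNING_LEVELS:
        if microstrain > threshold:
            return level
    return "normal"

print(warning_level(28.0))   # within the reported normal operating range
print(warning_level(72.0))   # exceeds the second threshold
```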

  17. New Textile Sensors for In Situ Structural Health Monitoring of Textile Reinforced Thermoplastic Composites Based on the Conductive Poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) Polymer Complex.

    PubMed

    Jerkovic, Ivona; Koncar, Vladan; Grancaric, Ana Marija

    2017-10-10

    Many metallic structural and non-structural parts used in the transportation industry can be replaced by textile-reinforced composites. Composites made from a polymeric matrix and fibrous reinforcement have been increasingly studied during the last decade. On the other hand, the fast development of smart textile structures seems to be a very promising solution for in situ structural health monitoring of composite parts. In order to optimize composites' quality and their lifetime, all the production steps have to be monitored in real time. Textile sensors embedded in the composite reinforcement and having the same mechanical properties as the yarns used to make the reinforcement exhibit actuating and sensing capabilities. This paper presents a new generation of textile fibrous sensors based on the conductive polymer complex poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) developed by an original roll-to-roll coating method. The conductive coating for yarn treatment was defined according to a preliminary study of the percolation threshold of this polymer complex. The percolation threshold was determined from the electrical properties of conductive dry films, in order to develop highly sensitive sensors. Novel laboratory equipment was designed and produced for yarn coating to ensure effective and evenly distributed coating of the electroconductive polymer without distortion of textile properties. The electromechanical properties of the textile fibrous sensors confirmed their suitability for real-time in situ detection of structural damage in textile-reinforced thermoplastic composites.

  18. SU-E-T-110: An Investigation On Monitor Unit Threshold and Effects On IMPT Delivery in Proton Pencil Beam Planning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syh, J; Ding, X; Syh, J

    2015-06-15

    Purpose: An approved proton pencil beam scanning (PBS) treatment plan might not be deliverable because of extremely low monitor units (MU) per beam spot. A hybrid plan combining the efficiency of higher spot MU with the efficacy of fewer energy layers was searched for and optimized. The range of MU threshold settings was investigated, and plan quality was evaluated by target dose conformity. Methods: Certain limitations and requirements need to be checked and tested before a nominal proton PBS treatment plan can be delivered. The plan needs to meet the machine characterization and the specifications of the record-and-verify system to deliver the beams. A minimal threshold of monitor units per spot, e.g., 0.02, was set to filter out the low counts, and the plan was recomputed. Further MU threshold increments were tested in sequence without sacrificing plan quality. The number of energy layers was also altered owing to the elimination of low-count layer(s). Results: The minimal MU/spot threshold, the spot spacing in each energy layer, the total number of energy layers, and the MU weighting of the beam spots of each beam were evaluated. Plan optimization traded off increases in spot MU (efficiency) against fewer energy layers to deliver (efficacy). A 5% weighting limit of total monitor units per beam was feasible. Sparse spreading of beam spots was acceptable as long as target dose conformity remained within the 3% criterion. Conclusion: Each spot size is equivalent to the relative dose in the beam delivery system. The energy layer is associated with the depth of the targeted tumor. This work is crucial to maintaining the best possible plan quality. Keeping the integrity of all intrinsic elements, such as spot size, spot number, layer number, and the weighting of spots in each layer, is important in this study.
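The spot-filtering step can be sketched as follows: drop scanning spots whose MU fall below the deliverable minimum, then rescale the survivors so the field's total MU is preserved. The 0.02 MU floor comes from the abstract; the spot list and the renormalization strategy are illustrative assumptions (a real replanning step would reoptimize rather than simply rescale):

```python
# Remove undeliverable low-MU spots and rescale the remaining spots so the
# field's total MU is preserved. Spot values are illustrative.

def filter_spots(spot_mu, min_mu=0.02):
    """Drop spots below the MU floor; rescale survivors to the same total MU."""
    total = sum(spot_mu)
    kept = [mu for mu in spot_mu if mu >= min_mu]
    scale = total / sum(kept)
    return [mu * scale for mu in kept]

spots = [0.005, 0.40, 0.015, 0.30, 0.28]
filtered = filter_spots(spots)
print(len(filtered), round(sum(filtered), 3))  # fewer spots, same total MU
```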

  19. Measuring financial protection against catastrophic health expenditures: methodological challenges for global monitoring.

    PubMed

    Hsu, Justine; Flores, Gabriela; Evans, David; Mills, Anne; Hanson, Kara

    2018-05-31

    Monitoring financial protection against catastrophic health expenditures is important to understand how health financing arrangements in a country protect its population against high costs associated with accessing health services. While catastrophic health expenditures are generally defined to be when household expenditures for health exceed a given threshold of household resources, there is no gold standard with several methods applied to define the threshold and household resources. These different approaches to constructing the indicator might give different pictures of a country's progress towards financial protection. In order for monitoring to effectively provide policy insight, it is critical to understand the sensitivity of measurement to these choices. This paper examines the impact of varying two methodological choices by analysing household expenditure data from a sample of 47 countries. We assess sensitivity of cross-country comparisons to a range of thresholds by testing for restricted dominance. We further assess sensitivity of comparisons to different methods for defining household resources (i.e. total expenditure, non-food expenditure and non-subsistence expenditure) by conducting correlation tests of country rankings. We found country rankings are robust to the choice of threshold in a tenth to a quarter of comparisons within the 5-85% threshold range and this increases to half of comparisons if the threshold is restricted to 5-40%, following those commonly used in the literature. Furthermore, correlations of country rankings using different methods to define household resources were moderate to high; thus, this choice makes less difference from a measurement perspective than from an ethical perspective as different definitions of available household resources reflect varying concerns for equity. 
Interpreting comparisons from global monitoring based on a single threshold should be done with caution as these may not provide reliable insight into relative country progress. We therefore recommend financial protection against catastrophic health expenditures be measured across a range of thresholds using a catastrophic incidence curve as shown in this paper. We further recommend evaluating financial protection in relation to a country's health financing system arrangements in order to better understand the extent of protection and better inform future policy changes.
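The recommended catastrophic incidence curve reduces to a simple computation: the share of households whose health spending exceeds each threshold share of household resources, evaluated over a range of thresholds rather than a single cut-off. A sketch with synthetic household data:

```python
# Catastrophic incidence across a range of thresholds: for each threshold
# (health spending as a share of household resources), the percentage of
# households exceeding it. Household shares below are synthetic.

def catastrophic_incidence(health_shares, thresholds):
    """% of households catastrophic at each threshold share of resources."""
    n = len(health_shares)
    return {t: 100.0 * sum(s > t for s in health_shares) / n for t in thresholds}

# health expenditure as a share of total household expenditure
shares = [0.02, 0.08, 0.12, 0.31, 0.45, 0.06, 0.19, 0.03]
curve = catastrophic_incidence(shares, thresholds=[0.10, 0.25, 0.40])
print(curve)  # incidence falls monotonically as the threshold rises
```

Comparing two countries' full curves, rather than their incidence at one threshold, is what makes the dominance tests in the paper possible.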

  20. Technical Challenges for a Comprehensive Test Ban: A historical perspective to frame the future (Invited)

    NASA Astrophysics Data System (ADS)

    Wallace, T. C.

    2013-12-01

    In the summer of 1958, scientists from the Soviet bloc and the US allies met in Geneva to discuss what it would take to monitor a forerunner of a Comprehensive Test Ban Treaty, at the 'Conference of Experts to Study the Possibility of Detecting Violations of a Possible Agreement on Suspension of Nuclear Tests'. Although armed with a limited resume of observations, the conference recommended a multi-phenomenology approach (air sampling, acoustics, seismic, and electromagnetic) deployed in a network of 170 sites scattered across the Northern Hemisphere, and hypothesized a detection threshold of 1 kt for atmospheric tests and 5 kt for underground explosions. The conference recommendations spurred vigorous debate, with strong disagreement over the stated detection hypothesis. Nevertheless, the technical challenges posed led to a very focused effort to improve facilities and methodologies and, most importantly, research and development on event detection, location, and identification. In the ensuing 50 years the various challenges arose and were eventually 'solved'; these included quantifying yield determination to enter a Limited Threshold Test Ban, monitoring broad areas of emerging nuclear nations, and, after the mid-1990s, lowering the global detection threshold to sub-kiloton levels for underground tests. Today there is both an international monitoring regime (i.e., the International Monitoring System, or IMS) and a group of countries that have their own national technical means (NTM). The challenges for the international regime are evolving; the IMS has established itself as a very credible monitoring system, but the demand of a CTBT to detect and identify a 'nuclear test' of diminished size (zero yield) poses new technical hurdles. These include signal processing and understanding limits of resolution, location accuracy, integration of heterogeneous data, and accurate characterization of anomalous events.
It is possible to extrapolate past technical advances to predict what should be available by 2020: detection of coupled explosions down to hundreds of tons for all continental areas, as well as a probabilistic assessment of event identification.

  1. SU-E-J-72: Design and Study of In-House Web-Camera Based Automatic Continuous Patient Movement Monitoring and Controlling Device for EXRT.

    PubMed

    Senthil Kumar, S; Suresh Babu, S S; Anand, P; Dheva Shantha Kumari, G

    2012-06-01

    The purpose of our study was to fabricate an in-house web-camera-based device for automatic, continuous patient-movement monitoring and to control the movement of patients during EXRT. The web-camera-based patient movement monitoring device consists of a computer, digital web-camera, mounting system, breaker circuit, speaker, and visual indicator. The computer is used to control and analyze the patient movement using indigenously developed software. The speaker and the visual indicator are placed in the console room to indicate positional displacement of the patient. Studies were conducted on a phantom and on 150 patients with different types of cancers. Our preliminary clinical results indicate that the device is highly reliable and can accurately report small movements of the patients in all directions. The results demonstrated that the device was able to detect patient movements with a sensitivity of about 1 mm. When a patient moves, the receiver activates the circuit and an audible warning sound is produced in the console. Through real-time measurements, an audible alarm can alert the radiation technologist to stop the treatment if the user-defined positional threshold is violated. Simultaneously, the electrical circuit to the teletherapy machine is activated and radiation is halted. Patient movement during the course of radiotherapy was studied. The beam is halted automatically when the threshold level of the system is exceeded. By using the threshold provided in the system, it is possible to monitor the patient continuously within certain fixed limits. An additional benefit is that it has reduced the tension and stress of a treatment team associated with treating patients who are not immobilized. It also enables the technologists to do their work more efficiently, because they do not have to monitor patients continuously with as much scrutiny as was previously required. © 2012 American Association of Physicists in Medicine.
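
    The interlock logic described above (halt the beam and sound the console alarm once a user-defined positional threshold is violated) reduces to a simple per-sample check. The sketch below is illustrative only; the threshold value and names are assumptions, not the authors' software.

```python
def monitor(displacements_mm, threshold_mm=1.0):
    """Return the index of the first sample that violates the positional
    threshold (beam halted, console alarm raised), or None if the patient
    stayed within limits for the whole beam-on period.

    threshold_mm is an assumed user-defined limit, chosen here to match
    the ~1 mm sensitivity reported in the abstract.
    """
    for i, d in enumerate(displacements_mm):
        if abs(d) > threshold_mm:
            return i  # breaker circuit opens; audible/visual alarm fires
    return None  # no violation; beam uninterrupted
```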

  2. Misclassification of OSA severity with automated scoring of home sleep recordings.

    PubMed

    Aurora, R Nisha; Swartz, Rachel; Punjabi, Naresh M

    2015-03-01

    The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. Automated scoring consistently underscored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov.
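
    The Bland-Altman analysis used above can be reproduced in a few lines. This pure-Python sketch uses made-up manual/automated AHI pairs (not the study's data) and reports the mean bias, a 95% CI for the bias, and the 95% limits of agreement.

```python
import math

def bland_altman(manual, automated):
    """Bland-Altman summary of paired scores: returns (bias, ci, loa) where
    bias = mean(manual - automated), ci = 95% CI for the bias, and
    loa = 95% limits of agreement (bias +/- 1.96 SD of the differences)."""
    diffs = [m - a for m, a in zip(manual, automated)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    ci = (bias - 1.96 * sd / math.sqrt(n), bias + 1.96 * sd / math.sqrt(n))
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    return bias, ci, loa

# Illustrative AHI pairs only; a positive bias means automated scoring
# under-scores events relative to manual scoring, as reported above.
bias, ci, loa = bland_altman([12.0, 25.0, 8.0, 40.0, 16.0],
                             [7.0, 19.0, 5.0, 33.0, 11.0])
```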

  3. Misclassification of OSA Severity With Automated Scoring of Home Sleep Recordings

    PubMed Central

    Aurora, R. Nisha; Swartz, Rachel

    2015-01-01

    BACKGROUND: The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. METHODS: Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. RESULTS: Automated scoring consistently underscored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. CONCLUSIONS: Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. 
TRIAL REGISTRY: ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov PMID:25411804

  4. Long-range fluctuations and multifractality in connectivity density time series of a wind speed monitoring network

    NASA Astrophysics Data System (ADS)

    Laib, Mohamed; Telesca, Luciano; Kanevski, Mikhail

    2018-03-01

    This paper studies the daily connectivity time series of a wind speed-monitoring network using multifractal detrended fluctuation analysis. It investigates the long-range fluctuation and multifractality in the residuals of the connectivity time series. Our findings reveal that the daily connectivity of the correlation-based network is persistent for any correlation threshold. Further, the multifractality degree is higher for larger absolute values of the correlation threshold.
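
    The correlation-based network construction behind the connectivity series can be sketched minimally: connect two stations whenever the absolute Pearson correlation of their series meets the threshold, and report the fraction of connected pairs (the connectivity density). Data and names here are illustrative, not the paper's wind-speed records.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def connectivity_density(series_by_station, threshold):
    """Fraction of station pairs whose |Pearson r| meets the threshold."""
    ids = list(series_by_station)
    pairs = [(i, j) for i in range(len(ids)) for j in range(i + 1, len(ids))]
    connected = sum(
        1 for i, j in pairs
        if abs(pearson(series_by_station[ids[i]],
                       series_by_station[ids[j]])) >= threshold
    )
    return connected / len(pairs)
```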

  5. Near-infrared spectroscopic monitoring during cardiopulmonary exercise testing detects anaerobic threshold.

    PubMed

    Rao, Rohit P; Danduran, Michael J; Loomba, Rohit S; Dixon, Jennifer E; Hoffman, George M

    2012-06-01

    Cardiopulmonary exercise testing (CPET) provides assessment of the integrative responses involving the pulmonary, cardiovascular, and skeletal muscle systems. Application of exercise testing remains limited to children who are able to understand and cooperate with the exercise protocol. Near-infrared spectroscopy (NIRS) provides a noninvasive, continuous method to monitor regional tissue oxygenation (rSO2). Our specific aim was to predict the anaerobic threshold (AT) during CPET noninvasively using two-site NIRS monitoring. Achievement of a practical noninvasive technology for estimating AT would broaden the applicability of CPET. Patients without structural or acquired heart disease were eligible for inclusion if they were ordered to undergo CPET by a cardiologist. Data from 51 subjects were analyzed. The ventilatory anaerobic threshold (VAT) was computed on [Formula: see text] and the respiratory quotient post hoc using the standard V-slope method. The inflection points of the regional rSO2 time series were identified as the noninvasive regional NIRS AT for each of the two monitored regions (cerebral and kidney). The AT calculated from the average of kidney and brain NIRS matched the VAT calculated for the same patient. Two-site NIRS monitoring of visceral organs is a predictor of AT.
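
    Both the V-slope method and the rSO2 inflection analysis above amount to locating the point where the slope of a trend changes. A simple stand-in (not the authors' method) is a two-segment least-squares fit that scans candidate breakpoints and keeps the one minimizing total squared error.

```python
def two_segment_breakpoint(x, y):
    """Return the index k that minimizes the total SSE of two straight-line
    fits (points 0..k and k..n-1): a simplified stand-in for breakpoint
    detection of the V-slope type."""
    def sse(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((a - mx) ** 2 for a in xs)
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        slope = sxy / sxx if sxx else 0.0
        return sum((b - (my + slope * (a - mx))) ** 2 for a, b in zip(xs, ys))

    best_k, best = None, float("inf")
    for k in range(2, len(x) - 2):           # keep >=3 points per segment
        total = sse(x[:k + 1], y[:k + 1]) + sse(x[k:], y[k:])
        if total < best:
            best_k, best = k, total
    return best_k
```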

  6. The Nurse Watch: Design and Evaluation of a Smart Watch Application with Vital Sign Monitoring and Checklist Reminders

    PubMed Central

    Bang, Magnus; Solnevik, Katarina; Eriksson, Henrik

    2015-01-01

    Computerized wearable devices such as smart watches will become valuable nursing tools. This paper describes a smart-watch system developed in close collaboration with a team of nurses working in a Swedish ICU. The smart-watch system provides real-time vital-sign monitoring, threshold alarms, and to-do reminders. Additionally, a Kanban board visualized on a multitouch screen provides an overview of completed and upcoming tasks. We describe an approach to implementing automated checklist systems with smart watches and discuss aspects of importance when implementing such memory and attention support. The paper concludes with a formative evaluation of the in-development system. PMID:26958162

  7. The Nurse Watch: Design and Evaluation of a Smart Watch Application with Vital Sign Monitoring and Checklist Reminders.

    PubMed

    Bang, Magnus; Solnevik, Katarina; Eriksson, Henrik

    Computerized wearable devices such as smart watches will become valuable nursing tools. This paper describes a smart-watch system developed in close collaboration with a team of nurses working in a Swedish ICU. The smart-watch system provides real-time vital-sign monitoring, threshold alarms, and to-do reminders. Additionally, a Kanban board visualized on a multitouch screen provides an overview of completed and upcoming tasks. We describe an approach to implementing automated checklist systems with smart watches and discuss aspects of importance when implementing such memory and attention support. The paper concludes with a formative evaluation of the in-development system.

  8. Application of outlier analysis for baseline-free damage diagnosis

    NASA Astrophysics Data System (ADS)

    Kim, Seung Dae; In, Chi Won; Cronin, Kelly E.; Sohn, Hoon; Harries, Kent

    2006-03-01

    As carbon fiber-reinforced polymer (CFRP) laminates have been widely accepted as valuable materials for retrofitting civil infrastructure systems, an appropriate assessment of bonding conditions between host structures and CFRP laminates becomes a critical issue in guaranteeing the performance of CFRP-strengthened structures. This study attempts to develop a continuous performance monitoring system for CFRP-strengthened structures by autonomously inspecting the bonding conditions between the CFRP layers and the host structure. The uniqueness of this study lies in developing a new concept and theoretical framework of nondestructive testing (NDT) in which debonding is detected "without using past baseline data." The proposed baseline-free damage diagnosis is achieved in two stages. In the first step, features sensitive to debonding of the CFRP layers but insensitive to loading conditions are extracted based on a concept referred to as a time reversal process. This time reversal process allows the extraction of damage-sensitive features without direct comparison with past baseline data. A statistical damage classifier is then developed in the second step to make a decision regarding the bonding condition of the CFRP layers. The threshold necessary for decision making is determined adaptively, without predetermined threshold values. Monotonic and fatigue load tests of full-scale CFRP-strengthened RC beams are conducted to demonstrate the potential of the proposed reference-free debonding monitoring system.

  9. A COMPACTRIO-BASED BEAM LOSS MONITOR FOR THE SNS RF TEST CAVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blokland, Willem; Armstrong, Gary A

    2009-01-01

    An RF Test Cave has been built at the Spallation Neutron Source (SNS) to allow testing of RF cavities without interfering with SNS accelerator operations. In addition to using thick concrete walls to minimize radiation exposure, a Beam Loss Monitor (BLM) must abort the operation within 100 µs when the integrated radiation within the cave exceeds a threshold. We chose the CompactRIO platform to implement the BLM based on its performance, cost-effectiveness, and rapid development. Each input/output module is connected through an FPGA to provide point-by-point processing. Every 10 µs the data are acquired, analyzed, and compared to the threshold. Data from the FPGA are transferred using DMA to the real-time controller, which communicates with a gateway PC to talk to the SNS control system. The system includes diagnostics to test the hardware and integrates the losses in real time. In this paper we describe our design, implementation, and results.

  10. Erosive Burning Study Utilizing Ultrasonic Measurement Techniques

    NASA Technical Reports Server (NTRS)

    Furfaro, James A.

    2003-01-01

    A 6-segment subscale motor was developed to generate a range of internal environments in which multiple propellants could be characterized for erosive burning. The motor test bed was designed to provide a high-Mach-number, high-mass-flux environment. Propellant regression rates were monitored for each segment utilizing ultrasonic measurement techniques. These data were obtained for three propellants, RSRM, ETM-03, and Castor® IVA, which span two propellant types, PBAN (polybutadiene acrylonitrile) and HTPB (hydroxyl-terminated polybutadiene). The characterization of these propellants indicates a remarkably similar erosive burning response to the induced flow environment. Propellant burn rates for each type had a conventional response with respect to pressure up to a bulk flow velocity threshold. Each propellant, however, had a unique threshold at which it would experience an increase in observed propellant burn rate. Above the observed threshold each propellant again demonstrated a similar enhanced burn-rate response corresponding to the local flow environment.

  11. NetMOD Version 2.0 Mathematical Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J.; Young, Christopher J.; Chael, Eric P.

    2015-08-01

    NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic, hydroacoustic, and infrasonic networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probabilities of signal detection at each station and event detection across the network of stations can be computed given a detection threshold. The purpose of this document is to clearly and comprehensively present the mathematical framework used by NetMOD, the software package developed by Sandia National Laboratories to assess the monitoring capability of ground-based sensor networks. Many of the NetMOD equations used for simulations are inherited from the NetSim network capability assessment package developed in the late 1980s by SAIC (Sereno et al., 1990).
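
    The SNR-to-probability step described above can be illustrated generically: treat each station's log-domain SNR as Gaussian about its predicted value, convert the margin over the detection threshold into a per-station detection probability, then combine stations with a k-of-n rule. This is a textbook-style sketch under stated assumptions (Gaussian uncertainty, independent stations), not NetMOD's actual equations.

```python
import math
from itertools import combinations

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def station_detection_prob(snr_db, threshold_db, sigma_db):
    """P(observed SNR exceeds the detection threshold), assuming Gaussian
    uncertainty of the log-amplitude with standard deviation sigma_db."""
    return norm_cdf((snr_db - threshold_db) / sigma_db)

def network_detection_prob(p, k):
    """P(at least k of the stations detect), stations assumed independent.
    p is the list of per-station detection probabilities."""
    n = len(p)
    total = 0.0
    for m in range(k, n + 1):
        for idx in combinations(range(n), m):
            prob = 1.0
            for i in range(n):
                prob *= p[i] if i in idx else (1.0 - p[i])
            total += prob
    return total
```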

  12. Applying threshold concepts to conservation management of dryland ecosystems: Case studies on the Colorado Plateau

    USGS Publications Warehouse

    Bowker, Matthew A.; Miller, Mark E.; Garman, Steven L.; Belote, Travis; Guntenspergen, Glenn R.

    2014-01-01

    Ecosystems may occupy functionally distinct alternative states, some of which are more or less desirable from a management standpoint. Transitions from state to state are usually associated with a particular trigger or sequence of triggers, such as the addition or subtraction of a disturbance. Transitions are often not linear; rather, it is common to see an abrupt transition come about even though the trigger increases only incrementally. These are examples of threshold behaviors. An ideal monitoring program, such as the National Park Service's Inventory and Monitoring Program, would quantify triggers and be able to inform managers when measurements of a trigger are approaching a threshold so that management action can avoid an unwanted state transition. Unfortunately, both triggers and the threshold points at which state transitions occur are generally only partially known. Using case studies, we advance a general procedure to help identify triggers and estimate where threshold dynamics may occur. Our procedure is as follows: (1) Operationally define the ecosystem type being considered; we suggest that the ecological site concept of the Natural Resources Conservation Service is a useful system. (2) Use all available a priori knowledge to develop a state-and-transition model (STM), which defines possible ecosystem states, plausible transitions among them, and likely triggers. (3) Validate the STM by verifying the existence of its states to the greatest degree possible. (4) Use the STM to identify transitions and triggers likely to be detectable by a monitoring program, and estimate to the greatest degree possible the value of a measurable indicator of a trigger at the point that a state transition is imminent (tipping point), and values that may indicate when management intervention should be considered (assessment points).
We illustrate two different methods for attaining these goals using a data-rich case study in Canyonlands National Park, and a data-poor case study in Wupatki National Monument. In the data-rich case, STMs are validated and revised, and tipping and assessment points are estimated using statistical analysis of data. In the data-poor case, we develop an iterative expert opinion survey approach to validate the degree of confidence in an STM, revise the model, identify lack of confidence in specific model components, and create reasonable first approximations of tipping and assessment points, which can later be refined when more data are available. Our goal should be to develop the best set of models possible given the level of information available to support decisions, which is often not much. The approach presented here offers a flexible means of achieving this goal, and determining specific research areas in need of study.

  13. Towards developing drought impact functions to advance drought monitoring and early warning

    NASA Astrophysics Data System (ADS)

    Bachmair, Sophie; Stahl, Kerstin; Hannaford, Jamie; Svoboda, Mark

    2015-04-01

    In natural hazard analysis, damage functions (also referred to as vulnerability or susceptibility functions) relate hazard intensity to the negative effects of the hazard event, often expressed as damage ratio or monetary loss. While damage functions for floods and seismic hazards have gained considerable attention, there is little knowledge of how drought intensity translates into ecological and socioeconomic impacts. One reason for this is the multifaceted nature of drought, affecting different domains of the hydrological cycle and different sectors of human activity (for example, recognizing meteorological - agricultural - hydrological - socioeconomic drought) and leading to a wide range of drought impacts. Moreover, drought impacts are often non-structural and hard to quantify or monetize (e.g. impaired navigability of streams, bans on domestic water use, increased mortality of aquatic species). Knowledge of the relationship between drought intensity and drought impacts, i.e. negative environmental, economic or social effects experienced under drought conditions, however, is vital to identify critical thresholds for drought impact occurrence. Such information may help to improve drought monitoring and early warning (M&EW), one goal of the international DrIVER project (Drought Impacts: Vulnerability thresholds in monitoring and Early-warning Research). The aim of this study is to test the feasibility of designing "drought impact functions" for case study areas in Europe (Germany and UK) and the United States to derive thresholds meaningful for drought impact occurrence; to account for the multidimensionality of drought impacts, we use the broader term "drought impact function" over "damage function". First steps towards developing empirical drought impact functions are (1) to identify meaningful indicators characterizing the hazard intensity (e.g. 
indicators expressing a precipitation or streamflow deficit), (2) to identify suitable variables representing impacts, damage, or loss due to drought, and (3) to test different statistical models to link drought intensity with drought impact information to derive meaningful thresholds. While the focus regarding drought impact variables lies on text-based impact reports from the European Drought Impact report Inventory (EDII) and the US Drought Impact Reporter (DIR), the information gain through exploiting other variables such as agricultural yield statistics and remotely sensed vegetation indices is explored. First results reveal interesting insights into the complex relationship between drought indicators and impacts and highlight differences among drought impact variables and geographies. Although a simple intensity threshold evoking specific drought impacts cannot be identified, developing drought impact functions helps to elucidate how drought conditions relate to ecological or socioeconomic impacts. Such knowledge may provide guidance for inferring meaningful triggers for drought M&EW and could have potential for a wide range of drought management applications (for example, building drought scenarios for testing the resilience of drought plans or water supply systems).
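
    As a minimal empirical stand-in for the statistical models mentioned in step (3), one can bin an SPI-type drought indicator, compute the fraction of periods with reported impacts per bin, and read off the wettest bin at which impacts become frequent. The function, data, and binning rule below are illustrative assumptions, not the DrIVER methodology.

```python
def impact_threshold(spi_values, impact_flags, target=0.5, bin_width=0.5):
    """Bin SPI values, compute the fraction of periods with reported
    impacts per bin, and return the highest (wettest) SPI bin centre whose
    impact frequency reaches `target`, or None if no bin qualifies."""
    bins = {}
    for spi, hit in zip(spi_values, impact_flags):
        b = round(spi / bin_width) * bin_width   # bin centre
        n, k = bins.get(b, (0, 0))
        bins[b] = (n + 1, k + (1 if hit else 0))
    qualifying = [b for b, (n, k) in bins.items() if n and k / n >= target]
    return max(qualifying) if qualifying else None
```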

  14. Magnetotelluric Detection Thresholds as a Function of Leakage Plume Depth, TDS and Volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, X.; Buscheck, T. A.; Mansoor, K.

    We conducted a synthetic magnetotelluric (MT) data analysis to establish a set of specific thresholds of plume depth, TDS concentration, and volume for detection of brine and CO2 leakage from legacy wells into shallow aquifers, in support of Strategic Monitoring Subtask 4.1 of the US DOE National Risk Assessment Partnership (NRAP Phase II), which is to develop geophysical forward modeling tools. 900 synthetic MT data sets span 9 plume depths, 10 TDS concentrations, and 10 plume volumes. The monitoring protocol consisted of 10 MT stations in a 2×5 grid laid out along the flow direction. We model the MT response in the audio frequency range of 1 Hz to 10 kHz with a 50 Ωm baseline resistivity and a maximum depth of up to 2000 m. Scatter plots show the MT detection thresholds for a trio of plume depth, TDS concentration, and volume. Plumes with a large volume and high TDS located at a shallow depth produce a strong MT signal. We demonstrate that the MT method with surface-based sensors can detect a brine and CO2 plume so long as the plume depth, TDS concentration, and volume are above the thresholds. However, it is unlikely to detect a plume at a depth greater than 1000 m when the change in TDS concentration is smaller than 10%. Simulated aquifer impact data based on the Kimberlina site provide a more realistic view of the leakage plume distribution than the rectangular synthetic plumes in this sensitivity study; they will be used to estimate MT responses over simulated brine and CO2 plumes and to evaluate leakage detectability. Integration of the simulated aquifer impact data and the MT method into the NRAP DREAM tool may provide an optimized MT survey configuration for MT data collection. This study presents a viable approach to sensitivity studies of geophysical monitoring methods for leakage detection, enabling rapid assessment of leakage detectability.

  15. Which type of leader do I support in step-level public good dilemmas? The roles of level of threshold and trust.

    PubMed

    De Cremer, David

    2007-02-01

    The present research examined the moderating effect of the level of threshold on people's preferences for different leader types in step-level public good dilemmas. It was assumed that the primary focus of people in step-level public good dilemmas is to make sure that the group surpasses the threshold. Consequently, when the level of threshold is difficult to reach, people are expected to provide more support for and cooperate with a leader who monitors and controls the contributions made toward the public good. However, if the threshold is easy to surpass, people will focus more on whether the obtained public good or bonus will be distributed according to agreements, suggesting that people will provide more support to and cooperate with a leader who monitors and controls the distribution of the bonus. These predictions were confirmed across two experiments using a step-level public good paradigm with a dichotomous (Study 1) and a continuous (Study 2) contribution choice. Moreover, the results also revealed that perceptions of trust accounted, in part, for the effect of the level of threshold on people's leadership preferences.

  16. Surveillance Monitoring Management for General Care Units: Strategy, Design, and Implementation.

    PubMed

    McGrath, Susan P; Taenzer, Andreas H; Karon, Nancy; Blike, George

    2016-07-01

    The growing number of monitoring devices, combined with suboptimal patient monitoring and alarm management strategies, has increased "alarm fatigue," which has led to serious consequences. Most reported alarm management approaches have focused on the critical care setting. Since 2007 Dartmouth-Hitchcock (Lebanon, New Hampshire) has developed a generalizable and effective design, implementation, and performance evaluation approach to alarm systems for continuous monitoring in general care settings (that is, patient surveillance monitoring). In late 2007, a patient surveillance monitoring system was piloted, on the basis of a structured design and implementation approach, in a 36-bed orthopedics unit. Beginning in early 2009, it was expanded to cover more than 200 inpatient beds in all medicine and surgical units, except for psychiatry and labor and delivery. Improvements in clinical outcomes (reduction of unplanned transfers by 50% and reduction of rescue events by more than 60% in 2008) and approximately two alarms per patient per 12-hour nursing shift in the original pilot unit have been sustained across most D-H general care units in spite of increasing patient acuity and unit occupancy. Sample analysis of pager notifications indicates that more than 85% of all alarm conditions are resolved within 30 seconds and that more than 99% are resolved before escalation is triggered. The D-H surveillance monitoring system employs several important, generalizable features to manage alarms in a general care setting: alarm delays, static thresholds set appropriately for the prevalence of events in this setting, directed alarm annunciation, and policy-driven customization of thresholds to allow clinicians to respond to the needs of individual patients. The systematic approach to design, implementation, and performance management has been key to the success of the system.
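
    The "alarm delay" feature described above (annunciate only when a vital sign stays beyond its threshold for a sustained period, suppressing transient artifacts) can be sketched as follows. The class, parameter names, and values are illustrative assumptions, not Dartmouth-Hitchcock's implementation.

```python
class DelayedAlarm:
    """Annunciate only when a value stays below its threshold for a
    sustained period (the alarm-delay idea). Illustrative sketch only."""

    def __init__(self, threshold, delay_s):
        self.threshold = threshold        # e.g. an SpO2 lower limit
        self.delay_s = delay_s            # sustained-breach time required
        self._breach_start = None

    def update(self, value, t_s):
        """Feed one sample (value at time t_s seconds); True means alarm."""
        if value < self.threshold:
            if self._breach_start is None:
                self._breach_start = t_s  # breach begins; start the clock
            return t_s - self._breach_start >= self.delay_s
        self._breach_start = None         # recovered; reset the clock
        return False
```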

  17. A monitoring system for vegetable greenhouses based on a wireless sensor network.

    PubMed

    Li, Xiu-hong; Cheng, Xiao; Yan, Ke; Gong, Peng

    2010-01-01

    A wireless sensor network-based automatic monitoring system is designed for monitoring the life conditions of greenhouse vegetables. The complete system architecture includes a group of sensor nodes, a base station, and an internet data center. For the design of the wireless sensor nodes, the JN5139 microprocessor is adopted as the core component and the Zigbee protocol is used for wireless communication between nodes. With an ARM7 microprocessor and the embedded ZKOS operating system, a proprietary gateway node is developed to achieve data influx, screen display, system configuration, and GPRS-based remote data forwarding. Through a client/server mode, the management software for the remote data center achieves real-time data distribution and time-series analysis. In addition, a GSM-short-message-based interface is developed for sending real-time environmental measurements and for alarming when a measurement is beyond some pre-defined threshold. The whole system has been tested for over one year and satisfactory results have been observed, which indicate that this system is very useful for greenhouse environment monitoring.

  18. Software thresholds alter the bias of actigraphy for monitoring sleep in team-sport athletes.

    PubMed

    Fuller, Kate L; Juliff, Laura; Gore, Christopher J; Peiffer, Jeremiah J; Halson, Shona L

    2017-08-01

    Actical® actigraphy is commonly used to monitor athlete sleep. The proprietary software, called Actiware®, processes data with three different sleep-wake thresholds (Low, Medium or High), but there is no standardisation regarding their use. The purpose of this study was to examine the validity and bias of the sleep-wake thresholds for processing Actical® sleep data in team sport athletes. This was a validation study comparing actigraphy against the accepted gold standard, polysomnography (PSG). Sixty-seven nights of sleep were recorded simultaneously with polysomnography and Actical® devices. Individual-night data were compared across five sleep measures for each sleep-wake threshold using Actiware® software. Accuracy of each sleep-wake threshold compared with PSG was evaluated from the mean bias with 95% confidence limits, the Pearson product-moment correlation, and the associated standard error of the estimate. The Medium threshold generated the smallest mean bias compared with polysomnography for total sleep time (8.5 min), sleep efficiency (1.8%) and wake after sleep onset (-4.1 min), whereas the Low threshold had the smallest bias (7.5 min) for wake bouts. Bias in sleep onset latency was the same across thresholds (-9.5 min). The standard error of the estimate was similar across all thresholds: total sleep time ∼25 min, sleep efficiency ∼4.5%, wake after sleep onset ∼21 min, and wake bouts ∼8 counts. Sleep parameters measured by the Actical® device are greatly influenced by the sleep-wake threshold applied. In the present study the Medium threshold produced the smallest bias for most parameters compared with PSG. Given the magnitude of measurement variability, confidence limits should be employed when interpreting changes in sleep parameters. Copyright © 2017 Sports Medicine Australia. All rights reserved.
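
    The threshold-scoring idea can be sketched as follows: each epoch's activity count is combined with weighted contributions from its neighbours and compared against the selected sensitivity threshold. The neighbour weights and the Low/Medium/High values below follow commonly published descriptions of 1-min actigraphy scoring and should be treated as assumptions, not the vendor's exact Actiware® algorithm.

```python
# Assumed sensitivity thresholds and neighbour weights (epochs -2 .. +2).
THRESHOLDS = {"low": 20.0, "medium": 40.0, "high": 80.0}
WEIGHTS = [0.04, 0.2, 1.0, 0.2, 0.04]

def score_epochs(counts, sensitivity="medium"):
    """Return 'W' (wake) or 'S' (sleep) per epoch: wake whenever the
    weighted sum of the epoch and its neighbours exceeds the threshold."""
    thr = THRESHOLDS[sensitivity]
    out = []
    for i in range(len(counts)):
        s = 0.0
        for w, j in zip(WEIGHTS, range(i - 2, i + 3)):
            if 0 <= j < len(counts):        # ignore neighbours off the ends
                s += w * counts[j]
        out.append("W" if s > thr else "S")
    return out
```

A lower threshold scores more epochs as wake, which is why the choice of sensitivity shifts total sleep time and wake after sleep onset, as the abstract reports.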

  19. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vedam, S.; Archambault, L.; Starkschall, G.

    2007-11-15

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case.
Phantom motion tests yielded coincidence of simulation and delivery gate thresholds to within 0.3%. For patient data analysis, differences between simulation and delivery gate thresholds are reported as a fraction of the total respiratory motion range. For the smaller phase interval, the differences between simulation and delivery gate thresholds are 8 ± 11% and 14 ± 21% with and without audio-visual biofeedback, respectively, when the simulation gate threshold is determined based on the mean respiratory displacement within the 40%-60% gating phase interval. For the longer phase interval, corresponding differences are 4 ± 7% and 8 ± 15% with and without audio-visual biofeedback, respectively. Alternatively, when the simulation gate threshold is determined based on the maximum average respiratory displacement within the gating phase interval, greater differences between simulation and delivery gate thresholds are observed. A relationship between retrospective simulation gate threshold and prospective delivery gate threshold for respiratory gating is established and validated for regular and nonregular respiratory motion. Using this relationship, the delivery gate threshold can be reliably estimated at the time of 4D CT simulation, thereby improving the accuracy and efficiency of respiratory-gated radiation delivery.

  20. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation.

    PubMed

    Vedam, S; Archambault, L; Starkschall, G; Mohan, R; Beddar, S

    2007-11-01

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. 
Phantom motion tests yielded coincidence of simulation and delivery gate thresholds to within 0.3%. For patient data analysis, differences between simulation and delivery gate thresholds are reported as a fraction of the total respiratory motion range. For the smaller phase interval, the differences between simulation and delivery gate thresholds are 8 +/- 11% and 14 +/- 21% with and without audio-visual biofeedback, respectively, when the simulation gate threshold is determined based on the mean respiratory displacement within the 40%-60% gating phase interval. For the longer phase interval, corresponding differences are 4 +/- 7% and 8 +/- 15% with and without audiovisual biofeedback, respectively. Alternatively, when the simulation gate threshold is determined based on the maximum average respiratory displacement within the gating phase interval, greater differences between simulation and delivery gate thresholds are observed. A relationship between retrospective simulation gate threshold and prospective delivery gate threshold for respiratory gating is established and validated for regular and nonregular respiratory motion. Using this relationship, the delivery gate threshold can be reliably estimated at the time of 4D CT simulation, thereby improving the accuracy and efficiency of respiratory-gated radiation delivery.
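The duty-cycle matching described in this abstract can be sketched as follows. This is an illustrative simplification, not the authors' code: the iterative threshold search is replaced by an equivalent quantile lookup, and low displacement is assumed to correspond to the gated (exhale) portion of the cycle.

```python
import numpy as np

def delivery_gate_threshold(phase, displacement, lo=0.4, hi=0.6):
    """Sketch: map a retrospective phase-based gating interval [lo, hi]
    to a prospective displacement-based delivery gate threshold by
    matching duty cycles (illustrative, not the authors' implementation)."""
    phase = np.asarray(phase, dtype=float)
    disp = np.asarray(displacement, dtype=float)
    in_phase = (phase >= lo) & (phase <= hi)
    # Simulation gate threshold: mean residual displacement within the
    # gating phase interval (option (a) in the abstract; option (b)
    # would use the maximum instead).
    sim_threshold = disp[in_phase].mean()
    # Duty cycle: fraction of samples inside BOTH the phase interval and
    # the simulation gate threshold (low displacement assumed gated).
    duty_cycle = float(np.mean(in_phase & (disp <= sim_threshold)))
    # Delivery gate threshold: displacement level whose below-threshold
    # fraction reproduces that duty cycle (closed-form stand-in for the
    # iterative matching described in the abstract).
    delivery_threshold = float(np.quantile(disp, duty_cycle))
    return sim_threshold, delivery_threshold, duty_cycle
```

At delivery time, the beam would then be gated on whenever the external monitor displacement falls below `delivery_threshold`.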

  1. Real time observation system for monitoring environmental impact on marine ecosystems from oil drilling operations.

    PubMed

    Godø, Olav Rune; Klungsøyr, Jarle; Meier, Sonnich; Tenningen, Eirik; Purser, Autun; Thomsen, Laurenz

    2014-07-15

Environmental awareness and technological advances have spurred the development of new monitoring solutions for the petroleum industry. This paper presents experience from a monitoring program off Norway. To maintain operation within the limits of government regulations, Statoil tested a new monitoring concept. Multisensor data were cabled to surface buoys and transmitted to land via wireless communication. The system collected information about the distribution of drilling wastes and the welfare of corals in relation to threshold values. The project experienced a series of failures, but backup monitoring provided the information needed to fulfil the requirements of the permit. The experience demonstrated the need for real-time monitoring and showed how such systems enhance understanding of impacts on marine organisms. Drilling operations may also improve by taking environmental information into account. The paper proposes to standardize and streamline monitoring protocols to maintain comparability during all phases of the operation and between drill sites. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Development of Thresholds and Exceedance Probabilities for Influent Water Quality to Meet Drinking Water Regulations

    NASA Astrophysics Data System (ADS)

    Reeves, K. L.; Samson, C.; Summers, R. S.; Balaji, R.

    2017-12-01

Drinking water treatment utilities (DWTUs) are tasked with the challenge of meeting disinfection and disinfection byproduct (DBP) regulations to provide safe, reliable drinking water under changing climate and land surface characteristics. DBPs form in drinking water when disinfectants, commonly chlorine, react with organic matter as measured by total organic carbon (TOC); physical removal of pathogenic microorganisms is achieved by filtration and monitored via turbidity removal. Turbidity and TOC in influent waters to DWTUs are expected to increase due to variable climate and more frequent fires and droughts. Traditional methods for forecasting turbidity and TOC require catchment-specific data (e.g., streamflow) and have difficulty predicting them under a non-stationary climate. A modelling framework was developed to assist DWTUs with assessing their risk of future compliance with disinfection and DBP regulations under changing climate. A local polynomial method was developed to predict surface water TOC using climate data collected from NOAA, Normalized Difference Vegetation Index (NDVI) data from the IRI Data Library, and historical TOC data from three DWTUs in diverse geographic locations. Characteristics from the DWTUs were used in the EPA Water Treatment Plant model to determine thresholds for influent TOC that resulted in DBP concentrations within compliance. Lastly, extreme value theory was used to predict probabilities of threshold exceedances under the current climate. Results from the utilities were used to produce a generalized TOC threshold approach that requires only water temperature and bromide concentration. The threshold exceedance model will be used to estimate probabilities of exceedances under projected climate scenarios.
Initial results show that TOC can be forecasted using widely available data via statistical methods, where temperature, precipitation, Palmer Drought Severity Index, and NDVI with various lags were shown to be important predictors of TOC, and TOC thresholds can be determined using water temperature and bromide concentration. Results include a model to predict influent turbidity and turbidity thresholds, similar to the TOC models, as well as probabilities of threshold exceedances for TOC and turbidity under changing climate.
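A minimal sketch of the exceedance computation underlying such a framework (illustrative only; the study fits extreme-value distributions to the tail rather than relying on raw empirical frequencies):

```python
import numpy as np

def exceedance_probability(toc_series, toc_threshold):
    """Empirical probability that influent TOC exceeds the compliance
    threshold (sketch; an extreme-value tail fit would replace this
    for rare events)."""
    toc = np.asarray(toc_series, dtype=float)
    return float(np.mean(toc > toc_threshold))

def mean_recurrence(p_exceed, samples_per_year=12):
    """Average years between exceedances, assuming (for illustration)
    monthly observations."""
    return 1.0 / (p_exceed * samples_per_year)
```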

  3. Dynamic Self-adaptive Remote Health Monitoring System for Diabetics

    PubMed Central

    Suh, Myung-kyung; Moin, Tannaz; Woodbridge, Jonathan; Lan, Mars; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid

    2016-01-01

Diabetes is the seventh leading cause of death in the United States. In 2010, about 1.9 million new cases of diabetes were diagnosed in people aged 20 years or older. Remote health monitoring systems can help diabetics and their healthcare professionals monitor health-related measurements by providing real-time feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in remote health monitoring. This paper presents a task optimization technique used in WANDA (Weight and Activity with Blood Pressure and Other Vital Signs), a wireless health project that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. WANDA applies data analytics in real time to improve the quality of care. The developed algorithm minimizes the number of daily tasks required of diabetic patients using association rules that satisfy a minimum support threshold. Each retained task maximizes information gain, thereby improving the overall level of care. Experimental results show that the developed algorithm can reduce the number of tasks by up to 28.6% with a minimum support of 0.95 and a minimum confidence of 0.97, at high efficiency. PMID:23366365
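A hedged sketch of the association-rule idea: compute support and confidence of rules over daily task records and drop tasks that other tasks predict reliably. Task names, data layout, and the redundancy criterion here are assumptions for illustration, not WANDA's actual implementation.

```python
def rule_stats(records, antecedent, consequent):
    """Support and confidence of the rule antecedent -> {consequent}
    over a list of per-day sets of completed tasks (illustrative)."""
    n = len(records)
    n_ante = sum(1 for r in records if antecedent <= r)
    n_both = sum(1 for r in records if antecedent <= r and consequent in r)
    support = n_both / n
    confidence = n_both / n_ante if n_ante else 0.0
    return support, confidence

def redundant_tasks(records, tasks, min_support=0.95, min_confidence=0.97):
    """A task is a removal candidate when some other single task predicts
    it with sufficient support and confidence (assumed criterion)."""
    drop = set()
    for t in tasks:
        for other in tasks - {t} - drop:
            s, c = rule_stats(records, {other}, t)
            if s >= min_support and c >= min_confidence:
                drop.add(t)
                break
    return drop
```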

  4. Use of email and telephone prompts to increase self-monitoring in a Web-based intervention: randomized controlled trial.

    PubMed

    Greaney, Mary L; Sprunck-Harrild, Kim; Bennett, Gary G; Puleo, Elaine; Haines, Jess; Viswanath, K Vish; Emmons, Karen M

    2012-07-27

    Self-monitoring is a key behavior change mechanism associated with sustained health behavior change. Although Web-based interventions can offer user-friendly approaches for self-monitoring, engagement with these tools is suboptimal. Increased use could encourage, promote, and sustain behavior change. To determine whether email prompts or email plus telephone prompts increase self-monitoring of behaviors on a website created for a multiple cancer risk reduction program. We recruited and enrolled participants (N = 100) in a Web-based intervention during a primary care well visit at an urban primary care health center. The frequency of daily self-monitoring was tracked on the study website. Participants who tracked at least one behavior 3 or more times during week 1 were classified as meeting the tracking threshold and were assigned to the observation-only group (OO, n = 14). This group was followed but did not receive prompts. Participants who did not meet the threshold during week 1 were randomly assigned to one of 2 prompting conditions: automated assistance (AA, n = 36) or automated assistance + calls (AAC, n = 50). During prompting periods (weeks 2-3), participants in the AA and AAC conditions received daily automated emails that encouraged tracking and two tailored self-monitoring reports (end of week 2, end of week 3) that provided feedback on tracking frequency. Individuals in the AAC condition also received two technical assistance calls from trained study staff. Frequency of self-monitoring was tracked from week 2 through week 17. Self-monitoring rates increased in both intervention conditions during prompting and declined when prompting ceased. Over the 16 weeks of observation, there was a significant between-group difference in the percentage who met the self-monitoring threshold each week, with better maintenance in the AAC than in the AA condition (P < .001). 
Self-monitoring rates were greater in the OO group than in either the AA or AAC condition (P < .001). Prompting can increase self-monitoring rates. The decrease in self-monitoring after the prompting period suggests that additional reminder prompts would be useful. The use of technical assistance calls appeared to have a greater effect in promoting self-monitoring at a therapeutic threshold than email reminders and the tailored self-monitoring reports alone. ClinicalTrials.gov NCT01415492; http://clinicaltrials.gov/ct2/show/NCT01415492 (Archived by WebCite at http://www.webcitation.org/68LOXOMe2).

  5. Rhetoric or Reality? Ethnic Monitoring in the "Threshold Assessment" of Teachers in England and Wales

    ERIC Educational Resources Information Center

    Menter, Ian; Hextall, Ian; Mahony, Pat

    2003-01-01

    Following the 1998 Green Paper on teachers' work, the UK government introduced Threshold Assessment of teachers in England and Wales in 2000. Teachers who met the Threshold standards were rewarded with a pay rise and access to an upper pay spine. At the time ministers gave assurances that equal opportunities would be taken very seriously in the…

  6. Automatic charge control system for satellites

    NASA Technical Reports Server (NTRS)

    Shuman, B. M.; Cohen, H. A.

    1985-01-01

The SCATHA and the ATS-5 and 6 spacecraft provided insights into the problem of spacecraft charging at geosynchronous altitudes. Reduction of the levels of both absolute and differential charging was indicated by the emission of low-energy neutral plasma. It is appropriate to complete the transition from experimental results to the development of a system that will sense the state of charge of a spacecraft and, when a predetermined threshold is reached, respond automatically to reduce it. A development program was initiated utilizing sensors comparable to the proton electrostatic analyzer, the surface potential monitor, and the transient pulse monitor that flew on SCATHA, combining their outputs through a microprocessor controller to operate a rapid-start, low-energy plasma source.

  7. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
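A minimal sketch of the two mechanisms described above — drift-compensated thresholds and a qualification width counter. All names are illustrative, not taken from the patent:

```python
class DriftCompensatedTrigger:
    """Sketch: the trigger threshold rides on the re-measured quiescent
    level, and a qualification width counter requires the criterion to
    hold for `width` consecutive samples before an event is recorded."""

    def __init__(self, quiescent, offset, width=3):
        self.quiescent = quiescent   # last measured quiescent signal level
        self.offset = offset         # trigger offset above quiescent
        self.width = width           # qualification width (consecutive hits)
        self.hits = 0

    def recalibrate(self, new_quiescent):
        # Periodic re-measurement shifts the threshold by the drift,
        # instead of letting the drift cause false triggers.
        self.quiescent = new_quiescent

    @property
    def threshold(self):
        return self.quiescent + self.offset

    def update(self, sample):
        self.hits = self.hits + 1 if sample > self.threshold else 0
        return self.hits >= self.width   # True => start recording event
```

`recalibrate` would be called on the time- or counter-based criteria the abstract mentions.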

  8. Checkpoint triggering in a computer system

    DOEpatents

    Cher, Chen-Yong

    2016-09-06

    According to an aspect, a method for triggering creation of a checkpoint in a computer system includes executing a task in a processing node of the computer system and determining whether it is time to read a monitor associated with a metric of the task. The monitor is read to determine a value of the metric based on determining that it is time to read the monitor. A threshold for triggering creation of the checkpoint is determined based on the value of the metric. Based on determining that the value of the metric has crossed the threshold, the checkpoint including state data of the task is created to enable restarting execution of the task upon a restart operation.
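The claimed flow might be sketched as follows; all function names are illustrative assumptions, not the patent's code:

```python
def checkpoint_loop(task_steps, read_metric, derive_threshold, save_checkpoint,
                    read_period=5):
    """Sketch: execute the task, periodically read a monitored metric,
    derive a checkpoint threshold from that value, and create a
    checkpoint once the metric crosses the threshold."""
    checkpoints = 0
    for i, step in enumerate(task_steps, start=1):
        step()                        # execute a unit of the task
        if i % read_period:           # not yet time to read the monitor
            continue
        value = read_metric()
        threshold = derive_threshold(value)
        if value >= threshold:        # metric crossed the threshold
            save_checkpoint()         # state data enables a later restart
            checkpoints += 1
    return checkpoints
```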

  9. Atrial Fibrillation Burden: Moving Beyond Atrial Fibrillation as a Binary Entity: A Scientific Statement From the American Heart Association.

    PubMed

    Chen, Lin Y; Chung, Mina K; Allen, Larry A; Ezekowitz, Michael; Furie, Karen L; McCabe, Pamela; Noseworthy, Peter A; Perez, Marco V; Turakhia, Mintu P

    2018-05-15

    Our understanding of the risk factors and complications of atrial fibrillation (AF) is based mostly on studies that have evaluated AF in a binary fashion (present or absent) and have not investigated AF burden. This scientific statement discusses the published literature and knowledge gaps related to methods of defining and measuring AF burden, the relationship of AF burden to cardiovascular and neurological outcomes, and the effect of lifestyle and risk factor modification on AF burden. Many studies examine outcomes by AF burden classified by AF type (paroxysmal versus nonparoxysmal); however, quantitatively, AF burden can be defined by longest duration, number of AF episodes during a monitoring period, and the proportion of time an individual is in AF during a monitoring period (expressed as a percentage). Current guidelines make identical recommendations for anticoagulation regardless of AF pattern or burden; however, a review of recent evidence suggests that higher AF burden is associated with higher risk of stroke. It is unclear whether the risk increases continuously or whether a threshold exists; if a threshold exists, it has not been defined. Higher burden of AF is also associated with higher prevalence and incidence of heart failure and higher risk of mortality, but not necessarily lower quality of life. A structured and comprehensive risk factor management program targeting risk factors, weight loss, and maintenance of a healthy weight appears to be effective in reducing AF burden. Despite this growing understanding of AF burden, research is needed into validation of definitions and measures of AF burden, determination of the threshold of AF burden that results in an increased risk of stroke that warrants anticoagulation, and discovery of the mechanisms underlying the weak temporal correlations of AF and stroke. 
Moreover, developments in monitoring technologies will likely change the landscape of long-term AF monitoring and could allow better definition of the significance of changes in AF burden over time. © 2018 American Heart Association, Inc.

  10. Current concepts in the pathophysiology, evaluation, and diagnosis of compartment syndrome

    NASA Technical Reports Server (NTRS)

    Hargens, A. R.; Mubarak, S. J.

    1998-01-01

This article reviews present knowledge of the pathophysiology and diagnosis of acute compartment syndromes. Recent results using compression of legs in normal volunteers provide objective data concerning local pressure thresholds for neuromuscular dysfunction in the anterior compartment. Results with this model indicate that a progression of neuromuscular deficits occurs when intramuscular pressure (IMP) increases to within 35 to 40 mm Hg of diastolic blood pressure. These findings provide useful information on the diagnosis and compression thresholds for acute compartment syndromes. Time factors are also important, however, and usually are incompletely known in most cases of acute compartment syndrome. Although the slit catheter is a very good technique for monitoring IMP during rest, these catheters and their associated extracorporeal transducer systems are not ideal. Recently developed miniature transducer-tipped catheters and, perhaps, future development of noninvasive techniques may provide accurate recordings of IMP in patients with acute compartment syndromes.

  11. Spatial Patterns in Alternative States and Thresholds: A Missing Link for Management of Landscapes?

    USDA-ARS?s Scientific Manuscript database

    The detection of threshold dynamics (and other dynamics of interest) would benefit from explicit representations of spatial patterns of disturbance, spatial dependence in responses to disturbance, and the spatial structure of feedbacks in the design of monitoring and management strategies. Spatially...

  12. Pulse oximeter based mobile biotelemetry application.

    PubMed

    Işik, Ali Hakan; Güler, Inan

    2012-01-01

The quality and features of tele-homecare are improved by information and communication technologies. In this context, a pulse oximeter-based mobile biotelemetry application was developed. With this application, patients can measure their own oxygen saturation and heart rate at home through a Bluetooth pulse oximeter. The Bluetooth virtual serial port protocol is used to send the test results from the pulse oximeter to a smart phone. These data are converted into XML and transmitted to a remote web server database via the smart phone; GPRS, WLAN, or 3G can be used for transmission. A rule-based algorithm is used in the decision-making process. By default, the threshold value for oxygen saturation is 80, and the lower and upper heart rate thresholds are 40 and 150, respectively. If the patient's heart rate falls outside the threshold values or the oxygen saturation drops below the threshold value, an emergency SMS is sent to the doctor, who can then direct an ambulance to the patient. The doctor can change these threshold values for individual patients. The conversion of the evaluated data into the SMS XML template is performed on the web server. Another important component of the application is web-based monitoring of pulse oximeter data. The web page provides access to all patient data, so doctors can follow their patients and send e-mail regarding the evaluation of the disease. In addition, patients can follow their own data on this page. Eight patients have taken part in the procedure. It is believed that the developed application will facilitate pulse oximeter-based measurement from anywhere and at any time.
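The default alerting rule can be stated compactly. This sketch uses the paper's stated default thresholds and omits the per-patient adjustment and SMS plumbing:

```python
def emergency_sms_needed(spo2, heart_rate,
                         spo2_min=80, hr_min=40, hr_max=150):
    """Rule-based check with the paper's defaults: alert when oxygen
    saturation falls below 80% or the heart rate leaves the 40-150 bpm
    range (boundary handling here is an assumption)."""
    return spo2 < spo2_min or not (hr_min <= heart_rate <= hr_max)
```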

  13. Detection and classification of alarm threshold violations in condition monitoring systems working in highly varying operational conditions

    NASA Astrophysics Data System (ADS)

    Strączkiewicz, M.; Barszcz, T.; Jabłoński, A.

    2015-07-01

All commonly used condition monitoring systems (CMS) enable defining alarm thresholds that enhance efficient surveillance and maintenance of the dynamic state of machinery. The thresholds are imposed on measured values such as vibration-based indicators, temperature, pressure, etc. For complex machinery such as a wind turbine (WT), the total number of thresholds may run into the hundreds, multiplied by the number of operational states. All the parameters vary not only due to possible machinery malfunctions, but also due to changes in operating conditions, and the latter changes are typically much stronger than the former. Very often, such behavior may lead to hundreds of false alarms. Therefore, the authors propose a novel approach based on a parameterized description of the threshold violation. For this purpose, novelty and severity factors are introduced. The first parameter refers to the time of violation occurrence, while the second describes the impact of the indicator increase on the entire machine. This approach increases the reliability of the CMS by providing the operator with the most useful information about system events. The idea of the procedure is presented on simulated data similar to those from a wind turbine.
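One way such a parameterization might look; this is a speculative sketch, and the paper's exact definitions of the novelty and severity factors may well differ:

```python
def violation_factors(indicator, threshold, baseline):
    """Sketch of a novelty/severity parameterization of an
    alarm-threshold violation (illustrative definitions only).
    Returns None when the threshold was never violated."""
    over = [i for i, v in enumerate(indicator) if v > threshold]
    if not over:
        return None
    novelty = len(indicator) - over[0]                  # samples since first violation
    severity = (max(indicator) - baseline) / baseline   # relative indicator increase
    return novelty, severity
```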

  14. Epstein-barr virus DNAemia monitoring for the management of post-transplant lymphoproliferative disorder.

    PubMed

    Kalra, Amit; Roessner, Cameron; Jupp, Jennifer; Williamson, Tyler; Tellier, Raymond; Chaudhry, Ahsan; Khan, Faisal; Taparia, Minakshi; Jimenez-Zepeda, Victor H; Stewart, Douglas A; Daly, Andrew; Storek, Jan

    2018-05-01

    Post-transplant lymphoproliferative disorder (PTLD) is a potentially fatal complication of allogeneic hematopoietic cell transplantation (HCT). Epstein-Barr virus (EBV) reactivation (detectable DNAemia) predisposes to the development of PTLD. We retrospectively studied 306 patients monitored for EBV DNAemia after Thymoglobulin-conditioned HCT to determine the utility of the monitoring in the management of PTLD. DNAemia was monitored weekly for ≥12 weeks post-transplantation. Reactivation was detected in 82% of patients. PTLD occurred in 14% of the total patients (17% of patients with reactivation). PTLD was treated with rituximab only when and if the diagnosis was established. This allowed us to evaluate potential DNAemia thresholds for pre-emptive therapy. We suggest 100,000-500,000 IU per mL whole blood as this would result in unnecessary rituximab administration to only 4-20% of patients and near zero mortality due to PTLD. After starting rituximab (for diagnosed PTLD), sustained regression of PTLD occurred in 25/25 (100%) patients in whom DNAemia became undetectable. PTLD progressed or relapsed in 12/17 (71%) patients in whom DNAemia was persistently detectable. In conclusion, for pre-emptive therapy of PTLD, we suggest threshold DNAemia of 100,000-500,000 IU/mL. Persistently detectable DNAemia after PTLD treatment with rituximab appears to have 71% positive predictive value and 100% negative predictive value for PTLD progression/relapse. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
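The reported predictive values follow directly from the counts in the abstract, treating persistently detectable DNAemia after rituximab as the positive test for progression/relapse. This is a worked check, not the authors' code:

```python
def predictive_values(true_pos, false_pos, true_neg, false_neg):
    """Positive and negative predictive value from a 2x2 contingency table."""
    ppv = true_pos / (true_pos + false_pos)
    npv = true_neg / (true_neg + false_neg)
    return ppv, npv

# Counts from the abstract: 12/17 patients with persistently detectable
# DNAemia progressed or relapsed; 25/25 with undetectable DNAemia regressed.
ppv, npv = predictive_values(true_pos=12, false_pos=5, true_neg=25, false_neg=0)
# ppv ~ 0.71 (71%), npv = 1.0 (100%), matching the reported values.
```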

  15. Alternative life histories in the Atlantic salmon: genetic covariances within the sneaker sexual tactic in males.

    PubMed

    Páez, David James; Bernatchez, Louis; Dodson, Julian J

    2011-07-22

    Alternative reproductive tactics are ubiquitous in many species. Tactic expression often depends on whether an individual's condition surpasses thresholds that are responsible for activating particular developmental pathways. Two central goals in understanding the evolution of reproductive tactics are quantifying the extent to which thresholds are explained by additive genetic effects, and describing their covariation with condition-related traits. We monitored the development of early sexual maturation that leads to the sneaker reproductive tactic in Atlantic salmon (Salmo salar L.). We found evidence for additive genetic variance in the timing of sexual maturity (which is a measure of the surpassing of threshold values) and body-size traits. This suggests that selection can affect the patterns of sexual development by changing the timing of this event and/or body size. Significant levels of covariation between these traits also occurred, implying a potential for correlated responses to selection. Closer examination of genetic covariances suggests that the detected genetic variation is distributed along at least five directions of phenotypic variation. Our results show that the potential for evolution of the life-history traits constituting this reproductive phenotype is greatly influenced by their patterns of genetic covariance.

  16. Alternative life histories in the Atlantic salmon: genetic covariances within the sneaker sexual tactic in males

    PubMed Central

    Páez, David James; Bernatchez, Louis; Dodson, Julian J.

    2011-01-01

    Alternative reproductive tactics are ubiquitous in many species. Tactic expression often depends on whether an individual's condition surpasses thresholds that are responsible for activating particular developmental pathways. Two central goals in understanding the evolution of reproductive tactics are quantifying the extent to which thresholds are explained by additive genetic effects, and describing their covariation with condition-related traits. We monitored the development of early sexual maturation that leads to the sneaker reproductive tactic in Atlantic salmon (Salmo salar L.). We found evidence for additive genetic variance in the timing of sexual maturity (which is a measure of the surpassing of threshold values) and body-size traits. This suggests that selection can affect the patterns of sexual development by changing the timing of this event and/or body size. Significant levels of covariation between these traits also occurred, implying a potential for correlated responses to selection. Closer examination of genetic covariances suggests that the detected genetic variation is distributed along at least five directions of phenotypic variation. Our results show that the potential for evolution of the life-history traits constituting this reproductive phenotype is greatly influenced by their patterns of genetic covariance. PMID:21177685

  17. Display/control requirements for automated VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Hoffman, W. C.; Kleinman, D. L.; Young, L. R.

    1976-01-01

A systematic design methodology for pilot displays in advanced commercial VTOL aircraft was developed and refined. The analyst is provided with a step-by-step procedure for conducting conceptual display/control configuration evaluations for simultaneous monitoring and control pilot tasks. The approach consists of three phases: formulation of information requirements, configuration evaluation, and system selection. Both the monitoring and control performance models are based upon the optimal control model of the human operator. Extensions to the conventional optimal control model required by the display design methodology include explicit optimization of control/monitoring attention; simultaneous monitoring and control performance predictions; and indifference threshold effects. The methodology was applied to NASA's experimental CH-47 helicopter in support of the VALT program. The CH-47 application examined system performance under six flight conditions. Four candidate configurations are suggested for evaluation in pilot-in-the-loop simulations and eventual flight tests.

  18. Online fault diagnostics and testing of area gamma radiation monitor using wireless network

    NASA Astrophysics Data System (ADS)

    Reddy, Padi Srinivas; Kumar, R. Amudhu Ramesh; Mathews, M. Geo; Amarendra, G.

    2017-07-01

Periodic surveillance, checking, testing, and calibration of the installed Area Gamma Radiation Monitors (AGRM) in nuclear plants are mandatory. The functionality of the AGRM counting electronics and Geiger-Muller (GM) tube is to be monitored periodically. The present paper describes the development of online electronic calibration and testing of the GM tube from the control room. Two electronic circuits were developed, one for the AGRM electronic test and another for the AGRM detector test. A dedicated radiation data acquisition system was developed using an open platform communication server and data acquisition software. The Modbus RTU protocol over ZigBee-based wireless communication was used for online monitoring and testing. The AGRM electronic test allows three-point electronic calibration and verification of accuracy. The AGRM detector test is used to verify the GM threshold voltage and the plateau slope of the GM tube in situ. The real-time trend graphs generated during these tests clearly identified the state of health of the AGRM electronics and GM tube on a go/no-go basis. This method reduces the radiation exposure received by the maintenance crew and facilitates quick testing with minimum downtime of the instrument.

  19. Volcano early warning system based on MSG-SEVIRI multispectral data

    NASA Astrophysics Data System (ADS)

    Ganci, Gaetana; Vicari, Annamaria; Del Negro, Ciro

    2010-05-01

Spaceborne remote sensing of high-temperature volcanic features offers an excellent opportunity to monitor the onset and development of new eruptive activity. In particular, images with lower spatial but higher temporal resolution from meteorological satellites have proved to be a sound instrument for continuous monitoring of volcanic activity, even though the relevant volcanic features are much smaller than the nominal pixel size. The launch of the Spinning Enhanced Visible and Infrared Imager (SEVIRI) in August 2002, onboard the geosynchronous platforms MSG1 and MSG2, has opened a new perspective for near real-time volcano monitoring by providing images at 15-minute intervals. Indeed, in spite of the low spatial resolution (3 km2 at nadir), the high frequency of observations afforded by MSG SEVIRI was recently applied both to forest fire detection and to the monitoring of effusive volcanoes in Europe and Africa. Our Laboratory of Technologies (TecnoLab) at INGV-CT has been developing methods and know-how for the automated acquisition and management of MSG SEVIRI data. To provide a basis for real-time response during eruptive events, we designed and developed an automated system called HOTSAT. Our algorithm takes advantage of both spectral and spatial comparisons. First, we use an adaptive thresholding procedure, based on the spatial standard deviation computed over the immediate neighborhood of each pixel, to detect "potential" hot pixels. Second, detections are further assessed as true or false hotspots using additional threshold tests derived from the SEVIRI middle infrared (MIR, 3.9 μm) brightness temperatures, taking into account their statistical behavior. Following these procedures, all the computations are based on dynamic thresholds, reducing the number of false alarms due to atmospheric conditions. Our algorithm also allows the derivation of radiative power at all "hot" pixels.
This is carried out using the MIR radiance method introduced by Wooster et al. [2003] for forest fires, which is based on approximating Planck's law as a power law. No assumption is made about the thermal structure of the pixel. The radiant flux, i.e., the fire radiative power, is proportional to the calibrated radiance associated with the hot part of the pixel, computed as the difference between the observed hotspot pixel radiance in the SEVIRI MIR channel and the background radiance that would have been observed at the same location in the absence of thermal anomalies. The HOTSAT early warning system based on SEVIRI multispectral data is now ready to be employed in an operational volcano monitoring system. To validate and test the system, real cases on Mt Etna are presented.
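    A minimal sketch of the two-stage detection test described above, assuming a 2-D grid of MIR brightness temperatures in kelvin; the 3x3 window, the contrast factor, and the absolute cut-off are illustrative assumptions, not HOTSAT's actual parameters.

```python
import statistics

def detect_hotspots(bt_mir, k=3.0, abs_min=310.0):
    """Two-stage hotspot test on a 2-D grid of MIR brightness
    temperatures (K). Stage 1 flags pixels exceeding the local
    background by k spatial standard deviations; stage 2 keeps only
    pixels whose absolute MIR temperature also exceeds abs_min.
    Window size and factors are illustrative, not HOTSAT's values."""
    rows, cols = len(bt_mir), len(bt_mir[0])
    hits = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            # 3x3 neighbourhood, excluding the centre pixel itself
            nbrs = [bt_mir[i + di][j + dj]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0)]
            spatial_sd = statistics.pstdev(nbrs)
            background = statistics.fmean(nbrs)
            if (bt_mir[i][j] > background + k * spatial_sd
                    and bt_mir[i][j] > abs_min):
                hits.append((i, j))
    return hits
```

    The same dynamic-threshold idea extends to the confirmation stage by replacing the fixed `abs_min` with a statistic of the scene's MIR brightness temperatures.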

  20. A Monitoring System for Vegetable Greenhouses based on a Wireless Sensor Network

    PubMed Central

    Li, Xiu-hong; Cheng, Xiao; Yan, Ke; Gong, Peng

    2010-01-01

    A wireless sensor network-based automatic monitoring system is designed for monitoring the life conditions of greenhouse vegetables. The complete system architecture includes a group of sensor nodes, a base station, and an internet data center. For the wireless sensor node, the JN5139 microprocessor is adopted as the core component and the Zigbee protocol is used for wireless communication between nodes. With an ARM7 microprocessor and the embedded ZKOS operating system, a proprietary gateway node is developed to achieve data influx, screen display, system configuration and GPRS-based remote data forwarding. Through a client/server mode, the management software of the remote data center achieves real-time data distribution and time-series analysis. In addition, a GSM short-message interface is developed for sending real-time environmental measurements, and for alarming when a measurement goes beyond a pre-defined threshold. The whole system has been tested for over one year with satisfactory results, which indicates that it is very useful for greenhouse environment monitoring. PMID:22163391
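    The GSM alarm rule described above can be sketched as follows; the measurement names and [lo, hi] bands are hypothetical examples, not values from the paper.

```python
# Hypothetical per-measurement limits; the paper does not list its bands.
LIMITS = {"air_temp_c": (10.0, 35.0),
          "humidity_pct": (40.0, 90.0),
          "soil_moisture_pct": (20.0, 80.0)}

def check_alarms(reading, limits=LIMITS):
    """Return one alert string, suitable for an SMS body, for every
    measurement falling outside its pre-defined [lo, hi] band."""
    alerts = []
    for name, value in reading.items():
        lo, hi = limits[name]
        if not lo <= value <= hi:
            alerts.append(f"{name}={value} outside [{lo}, {hi}]")
    return alerts
```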

  1. A total patient monitoring system for point-of-care applications

    NASA Astrophysics Data System (ADS)

    Whitchurch, Ashwin K.; Abraham, Jose K.; Varadan, Vijay K.

    2007-04-01

    Traditionally, home care for chronically ill patients and the elderly requires periodic visits to the patient's home by doctors or healthcare personnel. During these visits, the visiting person usually records the patient's vital signs, decides on any change in treatment, and addresses any issues that the patient may have. Patient monitoring systems have since changed this scenario by significantly reducing the number of home visits without compromising continuous monitoring. This paper describes the design and development of a patient monitoring system capable of concurrent remote monitoring of 8 patient-worn sensors: electroencephalogram (EEG), electrocardiogram (ECG), temperature, airflow pressure, movement and chest expansion. These sensors provide vital signs useful for monitoring the health of chronically ill patients, and alerts can be raised if specified signal levels rise above or fall below preset threshold values. The data from all eight sensors are digitally transmitted to a PC or to a standalone network appliance, which relays the data through an available internet connection to the remote monitoring client, providing a real-time rendering of the patient's health at a remote location.

  2. Linking removal targets to the ecological effects of invaders: a predictive model and field test.

    PubMed

    Green, Stephanie J; Dulvy, Nicholas K; Brooks, Annabelle M L; Akins, John L; Cooper, Andrew B; Miller, Skylar; Côté, Isabelle M

    Species invasions have a range of negative effects on recipient ecosystems, and many occur at a scale and magnitude that preclude complete eradication. When complete extirpation is unlikely with available management resources, an effective strategy may be to suppress invasive populations below levels predicted to cause undesirable ecological change. We illustrated this approach by developing and testing targets for the control of invasive Indo-Pacific lionfish (Pterois volitans and P. miles) on Western Atlantic coral reefs. We first developed a size-structured simulation model of predation by lionfish on native fish communities, which we used to predict threshold densities of lionfish beyond which native fish biomass should decline. We then tested our predictions by experimentally manipulating lionfish densities above or below reef-specific thresholds, and monitoring the consequences for native fish populations on 24 Bahamian patch reefs over 18 months. We found that reducing lionfish below predicted threshold densities effectively protected native fish community biomass from predation-induced declines. Reductions in density of 25–92%, depending on the reef, were required to suppress lionfish below levels predicted to overconsume prey. On reefs where lionfish were kept below threshold densities, native prey fish biomass increased by 50–70%. Gains in small (<6 cm) size classes of native fishes translated into lagged increases in larger size classes over time. The biomass of larger individuals (>15 cm total length), including ecologically important grazers and economically important fisheries species, had increased by 10–65% by the end of the experiment. Crucially, similar gains in prey fish biomass were realized on reefs subjected to partial and full removal of lionfish, but partial removals took 30% less time to implement. By contrast, the biomass of small native fishes declined by >50% on all reefs with lionfish densities exceeding reef-specific thresholds. 
Large inter-reef variation in the biomass of prey fishes at the outset of the study, which influences the threshold density of lionfish, means that we could not identify a single rule of thumb for guiding control efforts. However, our model provides a method for setting reef-specific targets for population control using local monitoring data. Our work is the first to demonstrate that for ongoing invasions, suppressing invaders below densities that cause environmental harm can have a similar effect, in terms of protecting the native ecosystem on a local scale, to achieving complete eradication.

  3. Threshold magnitudes for a multichannel correlation detector in background seismicity

    DOE PAGES

    Carmichael, Joshua D.; Hartse, Hans

    2016-04-01

    Colocated explosive sources often produce correlated seismic waveforms. Multichannel correlation detectors identify these signals by scanning template waveforms recorded from known reference events against "target" data to find similar waveforms. This screening problem is challenged at thresholds required to monitor smaller explosions, often because non-target signals falsely trigger such detectors. Therefore, it is generally unclear what thresholds will reliably identify a target explosion while screening non-target background seismicity. Here, we estimate threshold magnitudes for hypothetical explosions located at the North Korean nuclear test site over six months of 2010, by processing International Monitoring System (IMS) array data with a multichannel waveform correlation detector. Our method (1) accounts for low amplitude background seismicity that falsely triggers correlation detectors but is unidentifiable with conventional power beams, (2) adapts to diurnally variable noise levels and (3) uses source-receiver reciprocity concepts to estimate thresholds for explosions spatially separated from the template source. Furthermore, we find that underground explosions with body wave magnitudes mb = 1.66 are detectable at the IMS array USRK with probability 0.99, when using template waveforms consisting only of P-waves, without false alarms. We conservatively find that these thresholds also increase by up to a magnitude unit for sources located 4 km or more from the February 12, 2013 announced nuclear test.
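    The core network statistic can be sketched as a per-channel normalized cross-correlation of the template against the target data, averaged across channels and compared with a detection threshold. The 0.8 threshold and the toy waveforms below are illustrative; the published detector additionally uses noise-adaptive thresholds and reciprocity-based magnitude estimation.

```python
import math

def ncc(template, data):
    """Normalized cross-correlation of a template against a longer
    trace: one Pearson correlation coefficient per candidate lag."""
    n = len(template)
    tm = sum(template) / n
    t0 = [x - tm for x in template]
    tnorm = math.sqrt(sum(x * x for x in t0))
    out = []
    for lag in range(len(data) - n + 1):
        win = data[lag:lag + n]
        wm = sum(win) / n
        w0 = [x - wm for x in win]
        wnorm = math.sqrt(sum(x * x for x in w0))
        num = sum(a * b for a, b in zip(t0, w0))
        out.append(num / (tnorm * wnorm) if tnorm * wnorm else 0.0)
    return out

def detect(templates, channels, threshold=0.8):
    """Multichannel detector: average the per-channel correlation
    traces into one network statistic and report the lags where it
    meets the (illustrative) detection threshold."""
    traces = [ncc(t, d) for t, d in zip(templates, channels)]
    stat = [sum(vals) / len(vals) for vals in zip(*traces)]
    return [lag for lag, s in enumerate(stat) if s >= threshold]
```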

  4. Development of ecological indicator guilds for land management

    USGS Publications Warehouse

    Krzysik, A.J.; Balbach, H.E.; Duda, J.J.; Emlen, J.M.; Freeman, D.C.; Graham, J.H.; Kovacic, D.A.; Smith, L.M.; Zak, J.C.

    2005-01-01

    Agency land-use must be efficiently and cost-effectively monitored to assess conditions and trends in ecosystem processes and natural resources relevant to mission requirements and legal mandates. Ecological Indicators represent important land management tools for tracking ecological changes and preventing irreversible environmental damage in disturbed landscapes. The overall objective of the research was to develop both individual and integrated sets (i.e., statistically derived guilds) of Ecological Indicators to: quantify habitat conditions and trends, track and monitor ecological changes, provide early warning or threshold detection, and provide guidance for land managers. The derivation of Ecological Indicators was based on statistical criteria, ecosystem relevance, reliability and robustness, economy and ease of use for land managers, multi-scale performance, and stress response criteria. The basis for the development of statistically based Ecological Indicators was the identification of ecosystem metrics that analytically tracked a landscape disturbance gradient.

  5. Real-time monitoring and short-term forecasting of drought in Norway

    NASA Astrophysics Data System (ADS)

    Kwok Wong, Wai; Hisdal, Hege

    2013-04-01

    Drought is considered to be one of the most costly natural disasters. Drought monitoring and forecasting are thus important for sound water management. In this study hydrological drought characteristics applicable for real-time monitoring and short-term forecasting of drought in Norway were developed. A spatially distributed hydrological model (HBV) implemented in a Web-based GIS framework provides a platform for drought analyses and visualizations. A number of national drought maps can be produced, which is a simple and effective way to communicate drought conditions to decision makers and the public. The HBV model is driven by precipitation and air temperature data. On a daily time step it calculates the water balance for 1 x 1 km2 grid cells characterized by their elevation and land use. Drought duration and areal drought coverage for runoff and subsurface storage (sum of soil moisture and groundwater) were derived. The threshold level method was used to specify drought conditions on a grid cell basis. The daily 10th percentile thresholds were derived from seven-day windows centered on that calendar day from the reference period 1981-2010 (threshold not exceeded 10% of the time). Each individual grid cell was examined to determine if it was below its respective threshold level. Daily drought-stricken areas can then be easily identified when visualized on a map. The drought duration can also be tracked and calculated by a retrospective analysis. Real-time observations from synoptic stations interpolated to a regular grid of 1 km resolution constituted the forcing data for the current situation. 9-day meteorological forecasts were used as input to the HBV model to obtain short-term hydrological drought forecasts. Downscaled precipitation and temperature fields from two different atmospheric models were applied. 
The first two days of the forecast period adopted forecasts from the Unified Model (UM4), while the following seven days were based on the 9-day forecasts from ECMWF. The approach has been tested and is now available on the Web for operational water management.
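    The threshold level method described above can be sketched as follows, assuming one value per calendar day per reference year; the nearest-rank percentile and year-end wrapping are implementation choices, not necessarily those of the study.

```python
import math

def q10_threshold(daily_by_year, day, window=7):
    """Drought threshold for one calendar day (0-based index): the
    10th percentile of all values in a `window`-day span centred on
    that day, pooled over the reference years (1981-2010 in the
    study; any mapping of year -> daily series works here)."""
    half = window // 2
    pool = []
    for series in daily_by_year.values():
        n = len(series)
        for off in range(-half, half + 1):
            pool.append(series[(day + off) % n])  # wrap at year ends
    pool.sort()
    # Nearest-rank empirical 10th percentile
    k = max(0, math.ceil(0.1 * len(pool)) - 1)
    return pool[k]
```

    A grid cell is then flagged as drought-stricken on any day whose observed value falls below its cell- and day-specific threshold.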

  6. On-Board GPS Clock Monitoring for Signal Integrity

    DTIC Science & Technology

    2010-11-01

    to-alert requirements to permit primary reliance for safety-of-life applications. Augmentation systems are being developed and deployed to address...virtually no false alerts from the combined system. With three running, on-board AFSs, occasional breaks of the error threshold can be allowed if the...system can be assured of transfer to another AFS within a period shorter than the required TTA. With only two AFSs on board running and measured

  7. Measuring milk fat content by random laser emission

    NASA Astrophysics Data System (ADS)

    Abegão, Luis M. G.; Pagani, Alessandra A. C.; Zílio, Sérgio C.; Alencar, Márcio A. R. C.; Rodrigues, José J.

    2016-10-01

    The luminescence spectra of milk containing rhodamine 6G are shown to exhibit typical signatures of random lasing when excited with 532 nm laser pulses. Experiments carried out on whole and skim forms of two commercial brands of UHT milk, with fat volume concentrations ranging from 0 to 4%, presented lasing threshold values dependent on the fat concentration, suggesting that a random laser technique can be developed to monitor such an important parameter.

  8. Measuring milk fat content by random laser emission.

    PubMed

    Abegão, Luis M G; Pagani, Alessandra A C; Zílio, Sérgio C; Alencar, Márcio A R C; Rodrigues, José J

    2016-10-12

    The luminescence spectra of milk containing rhodamine 6G are shown to exhibit typical signatures of random lasing when excited with 532 nm laser pulses. Experiments carried out on whole and skim forms of two commercial brands of UHT milk, with fat volume concentrations ranging from 0 to 4%, presented lasing threshold values dependent on the fat concentration, suggesting that a random laser technique can be developed to monitor such an important parameter.

  9. The FY 1980 Department of Defense Program for Research, Development, and Acquisition

    DTIC Science & Technology

    1979-02-01

    materiel. Up to a point, superior performance is an offset to this quantitative disadvantage. Lanchester's theory of warfare derived simplified relations...intermediate ranges. Underground Test. The next scheduled underground test (UGT), MINERS IRON, in FY 1980, will provide engineering and design data on...methods of discriminating between UGTs and earthquakes, and address U.S. capabilities to monitor both the existing Threshold Test Ban Treaty and the

  10. Dynamic Task Optimization in Remote Diabetes Monitoring Systems.

    PubMed

    Suh, Myung-Kyung; Woodbridge, Jonathan; Moin, Tannaz; Lan, Mars; Alshurafa, Nabil; Samy, Lauren; Mortazavi, Bobak; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid

    2012-09-01

    Diabetes is the seventh leading cause of death in the United States, but careful symptom monitoring can prevent adverse events. A real-time patient monitoring and feedback system is one of the solutions to help patients with diabetes and their healthcare professionals monitor health-related measurements and provide dynamic feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in the domain of remote health monitoring. This paper presents a wireless health project (WANDA) that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. The WANDA dynamic task management function applies data analytics in real time to discretize continuous features, applying data clustering and association rule mining techniques to manage a sliding window size dynamically and to prioritize required user tasks. The developed algorithm minimizes the number of daily action items required of patients with diabetes using association rules that satisfy minimum support, confidence, and conditional probability thresholds. Each of these tasks maximizes information gain, thereby improving the overall level of patient adherence and satisfaction. Experimental results from applying EM-based clustering and Apriori algorithms show that the developed algorithm can predict further events with higher confidence levels and reduce the number of user tasks by up to 76.19%.
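    The rule-filtering step can be sketched as follows: candidate one-antecedent, one-consequent association rules are kept only if they meet minimum support and confidence thresholds. The threshold values here are illustrative, and a full Apriori implementation would additionally prune larger candidate itemsets by support.

```python
from itertools import combinations

def support(itemset, transactions):
    """Fraction of transactions (sets of items) containing itemset."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def strong_rules(transactions, min_sup=0.4, min_conf=0.7):
    """Keep one-antecedent -> one-consequent rules A -> B whose
    support and confidence meet the (illustrative) minimums; each
    kept rule is (antecedent, consequent, support, confidence)."""
    items = sorted(set().union(*transactions))
    rules = []
    for a, b in combinations(items, 2):
        for ante, cons in ((a, b), (b, a)):
            sup = support({ante, cons}, transactions)
            base = support({ante}, transactions)
            if base and sup >= min_sup and sup / base >= min_conf:
                rules.append((ante, cons, round(sup, 2), round(sup / base, 2)))
    return rules
```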

  11. Dynamic Task Optimization in Remote Diabetes Monitoring Systems

    PubMed Central

    Suh, Myung-kyung; Woodbridge, Jonathan; Moin, Tannaz; Lan, Mars; Alshurafa, Nabil; Samy, Lauren; Mortazavi, Bobak; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid

    2016-01-01

    Diabetes is the seventh leading cause of death in the United States, but careful symptom monitoring can prevent adverse events. A real-time patient monitoring and feedback system is one of the solutions to help patients with diabetes and their healthcare professionals monitor health-related measurements and provide dynamic feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in the domain of remote health monitoring. This paper presents a wireless health project (WANDA) that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. The WANDA dynamic task management function applies data analytics in real time to discretize continuous features, applying data clustering and association rule mining techniques to manage a sliding window size dynamically and to prioritize required user tasks. The developed algorithm minimizes the number of daily action items required of patients with diabetes using association rules that satisfy minimum support, confidence, and conditional probability thresholds. Each of these tasks maximizes information gain, thereby improving the overall level of patient adherence and satisfaction. Experimental results from applying EM-based clustering and Apriori algorithms show that the developed algorithm can predict further events with higher confidence levels and reduce the number of user tasks by up to 76.19%. PMID:27617297

  12. A method for achieving an order-of-magnitude increase in the temporal resolution of a standard CRT computer monitor.

    PubMed

    Fiesta, Matthew P; Eagleman, David M

    2008-09-15

    As the frequency of a flickering light is increased, the perception of flicker is replaced by the perception of steady light at what is known as the critical flicker fusion threshold (CFFT). This threshold provides a useful measure of the brain's information processing speed, and has been used in medicine for over a century both for diagnostic and drug efficacy studies. However, the hardware for presenting the stimulus has not advanced to take advantage of computers, largely because the refresh rates of typical monitors are too slow to provide fine-grained changes in the alternation rate of a visual stimulus. For example, a cathode ray tube (CRT) computer monitor running at 100 Hz will render a new frame every 10 ms, thus restricting the period of a flickering stimulus to multiples of 20 ms. These multiples provide a temporal resolution far too low to make precise threshold measurements, since typical CFFT values are in the neighborhood of 35 ms. We describe here a simple and novel technique to enable alternating images at several closely-spaced periods on a standard monitor. The key to our technique is to programmatically control the video card to dynamically reset the refresh rate of the monitor. Different refresh rates allow slightly different frame durations; this can be leveraged to vastly increase the resolution of stimulus presentation times. This simple technique opens new inroads for experiments on computers that require more finely-spaced temporal resolution than a monitor at a single, fixed refresh rate can allow.
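    The frame arithmetic above can be sketched directly: at a fixed refresh rate, symmetric flicker periods are coarse multiples of two frame durations, while pooling the periods achievable across several refresh rates fills in intermediate values.

```python
def periods_ms(rate_hz, max_frames=6):
    """Symmetric flicker periods (equal on and off frames, so 2, 4,
    6, ... frames per cycle) achievable at one refresh rate, in ms."""
    frame = 1000.0 / rate_hz
    return [round(2 * k * frame, 2) for k in range(1, max_frames // 2 + 1)]
```

    At 100 Hz only 20, 40, 60, ... ms are available, while pooling the sets for 100, 110, and 120 Hz adds periods such as 33.33 and 36.36 ms near typical CFFT values.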

  13. Development of a Personal Integrated Environmental Monitoring System

    PubMed Central

    Wong, Man Sing; Yip, Tsan Pong; Mok, Esmond

    2014-01-01

    Environmental pollution in the urban areas of Hong Kong has become a serious public issue, but most urban inhabitants have no means of judging their own living environment in terms of danger thresholds and overall livability. Many low-cost sensors, such as ultraviolet, temperature and air quality sensors, now provide reasonably accurate data. In this paper, the development and evaluation of an Integrated Environmental Monitoring System (IEMS) are illustrated. This system consists of three components: (i) position determination and sensor data collection for real-time geospatial-based environmental monitoring; (ii) on-site data communication and visualization with the aid of an Android-based application; and (iii) data analysis on a web server. The system worked well during field tests on a bus journey and at a construction site. It provides an effective service platform for collecting environmental data in near real-time, and raises public awareness of environmental quality in micro-environments. PMID:25420154

  14. Development of intelligent monitoring purifier for indoor PM 2.5

    NASA Astrophysics Data System (ADS)

    Lou, Guanting; Zhu, Rong; Guo, Jiangwei; Wei, Yongqing

    2018-03-01

    Particulate matter 2.5 (PM2.5) refers to tiny particles or droplets in the air that are two and one half microns or less in width. PM2.5 is an air pollutant that is a concern for people's health when levels in air are high. The intelligent monitoring purifier was developed to detect indoor PM2.5 concentration before and after purification; the monitoring data are displayed on an LCD screen, with different color patterns according to the concentration. Through the Bluetooth transport module, real-time values can also be displayed on a mobile phone, with voice broadcast of the PM2.5 concentration level in the air. When the PM2.5 concentration is higher than the set threshold, the convection fan and its speed can be remotely controlled from the mobile phone over Bluetooth. The efficiency and reach of purification can therefore be enhanced, and better air quality achieved.
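    One common way to implement the threshold rule for the fan is a hysteresis band, which keeps readings hovering near a single cut-off from toggling the fan rapidly; the set points below are illustrative assumptions, not the product's, and the abstract itself describes only a single threshold.

```python
def fan_command(pm25_ugm3, fan_on, on_at=75.0, off_at=50.0):
    """Hysteresis control for the purifier fan: switch on above
    `on_at` ug/m3 and switch off only below `off_at`; inside the
    band, keep the current state. Set points are illustrative."""
    if pm25_ugm3 > on_at:
        return True
    if pm25_ugm3 < off_at:
        return False
    return fan_on  # inside the band: no change
```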

  15. Operation of a real-time warning system for debris flows in the San Francisco bay area, California

    USGS Publications Warehouse

    Wilson, Raymond C.; Mark, Robert K.; Barbato, Gary; ,

    1993-01-01

    The United States Geological Survey (USGS) and the National Weather Service (NWS) have developed an operational warning system for debris flows during severe rainstorms in the San Francisco Bay region. The NWS makes quantitative forecasts of precipitation from storm systems approaching the Bay area and coordinates a regional network of radio-telemetered rain gages. The USGS has formulated thresholds for the intensity and duration of rainfall required to initiate debris flows. The first successful public warnings were issued during a severe storm sequence in February 1986. Continued operation of the warning system since 1986 has provided valuable working experience in rainfall forecasting and monitoring, refined rainfall thresholds, and streamlined procedures for issuing public warnings. Advisory statements issued since 1986 are summarized.
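    Rainfall thresholds of this kind are often expressed as an intensity-duration power law; a sketch follows with illustrative coefficients (the actual USGS Bay-area thresholds are empirically derived and are not reproduced here).

```python
def exceeds_id_threshold(intensity_mm_hr, duration_hr, c=14.0, b=-0.39):
    """True if a storm's mean rainfall intensity (mm/h) sustained for
    `duration_hr` hours meets or exceeds a Caine-type power-law
    threshold I = c * D**b. The coefficients c and b are illustrative
    placeholders, not the USGS warning-system values."""
    return intensity_mm_hr >= c * duration_hr ** b
```

    The power law captures the empirical observation that short bursts must be intense to trigger debris flows, while prolonged storms can do so at lower intensities.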

  16. A better way to evaluate remote monitoring programs in chronic disease care: receiver operating characteristic analysis.

    PubMed

    Brown Connolly, Nancy E

    2014-12-01

    This foundational study applies receiver operating characteristic (ROC) analysis to evaluate the utility and predictive value of a disease management (DM) model that uses remote monitoring (RM) devices for chronic obstructive pulmonary disease (COPD). The literature identifies a need for a more rigorous method to validate and quantify evidence-based value for RM systems being used to monitor persons with a chronic disease. ROC analysis is an engineering approach widely applied in medical testing, but it has not been evaluated for its utility in RM. Classifiers (saturated peripheral oxygen [SPO2], blood pressure [BP], and pulse), optimum threshold, and predictive accuracy are evaluated based on patient outcomes. Parametric and nonparametric methods were used. Event-based patient outcomes included inpatient hospitalization, accident and emergency, and home health visits. Statistical analysis tools included Microsoft (Redmond, WA) Excel® and MedCalc® (MedCalc Software, Ostend, Belgium) version 12 © 1993-2013 to generate ROC curves and statistics. Persons with COPD were monitored a minimum of 183 days, with at least one inpatient hospitalization within 12 months prior to monitoring. Retrospective, de-identified patient data from a United Kingdom National Health Service COPD program were used. Datasets included biometric readings, alerts, and resource utilization. SPO2 was identified as a predictive classifier, with an optimal average threshold setting of 85-86%. BP and pulse were failed classifiers, and areas of design were identified that may improve utility and predictive capacity. A cost avoidance methodology was developed. Results can be applied to health services planning decisions, and the methods can be applied to system design and evaluation based on patient outcomes. This study validated the use of ROC analysis in RM program evaluation.
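    The ROC step can be sketched as follows, treating lower SpO2 readings as predictive of an adverse event and choosing the cut-off that maximizes Youden's J; the scores used below are synthetic, not the study's data.

```python
def roc_analysis(scores, labels):
    """ROC points and the Youden-optimal cut-off, treating LOWER
    scores (e.g. low SpO2) as predicting the adverse event; labels
    are 1 for an event, 0 otherwise."""
    pos = sum(labels)
    neg = len(labels) - pos
    points, best_cut, best_j = [], None, -1.0
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s <= cut and y)
        fp = sum(1 for s, y in zip(scores, labels) if s <= cut and not y)
        fpr, tpr = fp / neg, tp / pos
        points.append((fpr, tpr))
        if tpr - fpr > best_j:          # Youden's J = TPR - FPR
            best_j, best_cut = tpr - fpr, cut
    return points, best_cut
```

    Plotting the returned (FPR, TPR) points traces the ROC curve; the area under it summarizes the classifier's overall utility.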

  17. Process Control for Precipitation Prevention in Space Water Recovery Systems

    NASA Technical Reports Server (NTRS)

    Sargusingh, Miriam; Callahan, Michael R.; Muirhead, Dean

    2015-01-01

    The ability to recover and purify water through physicochemical processes is crucial for realizing long-term human space missions, including both planetary habitation and space travel. Because of their robust nature, rotary distillation systems have been actively pursued by NASA as one of the technologies for water recovery from wastewater primarily comprised of human urine. A specific area of interest is the prevention of the formation of solids that could clog fluid lines and damage rotating equipment. To mitigate the formation of solids, operational constraints keep the concentration of key precipitating ions in the wastewater brine below the theoretical precipitation threshold. This control is effected by limiting the amount of water recovered such that the risk of reaching the precipitation threshold stays within acceptable limits. The water recovery limit is based on an empirically derived worst-case wastewater composition. During the batch process, water recovery is estimated by monitoring the throughput of the system. NASA Johnson Space Center (JSC) is working on means of enhancing the process controls to increase water recovery. Options include more precise prediction of the precipitation threshold; to this end, JSC is developing a means of more accurately measuring the constituents of the brine and/or wastewater. Another option is to monitor the throughput of the system more accurately. In spring 2015, testing will be performed to evaluate strategies for optimizing water recovery without increasing the risk of solids formation in the brine.

  18. Effects of Local Compression on Peroneal Nerve Function in Humans

    NASA Technical Reports Server (NTRS)

    Hargens, Alan R.; Botte, Michael J.; Swenson, Michael R.; Gelberman, Richard H.; Rhoades, Charles E.; Akeson, Wayne H.

    1993-01-01

    A new apparatus was developed to compress the anterior compartment selectively and reproducibly in humans. Thirty-five normal volunteers were studied to determine short-term thresholds of local tissue pressure that produce significant neuromuscular dysfunction. Local tissue fluid pressure adjacent to the deep peroneal nerve was elevated by the compression apparatus and continuously monitored for 2-3 h by the slit catheter technique. Elevation of tissue fluid pressure to within 35-40 mm Hg of diastolic blood pressure (approx. 40 mm Hg of in situ pressure in our subjects) elicited a consistent progression of neuromuscular deterioration including, in order, (a) gradual loss of sensation, as assessed by Semmes-Weinstein monofilaments, (b) subjective complaints, (c) reduced nerve conduction velocity, (d) decreased action potential amplitude of the extensor digitorum brevis muscle, and (e) motor weakness of muscles within the anterior compartment. Generally, higher intracompartmental pressures caused more rapid deterioration of neuromuscular function. In two subjects, when in situ compression levels were 0 and 30 mm Hg, normal neuromuscular function was maintained for 3 h. Threshold pressures for significant dysfunction were not always the same for each functional parameter studied, and the magnitudes of each functional deficit did not always correlate with compression level. This variable tolerance to elevated pressure emphasizes the need to monitor clinical signs and symptoms carefully in the diagnosis of compartment syndromes. The present studies were short term; longer-term compression of myoneural tissues may result in dysfunction at lower pressure thresholds.

  19. Home blood pressure monitoring. Current knowledge and directions for future research.

    PubMed

    Reims, H; Fossum, E; Kjeldsen, S E; Julius, S

    2001-01-01

    Home blood pressure (BP) monitoring has become popular in clinical practice and several automated devices for home BP measurement are now recommendable. Home BP is generally lower than clinic BP, and similar to daytime ambulatory BP. Home BP measurement eliminates the white coat effect and provides a high number of readings, and it is considered more accurate and reproducible than clinic BP. It can improve the sensitivity and statistical power of clinical drug trials and may have a higher prognostic value than clinic BP. Home monitoring may improve compliance and BP control, and reduce costs of hypertension management. Diagnostic thresholds and treatment target values for home BP remain to be established by longitudinal studies. Until then, home BP monitoring is to be considered a supplement. However, high home BP may support or confirm the diagnosis made in the doctor's office, and low home BP may warrant ambulatory BP monitoring. During long-term follow-up, home BP monitoring provides an opportunity for close attention to BP levels and variations. The first international guidelines have established a consensus document with recommendations, including a proposal of preliminary diagnostic thresholds, but further research is needed to define the precise role of home BP monitoring in clinical practice.

  20. Thermal bistability-based method for real-time optimization of ultralow-threshold whispering gallery mode microlasers.

    PubMed

    Lin, Guoping; Candela, Y; Tillement, O; Cai, Zhiping; Lefèvre-Seguin, V; Hare, J

    2012-12-15

    A method based on thermal bistability for ultralow-threshold microlaser optimization is demonstrated. When sweeping the pump laser frequency across a pump resonance, the dynamic thermal bistability slows down the power variation. The resulting line shape modification enables a real-time monitoring of the laser characteristic. We demonstrate this method for a functionalized microsphere exhibiting a submicrowatt laser threshold. This approach is confirmed by comparing the results with a step-by-step recording in quasi-static thermal conditions.

  1. Detection and quantification system for monitoring instruments

    DOEpatents

    Dzenitis, John M [Danville, CA; Hertzog, Claudia K [Houston, TX; Makarewicz, Anthony J [Livermore, CA; Henderer, Bruce D [Livermore, CA; Riot, Vincent J [Oakland, CA

    2008-08-12

    A method of detecting real events by obtaining a set of recent signal results, calculating measures of the noise or variation based on the set of recent signal results, calculating an expected baseline value based on the set of recent signal results, determining sample deviation, calculating an allowable deviation by multiplying the sample deviation by a threshold factor, setting an alarm threshold from the baseline value plus or minus the allowable deviation, and determining whether the signal results exceed the alarm threshold.
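    The claimed sequence of steps maps directly to a short sketch; the median as the expected-baseline estimator and a threshold factor of 3 are assumptions here, since the claim does not prescribe particular estimators.

```python
import statistics

def alarm_band(recent, factor=3.0):
    """Alarm thresholds from a set of recent signal results: expected
    baseline (median here; the claim leaves the estimator open)
    plus or minus factor x sample deviation."""
    baseline = statistics.median(recent)
    allowed = factor * statistics.stdev(recent)
    return baseline - allowed, baseline + allowed

def is_event(value, recent, factor=3.0):
    """Flag a real event when a new result leaves the alarm band."""
    lo, hi = alarm_band(recent, factor)
    return value < lo or value > hi
```

    Because the band is recomputed from a sliding window of recent results, the alarm threshold adapts automatically to slow drifts in baseline and noise level.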

  2. First Evaluation of Infrared Thermography as a Tool for the Monitoring of Udder Health Status in Farms of Dairy Cows.

    PubMed

    Zaninelli, Mauro; Redaelli, Veronica; Luzi, Fabio; Bronzo, Valerio; Mitchell, Malcolm; Dell'Orto, Vittorio; Bontempo, Valentino; Cattaneo, Donata; Savoini, Giovanni

    2018-03-14

    The aim of the present study was to test infrared thermography (IRT), under field conditions, as a possible tool for the evaluation of cow udder health status. Thermographic images (n = 310) from different farms (n = 3) were collected and evaluated using a dedicated software application to calculate, automatically and in a standardized way, thermographic indices of each udder. The results confirmed a significant relationship between udder surface skin temperature (USST) and classes of somatic cell count in collected milk samples. Sensitivity and specificity in the classification of udder health were 78.6% and 77.9%, respectively, when a somatic cell count (SCC) of 200,000 cells/mL was used as the threshold to classify subclinical mastitis, and 71.4% and 71.6%, respectively, when a threshold of 400,000 cells/mL was adopted. Even though the sensitivity and specificity were lower than in other published papers dealing with non-automated analysis of IRT images, they were considered acceptable as a first field application of this new and developing technology. Future research will permit further improvements in the use of IRT at farm level. Such improvements could be attained through further image processing and enhancement, and the application of the indicators developed and tested in the present study, with the purpose of developing a monitoring system for the automatic and early detection of mastitis in individual animals on commercial farms.

  3. First Evaluation of Infrared Thermography as a Tool for the Monitoring of Udder Health Status in Farms of Dairy Cows

    PubMed Central

    Luzi, Fabio; Bronzo, Valerio; Mitchell, Malcolm; Dell’Orto, Vittorio; Bontempo, Valentino; Savoini, Giovanni

    2018-01-01

    The aim of the present study was to test infrared thermography (IRT), under field conditions, as a possible tool for the evaluation of cow udder health status. Thermographic images (n = 310) from different farms (n = 3) were collected and evaluated using a dedicated software application to calculate, automatically and in a standardized way, thermographic indices of each udder. The results confirmed a significant relationship between udder surface skin temperature (USST) and classes of somatic cell count in collected milk samples. Sensitivity and specificity in the classification of udder health were 78.6% and 77.9%, respectively, when a somatic cell count (SCC) of 200,000 cells/mL was used as the threshold to classify subclinical mastitis, and 71.4% and 71.6%, respectively, when a threshold of 400,000 cells/mL was adopted. Even though the sensitivity and specificity were lower than in other published papers dealing with non-automated analysis of IRT images, they were considered acceptable as a first field application of this new and developing technology. Future research will permit further improvements in the use of IRT at farm level. Such improvements could be attained through further image processing and enhancement, and the application of the indicators developed and tested in the present study, with the purpose of developing a monitoring system for the automatic and early detection of mastitis in individual animals on commercial farms. PMID:29538352

  4. Design and fabrication of prototype system for early warning of impending bearing failure

    NASA Technical Reports Server (NTRS)

    Meacher, J.; Chen, H. M.

    1974-01-01

    A test program was conducted with the objective of developing a method and equipment for on-line monitoring of installed ball bearings to detect deterioration or impending failure of the bearings. The program was directed at the spin-axis bearings of a control moment gyro. The bearings were tested at speeds of 6000 and 8000 rpm, thrust loads from 50 to 1000 pounds, with a wide range of lubrication conditions, with and without a simulated fatigue spall implanted in the inner race ball track. It was concluded that a bearing monitor system based on detection and analysis of modulations of a fault indicating bearing resonance frequency can provide a low threshold of sensitivity.

  5. Real-time identification of residential appliance events based on power monitoring

    NASA Astrophysics Data System (ADS)

    Yang, Zhao; Zhu, Zhicheng; Wei, Zhiqiang; Yin, Bo; Wang, Xiuwei

    2018-03-01

    Energy monitoring of specific home appliances is regarded as a prerequisite for reducing residential energy consumption. To improve the accuracy of identifying the operating status of household appliances and to keep pace with the development of the smart power grid, this paper proposes integrating electric current and power data into an existing algorithm. If the average power difference over several adjacent cycles departs from the baseline by more than a pre-assigned threshold value, an event is flagged. Simulations of domestic appliances on the MATLAB platform indicate that the power-based method achieves the desired appliance-identification accuracy.
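    The event-flagging rule in the abstract (compare the average power of several adjacent cycles against the preceding baseline and flag when the difference exceeds a pre-assigned threshold) might be sketched as below; the window length and the 50 W threshold are assumed placeholder values, not the paper's settings.

    ```python
    def detect_power_events(power_per_cycle, window=3, threshold_w=50.0):
        """Flag cycle indices where the mean power over `window` adjacent
        cycles departs from the preceding baseline window by more than
        `threshold_w` watts (window size and threshold are illustrative)."""
        events = []
        for i in range(window, len(power_per_cycle) - window + 1):
            baseline = sum(power_per_cycle[i - window:i]) / window   # preceding cycles
            current = sum(power_per_cycle[i:i + window]) / window    # adjacent cycles
            if abs(current - baseline) > threshold_w:
                events.append(i)                                     # flag the event
        return events
    ```

    A step from a 100 W load to a 300 W load is flagged, while a steady load produces no events.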

  6. Longitudinal Comparison of Auditory Steady-State Evoked Potentials in Preterm and Term Infants: The Maturation Process

    PubMed Central

    Sousa, Ana Constantino; Didoné, Dayane Domeneghini; Sleifer, Pricila

    2017-01-01

    Introduction Preterm neonates are at risk of changes in their auditory system development, which explains the need for auditory monitoring of this population. The Auditory Steady-State Response (ASSR) is an objective method for obtaining electrophysiological thresholds, with broad applicability in the neonatal and pediatric population. Objective The purpose of this study is to compare ASSR thresholds in preterm and term infants evaluated at two stages. Method The study included 63 normal-hearing neonates: 33 preterm and 30 term. They underwent ASSR assessment in both ears simultaneously through insert phones at frequencies of 500 to 4000 Hz, amplitude-modulated from 77 to 103 Hz. Stimulus intensity was decreased stepwise to detect the minimum response level. At 18 months, 26 of the 33 preterm infants returned for a new ASSR assessment and were compared with the 30 full-term infants. Groups were compared according to gestational age. Results Electrophysiological thresholds were higher in preterm than in full-term neonates (p < 0.05) at the first testing. There were no significant differences between ears or by gender. At 18 months, there was no difference between groups (p > 0.05) in any of the variables described. Conclusion In the first evaluation, preterm infants had higher ASSR thresholds. There was no difference at 18 months of age, showing the auditory maturation of preterm infants throughout their development. PMID:28680486

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua D.; Hartse, Hans

    Colocated explosive sources often produce correlated seismic waveforms. Multichannel correlation detectors identify these signals by scanning template waveforms recorded from known reference events against "target" data to find similar waveforms. This screening problem is challenged at the thresholds required to monitor smaller explosions, often because non-target signals falsely trigger such detectors. Therefore, it is generally unclear what thresholds will reliably identify a target explosion while screening non-target background seismicity. Here, we estimate threshold magnitudes for hypothetical explosions located at the North Korean nuclear test site over six months of 2010, by processing International Monitoring System (IMS) array data with a multichannel waveform correlation detector. Our method (1) accounts for low-amplitude background seismicity that falsely triggers correlation detectors but is unidentifiable with conventional power beams, (2) adapts to diurnally variable noise levels and (3) uses source-receiver reciprocity concepts to estimate thresholds for explosions spatially separated from the template source. Furthermore, we find that underground explosions with body wave magnitudes mb = 1.66 are detectable at the IMS array USRK with probability 0.99, when using template waveforms consisting only of P-waves, without false alarms. We conservatively find that these thresholds also increase by up to a magnitude unit for sources located 4 km or more from the Feb. 12, 2013 announced nuclear test.
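    A single-channel sketch of the correlation-screening step: the template is slid along the target data and detections are declared where the normalized cross-correlation reaches a threshold. The 0.7 threshold and the pure-Python scan are illustrative assumptions; the study's multichannel detector combines such statistics across array channels.

    ```python
    import math

    def correlation_detections(template, data, cc_threshold=0.7):
        """Return sample offsets where the normalized cross-correlation of the
        template against the target data reaches cc_threshold (assumed 0.7).
        Single-channel sketch of a multichannel correlation detector."""
        n = len(template)
        t_mean = sum(template) / n
        t0 = [x - t_mean for x in template]              # demeaned template
        t_norm = math.sqrt(sum(x * x for x in t0))
        hits = []
        for i in range(len(data) - n + 1):
            seg = data[i:i + n]
            s_mean = sum(seg) / n
            s0 = [x - s_mean for x in seg]               # demeaned data window
            s_norm = math.sqrt(sum(x * x for x in s0))
            if t_norm == 0 or s_norm == 0:
                continue                                 # flat window: cc undefined
            cc = sum(a * b for a, b in zip(t0, s0)) / (t_norm * s_norm)
            if cc >= cc_threshold:
                hits.append(i)                           # declare a detection
        return hits
    ```

    A scaled copy of the template embedded in zeros is recovered at the correct offset, since normalized correlation is amplitude-invariant.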

  8. Using natural range of variation to set decision thresholds: a case study for great plains grasslands

    USGS Publications Warehouse

    Symstad, Amy J.; Jonas, Jayne L.; Edited by Guntenspergen, Glenn R.

    2014-01-01

    Natural range of variation (NRV) may be used to establish decision thresholds or action assessment points when ecological thresholds are either unknown or do not exist for attributes of interest in a managed ecosystem. The process for estimating NRV involves identifying spatial and temporal scales that adequately capture the heterogeneity of the ecosystem; compiling data for the attributes of interest via study of historic records, analysis and interpretation of proxy records, modeling, space-for-time substitutions, or analysis of long-term monitoring data; and quantifying the NRV from those data. At least 19 National Park Service (NPS) units in North America’s Great Plains are monitoring plant species richness and evenness as indicators of vegetation integrity in native grasslands, but little information on natural, temporal variability of these indicators is available. In this case study, we use six long-term vegetation monitoring datasets to quantify the temporal variability of these attributes in reference conditions for a variety of Great Plains grassland types, and then illustrate the implications of using different NRVs based on these quantities for setting management decision thresholds. Temporal variability of richness (as measured by the coefficient of variation, CV) is fairly consistent across the wide variety of conditions occurring in Colorado shortgrass prairie to Minnesota tallgrass sand savanna (CV 0.20–0.45) and generally less than that of production at the same sites. Temporal variability of evenness spans a greater range of CV than richness, and it is greater than that of production in some sites but less in other sites. This natural temporal variability may mask undesirable changes in Great Plains grasslands vegetation. 
Consequently, we suggest that managers consider using a relatively narrow NRV (the interquartile range of all richness or evenness values observed in reference conditions) to designate a surveillance threshold, at which greater attention would be paid to the situation, and a broader NRV to designate management thresholds, at which action would be taken.
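    The NRV-based scheme above can be sketched as follows: the interquartile range of reference-condition observations serves as the narrower surveillance band, with a broader band for management thresholds. Using the full observed range for the broader band is an illustrative choice, not the chapter's prescription.

    ```python
    import statistics

    def nrv_decision_thresholds(reference_values):
        """Derive a narrow surveillance band (interquartile range) and a broader
        management band (here the full observed range, an illustrative choice)
        from reference-condition observations of richness or evenness."""
        q1, _, q3 = statistics.quantiles(reference_values, n=4)
        surveillance_band = (q1, q3)          # outside this: pay closer attention
        management_band = (min(reference_values), max(reference_values))  # outside this: act
        return surveillance_band, management_band
    ```

    The surveillance band always nests inside the management band, so a monitored value trips surveillance before it trips management action.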

  9. Structural Health Monitoring: Leveraging Pain in the Human Body

    NASA Astrophysics Data System (ADS)

    Nayak, Subhadarshi

    2012-07-01

    Tissue damage, or the perception thereof, is managed through the experience of pain. The neurobiological process of pain triggers highly effective defense mechanisms for our safety. Structural health monitoring (SHM) serves a very similar function, albeit in engineering systems, and SHM technology can leverage many aspects of pain mechanisms to progress in several critical areas. Discrimination between features from undamaged and damaged structures can follow the threshold-gate mechanism of pain perception. Furthermore, the sensing mechanisms can adapt to changes by adjusting the threshold, as pain perception does. A distributed sensor network, often advocated for SHM, can be made fault-tolerant and robust by following perception's principles of self-organization and redundancy. Data handling in real life is a huge challenge for large-scale SHM; just as sensory pain data are first filtered, thresholds can then be refined through the gathering and use of experiential information.

  10. Method and apparatus for monitoring a hydrocarbon-selective catalytic reduction device

    DOEpatents

    Schmieg, Steven J; Viola, Michael B; Cheng, Shi-Wai S; Mulawa, Patricia A; Hilden, David L; Sloane, Thompson M; Lee, Jong H

    2014-05-06

    A method for monitoring a hydrocarbon-selective catalytic reactor device of an exhaust aftertreatment system of an internal combustion engine operating lean of stoichiometry includes injecting a reductant into an exhaust gas feedstream upstream of the hydrocarbon-selective catalytic reactor device at a predetermined mass flowrate of the reductant, and determining a space velocity associated with a predetermined forward portion of the hydrocarbon-selective catalytic reactor device. When the space velocity exceeds a predetermined threshold space velocity, a temperature differential across the predetermined forward portion of the hydrocarbon-selective catalytic reactor device is determined, and a threshold temperature as a function of the space velocity and the mass flowrate of the reductant is determined. If the temperature differential across the predetermined forward portion of the hydrocarbon-selective catalytic reactor device is below the threshold temperature, operation of the engine is controlled to regenerate the hydrocarbon-selective catalytic reactor device.

  11. Safe and effective error rate monitors for SS7 signaling links

    NASA Astrophysics Data System (ADS)

    Schmidt, Douglas C.

    1994-04-01

    This paper describes SS7 error monitor characteristics, discusses the existing SUERM (Signal Unit Error Rate Monitor), and develops the recently proposed EIM (Error Interval Monitor) for higher speed SS7 links. An SS7 error monitor is considered safe if it ensures acceptable link quality and is considered effective if it is tolerant to short-term phenomena. Formal criteria for safe and effective error monitors are formulated in this paper. This paper develops models of changeover transients, the unstable component of queue length resulting from errors. These models take the form of recursive digital filters. Time is divided into sequential intervals. The filter's input is the number of errors which have occurred in each interval. The output is the corresponding change in transmit queue length. Engineered EIMs are constructed by comparing an estimated changeover transient with a threshold T using a transient model modified to enforce SS7 standards. When this estimate exceeds T, a changeover will be initiated and the link will be removed from service. EIMs can be differentiated from SUERMs by the fact that EIMs monitor errors over an interval while SUERMs count errored messages. EIMs offer several advantages over SUERMs, including the fact that they are safe and effective, impose uniform standards of link quality, are easily implemented, and make minimal use of real-time resources.
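    The interval-based filtering idea can be illustrated with a first-order recursive filter: each interval's error count feeds a leaky integrator standing in for the changeover-transient estimate, and a changeover is initiated when the estimate exceeds T. The gain, decay and threshold values here are illustrative assumptions, not the engineered EIM coefficients.

    ```python
    def eim_sketch(errors_per_interval, gain=0.5, decay=0.9, threshold=4.0):
        """First-order recursive-filter sketch of an Error Interval Monitor:
        returns the index of the interval at which a changeover would be
        initiated, or None if the estimate never exceeds the threshold.
        gain/decay/threshold are assumed values, not engineered coefficients."""
        estimate = 0.0
        for k, errors in enumerate(errors_per_interval):
            estimate = decay * estimate + gain * errors   # recursive filter update
            if estimate > threshold:
                return k                                  # remove link from service
        return None
    ```

    A burst of errors drives the estimate over the threshold within an interval or two, while an error-free link never triggers a changeover, illustrating tolerance to short-term phenomena combined with responsiveness to sustained degradation.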

  12. Describing temporal variation in reticuloruminal pH using continuous monitoring data.

    PubMed

    Denwood, M J; Kleen, J L; Jensen, D B; Jonsson, N N

    2018-01-01

    Reticuloruminal pH has been linked to subclinical disease in dairy cattle, leading to considerable interest in identifying pH observations below a given threshold. The relatively recent availability of continuously monitored data from pH boluses gives new opportunities for characterizing the normal patterns of pH over time and distinguishing these from abnormal patterns using more sensitive and specific methods than simple thresholds. We fitted a series of statistical models to continuously monitored data from 93 animals on 13 farms to characterize normal variation within and between animals. We used a subset of the data to relate deviations from the normal pattern to the productivity of 24 dairy cows from a single herd. Our findings show substantial variation in pH characteristics between animals, although animals within the same farm tended to show more consistent patterns. There was strong evidence for a predictable diurnal variation in all animals, and up to 70% of the observed variation in pH could be explained using a simple statistical model. For the 24 animals with available production information, there was also a strong association between productivity (as measured by both milk yield and dry matter intake) and deviations from the expected diurnal pH pattern 2 d before the productivity observation. In contrast, there was no association between productivity and the occurrence of observations below a threshold pH. We conclude that statistical models can be used to account for a substantial proportion of the observed variability in pH and that future work with continuously monitored pH data should focus on deviations from a predictable pattern rather than the frequency of observations below an arbitrary pH threshold. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
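    The paper's central point — score deviations from a predictable diurnal pattern rather than counting sub-threshold readings — can be illustrated by comparing observations against a fixed sinusoidal diurnal model. The mean, amplitude and phase below are assumed illustrative values, not estimates from the authors' fitted models.

    ```python
    import math

    def diurnal_ph_residuals(times_h, ph_values, mean_ph=6.4, amplitude=0.3, phase_h=4.0):
        """Residuals of observed reticuloruminal pH against an assumed sinusoidal
        diurnal pattern with a 24 h period; large residuals flag deviations from
        the expected pattern (all model parameters here are illustrative)."""
        residuals = []
        for t, ph in zip(times_h, ph_values):
            expected = mean_ph + amplitude * math.sin(2 * math.pi * (t - phase_h) / 24.0)
            residuals.append(ph - expected)
        return residuals
    ```

    Observations lying on the assumed diurnal curve yield residuals near zero; monitoring would then focus on the size and persistence of nonzero residuals rather than on threshold crossings.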

  13. Detection capability of the IMS seismic network based on ambient seismic noise measurements

    NASA Astrophysics Data System (ADS)

    Gaebler, Peter J.; Ceranna, Lars

    2016-04-01

    All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosions at any time, by anyone and everywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit of a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold can be observed between the northern and southern hemispheres. Including the stations of the auxiliary seismic IMS network results in a slight improvement of global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability. In particular, this improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and the percentage of operational IMS stations.

  14. Evaluation of Bayesian estimation of a hidden continuous-time Markov chain model with application to threshold violation in water-quality indicators

    USGS Publications Warehouse

    Deviney, Frank A.; Rice, Karen; Brown, Donald E.

    2012-01-01

    Natural resource managers require information concerning the frequency, duration, and long-term probability of occurrence of water-quality indicator (WQI) violations of defined thresholds. The timing of these threshold crossings often is hidden from the observer, who is restricted to relatively infrequent observations. Here, a model for the hidden process is linked with a model for the observations, and the parameters describing duration, return period, and long-term probability of occurrence are estimated using Bayesian methods. A simulation experiment is performed to evaluate the approach under scenarios based on the equivalent of a total monitoring period of 5-30 years and an observation frequency of 1-50 observations per year. Given a constant threshold crossing rate, accuracy and precision of parameter estimates increased with longer total monitoring period and more-frequent observations. Given fixed monitoring period and observation frequency, accuracy and precision of parameter estimates increased with longer times between threshold crossings. For most cases where the long-term probability of being in violation is greater than 0.10, it was determined that at least 600 observations are needed to achieve precise estimates. An application of the approach is presented using 22 years of quasi-weekly observations of acid-neutralizing capacity from Deep Run, a stream in Shenandoah National Park, Virginia. The time series also was sub-sampled to simulate monthly and semi-monthly sampling protocols. Estimates of the long-term probability of violation were unbiased regardless of sampling frequency; however, the expected duration and return period were over-estimated using the sub-sampled time series with respect to the full quasi-weekly time series.

  15. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, the computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.

  16. Global Monitoring of the CTBT: Progress, Capabilities and Plans (Invited)

    NASA Astrophysics Data System (ADS)

    Zerbo, L.

    2013-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), established in 1996, is tasked with building up the verification regime of the CTBT. The regime includes a global system for monitoring the earth, the oceans and the atmosphere for nuclear tests, and an on-site inspection (OSI) capability. More than 80% of the 337 facilities of the International Monitoring System (IMS) have been installed and are sending data to the International Data Centre (IDC) in Vienna, Austria for processing. These IMS data along with IDC processed and reviewed products are available to all States that have signed the Treaty. Concurrent with the build-up of the global monitoring networks, near-field geophysical methods are being developed and tested for OSIs. The monitoring system is currently operating in a provisional mode, as the Treaty has not yet entered into force. Progress in installing and operating the IMS and the IDC and in building up an OSI capability will be described. The capabilities of the monitoring networks have progressively improved as stations are added to the IMS and IDC processing techniques refined. Detection thresholds for seismic, hydroacoustic, infrasound and radionuclide events have been measured and in general are equal to or lower than the predictions used during the Treaty negotiations. The measurements have led to improved models and tools that allow more accurate predictions of future capabilities and network performance under any configuration. Unplanned tests of the monitoring network occurred when the DPRK announced nuclear tests in 2006, 2009, and 2013. All three tests were well above the detection threshold and easily detected and located by the seismic monitoring network. In addition, noble gas consistent with the nuclear tests in 2006 and 2013 (according to atmospheric transport models) was detected by stations in the network. 
On-site inspections of these tests were not conducted as the Treaty has not entered into force. In order to achieve a credible and trustworthy verification system, increased focus is being put on the development of OSI operational capabilities while operating and sustaining the existing monitoring system, increasing data availability and quality, and completing the remaining facilities of the IMS. Furthermore, as mandated by the Treaty, the CTBTO also seeks to continuously improve its technologies and methods through interaction with the scientific community. Workshops and scientific conferences such as the CTBT Science and Technology Conference series provide venues for exchanging ideas, and mechanisms have been developed for sharing IMS data with researchers who are developing and testing new and innovative methods pertinent to the verification regime. While progress is steady on building up the verification regime, there is also progress toward the Treaty's entry into force, which requires the signatures and ratifications of the DPRK, India and Pakistan, as well as the ratifications of China, Egypt, Iran, Israel and the United States. Thirty-six of the other States whose signatures and ratifications are needed for entry into force have already signed and ratified.

  17. Normal Perceptual Sensitivity Arising From Weakly Reflective Cone Photoreceptors

    PubMed Central

    Bruce, Kady S.; Harmening, Wolf M.; Langston, Bradley R.; Tuten, William S.; Roorda, Austin; Sincich, Lawrence C.

    2015-01-01

    Purpose To determine the light sensitivity of poorly reflective cones observed in retinas of normal subjects, and to establish a relationship between cone reflectivity and perceptual threshold. Methods Five subjects (four male, one female) with normal vision were imaged longitudinally (7–26 imaging sessions, representing 82–896 days) using adaptive optics scanning laser ophthalmoscopy (AOSLO) to monitor cone reflectance. Ten cones with unusually low reflectivity, as well as 10 normally reflective cones serving as controls, were targeted for perceptual testing. Cone-sized stimuli were delivered to the targeted cones and luminance increment thresholds were quantified. Thresholds were measured three to five times per session for each cone in the 10 pairs, all located 2.2 to 3.3° from the center of gaze. Results Compared with other cones in the same retinal area, three of 10 monitored dark cones were persistently poorly reflective, while seven occasionally manifested normal reflectance. Tested psychophysically, all 10 dark cones had thresholds comparable with those from normally reflecting cones measured concurrently (P = 0.49). The variation observed in dark cone thresholds also matched the wide variation seen in a large population (n = 56 cone pairs, six subjects) of normal cones; in the latter, no correlation was found between cone reflectivity and threshold (P = 0.0502). Conclusions Low cone reflectance cannot be used as a reliable indicator of cone sensitivity to light in normal retinas. To improve assessment of early retinal pathology, other diagnostic criteria should be employed along with imaging and cone-based microperimetry. PMID:26193919

  18. [Determination of the anaerobic threshold by the rate of ventilation and cardio interval variability].

    PubMed

    Seluianov, V N; Kalinin, E M; Pak, G D; Maevskaia, V I; Konrad, A H

    2011-01-01

    The aim of this work was to develop methods for determining the anaerobic threshold from the rate of ventilation and from cardio-interval variability during tests with stepwise load increases on a cycle ergometer and a treadmill. In the first phase, a method was developed for determining the anaerobic threshold from lung ventilation. Forty-nine highly skilled skiers took part in the experiment; they performed a treadmill ski-walking test with poles, with the slope increasing by one degree every minute from 0 to 25 degrees. In the second phase, a method was developed for determining the anaerobic threshold from the dynamics of cardio-interval variability during the test. The study included 86 athletes of different sport specialties, who pedaled on a "Monarch" cycle ergometer. Initial power output was 25 W, increasing by 25 W every 2 min at a steady cadence of 75 rev/min. Pulmonary ventilation and oxygen and carbon dioxide content were measured with a COSMED K4 gas analyzer. Blood was sampled from the ear lobe or finger, and blood lactate concentration was determined with an "Akusport" instrument. RR intervals were recorded with a Polar s810i heart rate monitor. As a result, it was shown that the graphical determination of the onset of the ventilatory anaerobic threshold (VAnT) coincides with a blood lactate accumulation of 3.8 +/- 0.1 mmol/L in the treadmill test and 4.1 +/- 0.6 mmol/L on the cycle ergometer. A regression equation relating oxygen consumption at VAnT to the dispersion of cardio intervals (SD1) was derived: VO2AnT = 0.35 + 0.01·SD1·W + 0.0016·SD1·HR + 0.106·SD1, L/min (R = 0.98, estimation error 0.26 L/min, p < 0.001), where W is power (W), HR is heart rate (beats/min), and SD1 is the cardio-interval dispersion (ms) at the moment the threshold is registered.

  19. Development of a landslide EWS based on rainfall thresholds for Tuscany Region, Italy

    NASA Astrophysics Data System (ADS)

    Rosi, Ascanio; Segoni, Samuele; Battistini, Alessandro; Rossi, Guglielmo; Catani, Filippo; Casagli, Nicola

    2017-04-01

    We present the set-up of a landslide EWS based on rainfall thresholds for the Tuscany region (central Italy), which shows a heterogeneous distribution of reliefs and precipitation. The work started with the definition of a single set of thresholds for the whole region, but this proved unsuitable for EWS purposes because of the heterogeneity of the Tuscan territory and because the analyses, affected by a high degree of subjectivity, were not repeatable. To overcome this problem, we implemented a software tool capable of defining rainfall thresholds objectively. This software, named MaCumBA, is largely automated and can analyze a high number of rainfall events in a short time to define several threshold parameters, such as the intensity (I) and duration (D) of the rainfall event, the no-rain gap (NRG: how many hours without rain are needed to consider two events as separate) and the equation describing the threshold. The possibility of quickly performing several analyses led to the decision to divide the territory into 25 homogeneous areas (named alert zones, AZ), so that a single threshold could be defined for each AZ. Two independent datasets of joint rainfall-landslide occurrences were used to define the thresholds: a calibration dataset (2000-2007) and a validation dataset (2008-2009). Once the thresholds were defined, a WebGIS-based EWS was implemented. The system supports both monitoring of real-time data and forecasting at lead times up to 48 h; forecast data are taken from LAMI (Limited Area Model Italy) rainfall forecasts. The EWS operates on the basis of the threshold parameters defined by MaCumBA (I, D, NRG).
An important feature of the warning system is that the visualization of the thresholds in the WebGIS interface may vary in time depending on when the starting time of the rainfall event is set. The starting time is therefore treated as a variable by the system: whenever new rainfall data become available, a recursive algorithm identifies the starting time for which the rainfall path is closest to or exceeds the threshold. This is considered the most hazardous condition, and it is the one displayed by the WebGIS interface. A further issue that emerged after the EWS implementation was the time-limited validity of the thresholds. Although rainfall thresholds can give good results, their validity is limited in time by several factors, such as changes in pluviometric regime, land use and urban development. Furthermore, the availability of new landslide data can lead to more robust results. For these reasons, some of the thresholds defined for the Tuscany region were updated using new landslide data (from 2010 to March 2013). A comparison between updated and former thresholds clearly shows that the performance of an EWS can be enhanced if the thresholds are constantly updated.
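The start-time scan described above can be sketched as follows. This is not the MaCumBA code; the intensity-duration threshold parameters and the rainfall series are hypothetical, and the real system works on intensity-duration curves per alert zone.

```python
# Sketch of the start-time scan described above (not the MaCumBA code).
# Threshold parameters a, b and the rainfall series are hypothetical.

def threshold_intensity(duration_h, a=7.2, b=-0.5):
    """Intensity-duration threshold I = a * D^b (mm/h); a, b are assumed values."""
    return a * duration_h ** b

def most_hazardous_start(hourly_rain_mm):
    """Scan candidate start times; return (start_index, criticality), where
    criticality is the maximum ratio of observed mean intensity to the
    threshold intensity over all windows beginning at that start time."""
    n = len(hourly_rain_mm)
    best = (0, 0.0)
    for start in range(n):
        total = 0.0
        for end in range(start, n):
            total += hourly_rain_mm[end]
            duration = end - start + 1
            ratio = (total / duration) / threshold_intensity(duration)
            if ratio > best[1]:
                best = (start, ratio)
    return best

start, crit = most_hazardous_start([0.0, 4.0, 6.0, 5.0, 0.0])
print(start, crit >= 1.0)  # criticality >= 1 means the threshold is exceeded
```

Re-running this scan each time new rain-gauge data arrive reproduces the behavior in which the displayed starting time shifts to whichever window is currently most critical.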

  20. Evaluation of Earthquake Detection Performance in Terms of Quality and Speed in SEISCOMP3 Using New Modules Qceval, Npeval and Sceval

    NASA Astrophysics Data System (ADS)

    Roessler, D.; Weber, B.; Ellguth, E.; Spazier, J.

    2017-12-01

The geometry of seismic monitoring networks, site conditions and data availability, as well as monitoring targets and strategies, typically impose trade-offs between data quality, earthquake detection sensitivity, false detections and alert times. Network detection capabilities typically change as the seismic noise level is altered by human activity or by varying weather and sea conditions. To give helpful information to operators and maintenance coordinators, gempa developed a range of tools to evaluate earthquake detection and network performance, including qceval, npeval and sceval. qceval is a module that analyzes waveform quality parameters in real time and deactivates and reactivates data streams for automatic processing based on waveform quality thresholds. For example, thresholds can be defined for latency, delay, timing quality, spike and gap counts, and rms. Because changes in the automatic processing directly influence detection quality and speed, another tool, npeval, was designed to calculate in real time the expected time needed to detect and locate earthquakes by evaluating the effective network geometry. The effective network geometry is derived from the configuration of stations participating in the detection. The detection times are shown as an additional layer on the map and updated in real time as soon as the effective network geometry changes. A third new tool, sceval, is an automatic module that classifies located seismic events (Origins) in real time. sceval evaluates the spatial distribution of the stations contributing to an Origin. It confirms or rejects the status of Origins, adds comments or leaves the Origin unclassified. The comments are passed to an additional sceval plug-in in which the end user can customize event types. This unambiguous identification of real and fake events in earthquake catalogues makes it possible to lower network detection thresholds.
In real-time monitoring situations, operators can limit the processing to events with unclassified Origins, reducing their workload. Classified Origins can be treated specifically by other procedures. These modules have been calibrated and fully tested on several complex seismic monitoring networks in Indonesia and northern Chile.
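The qceval-style gating logic can be illustrated with a minimal sketch. The metric names and limit values below are assumptions for illustration, not gempa's actual configuration keys or API.

```python
# Minimal sketch of qceval-style stream gating (metric names and limits
# are hypothetical, not gempa's configuration keys).
QC_LIMITS = {"latency_s": 60.0, "gaps_per_h": 5, "rms": 1e4}

def stream_enabled(metrics, limits=QC_LIMITS):
    """Keep a data stream in automatic processing only if every quality
    metric stays within its configured threshold."""
    return all(metrics[name] <= limit for name, limit in limits.items())

good = {"latency_s": 2.5, "gaps_per_h": 0, "rms": 310.0}
noisy = {"latency_s": 120.0, "gaps_per_h": 1, "rms": 290.0}
print(stream_enabled(good), stream_enabled(noisy))  # True False
```

A real implementation would re-evaluate these metrics continuously and reactivate a stream as soon as all metrics fall back within limits, which is the behavior the abstract describes.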

  1. Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices

    DOEpatents

    Chassin, David P [Pasco, WA; Donnelly, Matthew K [Kennewick, WA; Dagle, Jeffery E [Richland, WA

    2011-12-06

    Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices are described. In one aspect, an electrical power distribution control method includes providing electrical energy from an electrical power distribution system, applying the electrical energy to a load, providing a plurality of different values for a threshold at a plurality of moments in time and corresponding to an electrical characteristic of the electrical energy, and adjusting an amount of the electrical energy applied to the load responsive to an electrical characteristic of the electrical energy triggering one of the values of the threshold at the respective moment in time.
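The claimed method can be sketched in a few lines: the threshold on an electrical characteristic takes different values at different moments in time, and the energy applied to the load is adjusted when the measured characteristic triggers the value in force at that moment. The schedule, the use of frequency deviation as the characteristic, and the 50% shed fraction below are all hypothetical illustrations, not values from the patent.

```python
# Illustrative sketch of the claimed method (numbers are hypothetical):
# the threshold varies over time, and load is shed when the measured
# electrical characteristic crosses the threshold in force at that moment.

def threshold_at(hour):
    """Time-varying threshold (Hz) on under-frequency deviation; the
    tighter evening-peak value is an assumed schedule."""
    return 0.05 if 17 <= hour < 21 else 0.10

def load_fraction(freq_deviation_hz, hour):
    """Reduce the energy applied to the load when the deviation triggers
    the threshold value that applies at this moment in time."""
    return 0.5 if freq_deviation_hz > threshold_at(hour) else 1.0

print(load_fraction(0.07, 18), load_fraction(0.07, 3))  # 0.5 1.0
```

The same 0.07 Hz deviation sheds load during the evening peak but not at night, which is the essence of providing "a plurality of different values for a threshold at a plurality of moments in time."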

  2. Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices

    DOEpatents

    Chassin, David P.; Donnelly, Matthew K.; Dagle, Jeffery E.

    2006-12-12

    Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices are described. In one aspect, an electrical power distribution control method includes providing electrical energy from an electrical power distribution system, applying the electrical energy to a load, providing a plurality of different values for a threshold at a plurality of moments in time and corresponding to an electrical characteristic of the electrical energy, and adjusting an amount of the electrical energy applied to the load responsive to an electrical characteristic of the electrical energy triggering one of the values of the threshold at the respective moment in time.

  3. Triangle Area Water Supply Monitoring Project, North Carolina—Summary of monitoring activities, quality assurance, and data, October 2013–September 2015

    USGS Publications Warehouse

    Pfeifle, C.A.; Cain, J.L.; Rasmussen, R.B.

    2017-09-27

Surface-water supplies are important sources of drinking water for residents in the Triangle area of North Carolina, which is located within the upper Cape Fear and Neuse River Basins. Since 1988, the U.S. Geological Survey and a consortium of local governments have tracked water-quality conditions and trends in several of the area’s water-supply lakes and streams. This report summarizes data collected through this cooperative effort, known as the Triangle Area Water Supply Monitoring Project, during October 2013 through September 2014 (water year 2014) and October 2014 through September 2015 (water year 2015). Major findings for this period include: More than 5,500 individual measurements of water quality were made at a total of 15 sites—4 in the Neuse River Basin and 11 in the Cape Fear River Basin. Thirty water-quality properties or constituents were measured; State water-quality thresholds exist for 11 of these. All observations met State water-quality thresholds for temperature, hardness, chloride, fluoride, sulfate, and nitrate plus nitrite. North Carolina water-quality thresholds were exceeded one or more times for dissolved oxygen, dissolved-oxygen percent saturation, pH, turbidity, and chlorophyll a.

  4. Subclavian vein pacing and venous pressure waveform measurement for phrenic nerve monitoring during cryoballoon ablation of atrial fibrillation.

    PubMed

    Ghosh, Justin; Singarayar, Suresh; Kabunga, Peter; McGuire, Mark A

    2015-06-01

The phrenic nerves may be damaged during catheter ablation of atrial fibrillation. Phrenic nerve function is routinely monitored during ablation by stimulating the right phrenic nerve from a site in the superior vena cava (SVC) and manually assessing the strength of diaphragmatic contraction. However, the optimal stimulation site, the method of assessing diaphragmatic contraction, and techniques for monitoring the left phrenic nerve have not been established. We assessed novel techniques to monitor phrenic nerve function during cryoablation procedures. Pacing threshold and stability of phrenic nerve capture were assessed when pacing from the SVC and the left and right subclavian veins. Femoral venous pressure waveforms were used to monitor the strength of diaphragmatic contraction. Stable capture of the left phrenic nerve by stimulation in the left subclavian vein was achieved in 96 of 100 patients, with a median capture threshold of 2.5 mA [inter-quartile range (IQR) 1.4-5.0 mA]. Stimulation of the right phrenic nerve from the subclavian vein was superior to stimulation from the SVC, with lower pacing thresholds (1.8 mA IQR 1.4-3.3 vs. 6.0 mA IQR 3.4-8.0, P < 0.001). Venous pressure waveforms were obtained in all patients, and attenuation of the waveform was always observed prior to the onset of phrenic nerve palsy. The left phrenic nerve can be stimulated from the left subclavian vein. The subclavian veins are the optimal sites for phrenic nerve stimulation. Monitoring the femoral venous pressure waveform is a novel technique for detecting impending phrenic nerve damage.

  5. Auditory brainstem evoked responses and temperature monitoring during pediatric cardiopulmonary bypass.

    PubMed

    Rodriguez, R A; Edmonds, H L; Auden, S M; Austin, E H

    1999-09-01

To examine the effects of temperature on auditory brainstem responses (ABRs) in infants during hypothermic cardiopulmonary bypass for total circulatory arrest (TCA), the relationship between ABRs (as a surrogate measure of core-brain temperature) and body temperature as measured at several temperature monitoring sites was determined. In a prospective, observational study, ABRs were recorded non-invasively at normothermia and at every 1 or 2 degrees C change in ear-canal temperature during cooling and rewarming in 15 infants (ages 2 days to 14 months) who required TCA. The ABR latencies and amplitudes and the lowest temperatures at which an ABR could be identified (the threshold) were measured during both cooling and rewarming. Temperatures from four standard temperature monitoring sites were simultaneously recorded. The latencies of ABRs increased and amplitudes decreased with cooling (P < 0.01), but rewarming reversed these effects. The ABR threshold temperatures at the four monitoring sites (ear canal, nasopharynx, esophagus and bladder) were, respectively, 23 +/- 2.2 degrees C, 20.8 +/- 1.7 degrees C, 14.6 +/- 3.4 degrees C, and 21.5 +/- 3.8 degrees C during cooling and 21.8 +/- 1.6 degrees C, 22.4 +/- 2.0 degrees C, 27.6 +/- 3.6 degrees C, and 23.0 +/- 2.4 degrees C during rewarming. The rewarming latencies were shorter and Q10 latencies smaller than the corresponding cooling values (P < 0.01). Esophageal and bladder sites were more susceptible to temperature variations than the ear canal and nasopharynx. No temperature site reliably predicted an electrophysiological threshold. The faster latency recovery during rewarming suggests that body temperature monitoring underestimates the effects of rewarming on the core brain. ABRs may be helpful for monitoring the effects of cooling and rewarming on the core brain during pediatric cardiopulmonary bypass.

  6. A quantitative method for optimized placement of continuous air monitors.

    PubMed

    Whicker, Jeffrey J; Rodgers, John C; Moxley, John S

    2003-11-01

    Alarming continuous air monitors (CAMs) are a critical component for worker protection in facilities that handle large amounts of hazardous materials. In nuclear facilities, continuous air monitors alarm when levels of airborne radioactive materials exceed alarm thresholds, thus prompting workers to exit the room to reduce inhalation exposures. To maintain a high level of worker protection, continuous air monitors are required to detect radioactive aerosol clouds quickly and with good sensitivity. This requires that there are sufficient numbers of continuous air monitors in a room and that they are well positioned. Yet there are no published methodologies to quantitatively determine the optimal number and placement of continuous air monitors in a room. The goal of this study was to develop and test an approach to quantitatively determine optimal number and placement of continuous air monitors in a room. The method we have developed uses tracer aerosol releases (to simulate accidental releases) and the measurement of the temporal and spatial aspects of the dispersion of the tracer aerosol through the room. The aerosol dispersion data is then analyzed to optimize continuous air monitor utilization based on simulated worker exposure. This method was tested in a room within a Department of Energy operated plutonium facility at the Savannah River Site in South Carolina, U.S. Results from this study show that the value of quantitative airflow and aerosol dispersion studies is significant and that worker protection can be significantly improved while balancing the costs associated with CAM programs.

  7. Early warning, warning or alarm systems for natural hazards? A generic classification.

    NASA Astrophysics Data System (ADS)

    Sättele, Martina; Bründl, Michael; Straub, Daniel

    2013-04-01

Early warning, warning and alarm systems have gained popularity in recent years as cost-efficient measures against dangerous natural hazard processes such as floods, storms, rock and snow avalanches, debris flows, rock and ice falls, landslides, flash floods, glacier lake outburst floods, forest fires and even earthquakes. These systems can generate information before an event causes loss of property and life. In this way, they mainly mitigate the overall risk by reducing the presence probability of endangered objects. Such systems are typically prototypes tailored to specific project needs. Despite their importance, there is no recognised system classification. This contribution classifies warning and alarm systems into three classes: i) threshold systems, ii) expert systems and iii) model-based expert systems. The result is a generic classification that takes into account the characteristics of the natural hazard process itself and the related monitoring possibilities. The choice of the monitoring parameters directly determines the system's lead time. The classification of 52 active systems moreover revealed typical characteristics for each system class. i) Threshold systems monitor dynamic process parameters of ongoing events (e.g. the water level of a debris flow) and incorporate minor lead times. They have local geographical coverage, and a predefined threshold determines whether an alarm is automatically activated to warn endangered objects, authorities and system operators. ii) Expert systems monitor direct changes in the variable disposition (e.g. crack opening before a rock avalanche) or trigger events (e.g. heavy rain) at a local scale before the main event starts and thus offer extended lead times. The final alarm decision incorporates human, model and organisational factors. iii) Model-based expert systems monitor indirect changes in the variable disposition (e.g. snow temperature, height or solar radiation that influence the occurrence probability of snow avalanches) or trigger events (e.g. heavy snowfall) to predict spontaneous hazard events in advance. They encompass regional or national measuring networks and satisfy additional demands such as the standardisation of the measuring stations. The developed classification, and the characteristics revealed for each class, provide valuable input for quantifying the reliability of warning and alarm systems. Importantly, this will make it easier to compare them with well-established standard mitigation measures such as dams, nets and galleries within an integrated risk management approach.

  8. SHARD - a SeisComP3 module for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Weber, B.; Becker, J.; Ellguth, E.; Henneberger, R.; Herrnkind, S.; Roessler, D.

    2016-12-01

Monitoring building and structure response to strong earthquake ground shaking or human-induced vibrations in real time forms the backbone of modern structural health monitoring (SHM). Continuous data transmission, processing and analysis drastically reduces the time decision makers need to plan an appropriate response to possible damage to high-priority buildings and structures. SHARD is a web-browser-based module using the SeisComp3 framework to monitor the structural health of buildings and other structures by calculating standard engineering seismology parameters and checking their exceedance in real time. Thresholds can be defined, e.g. compliant with national building codes (IBC2000, DIN4149 or EC8), for PGA/PGV/PGD, response spectra and drift ratios. In case thresholds are exceeded, automatic or operator-driven reports are generated and sent to the decision makers. SHARD also determines waveform quality in terms of data delay and variance to report sensor status. SHARD is well suited for civil protection agencies that must simultaneously monitor multiple city-wide critical infrastructure such as hospitals, schools and governmental buildings, and structures such as bridges, dams and power substations.

  9. Tetramethylammonium for in vivo marking of the cross-sectional area of the scala media in the guinea pig cochlea.

    PubMed

    Salt, A N; DeMott, J

    1992-01-01

    A physiologic technique was developed to measure endolymphatic cross-sectional area in vivo using tetramethylammonium (TMA) as a volume marker. The technique was evaluated in guinea pigs as an animal model. In the method, the cochlea was exposed surgically and TMA was injected into endolymph of the second turn at a constant rate by iontophoresis. The concentration of TMA was monitored during and after the injection using ion-selective electrodes. Cross-section estimates derived from the TMA concentration measurements were compared in normal animals and animals in which endolymphatic hydrops had been induced by ablation of the endolymphatic duct and sac 8 weeks earlier. The method demonstrated a mean increase in cross-sectional area of 258% in the hydropic group. Individually measured area values were compared with action potential threshold shifts and the magnitude of the endocochlear potential (EP). Hydropic animals typically showed an increase in threshold to 2 kHz stimuli and a decrease in EP. However, the degree of threshold shift or EP decrease did not correlate well with the degree of hydrops present.

  10. Efficacy of intrathoracic impedance and remote monitoring in patients with an implantable device after the 2011 great East Japan earthquake.

    PubMed

    Suzuki, Hitoshi; Yamada, Shinya; Kamiyama, Yoshiyuki; Takeishi, Yasuchika

    2014-01-01

Several studies have revealed that stress after catastrophic disasters can trigger cardiovascular events; however, little is known about its association with the occurrence of heart failure in past earthquakes. The objective of the present study was to determine whether the Great East Japan Earthquake on March 11, 2011, increased the incidence of worsening heart failure in chronic heart failure (CHF) patients with implantable devices. Furthermore, we examined whether intrathoracic impedance measured via remote monitoring was effective for the management of CHF. We enrolled 44 CHF patients (32 males, mean age 63 ± 12 years) with implantable devices that can check intrathoracic impedance using remote monitoring. We defined worsening heart failure as accumulated impedance under the reference impedance exceeding 60 ohm-days (fluid index threshold), and compared the incidence of worsening heart failure and arrhythmic events in the 30 days before and after March 11. Within the 30 days after March 11, 10 patients exceeded the threshold, compared with only 2 patients in the preceding 30 days (P < 0.05). Although 9 of the 10 patients with threshold crossings were using remote monitoring and were not hospitalized, one patient without the system was hospitalized due to acute decompensated heart failure. In contrast, arrhythmic events did not change before and after March 11. Our results suggest that earthquake-induced stress causes an increased risk of worsening heart failure without changes in arrhythmia. Furthermore, intrathoracic impedance measured via remote monitoring may be a useful tool for the management of CHF in catastrophic disasters.
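The fluid index used as the 60 ohm-day threshold can be sketched as an accumulation of the daily impedance deficit. This is a simplification for illustration: real devices update the reference impedance adaptively, whereas here it is held fixed.

```python
# Sketch of the fluid index described above (accumulated impedance deficit
# in ohm-days). Simplification: the reference impedance is held fixed,
# whereas implanted devices update it adaptively.

def fluid_index(daily_impedance, reference, alarm_threshold=60.0):
    """Accumulate (reference - measured) impedance while the measured
    intrathoracic impedance stays below the reference; reset once it
    recovers. Returns True if the accumulated deficit exceeds threshold."""
    accumulated = 0.0
    for z in daily_impedance:
        deficit = reference - z
        accumulated = accumulated + deficit if deficit > 0 else 0.0
        if accumulated > alarm_threshold:
            return True
    return False

# 65-ohm reference; a sustained drop to ~55 ohms accumulates ~10 ohm-days/day
print(fluid_index([64, 55, 55, 55, 55, 55, 55, 55], 65))  # True
```

Because the index accumulates over days, it flags sustained fluid buildup (falling impedance) rather than single noisy measurements, which is why it suits the slow worsening of heart failure described in the study.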

  11. Hierarchical population monitoring of greater sage-grouse (Centrocercus urophasianus) in Nevada and California—Identifying populations for management at the appropriate spatial scale

    USGS Publications Warehouse

    Coates, Peter S.; Prochazka, Brian G.; Ricca, Mark A.; Wann, Gregory T.; Aldridge, Cameron L.; Hanser, Steven E.; Doherty, Kevin E.; O'Donnell, Michael S.; Edmunds, David R.; Espinosa, Shawn P.

    2017-08-10

    Population ecologists have long recognized the importance of ecological scale in understanding processes that guide observed demographic patterns for wildlife species. However, directly incorporating spatial and temporal scale into monitoring strategies that detect whether trajectories are driven by local or regional factors is challenging and rarely implemented. Identifying the appropriate scale is critical to the development of management actions that can attenuate or reverse population declines. We describe a novel example of a monitoring framework for estimating annual rates of population change for greater sage-grouse (Centrocercus urophasianus) within a hierarchical and spatially nested structure. Specifically, we conducted Bayesian analyses on a 17-year dataset (2000–2016) of lek counts in Nevada and northeastern California to estimate annual rates of population change, and compared trends across nested spatial scales. We identified leks and larger scale populations in immediate need of management, based on the occurrence of two criteria: (1) crossing of a destabilizing threshold designed to identify significant rates of population decline at a particular nested scale; and (2) crossing of decoupling thresholds designed to identify rates of population decline at smaller scales that decouple from rates of population change at a larger spatial scale. This approach establishes how declines affected by local disturbances can be separated from those operating at larger scales (for example, broad-scale wildfire and region-wide drought). Given the threshold output from our analysis, this adaptive management framework can be implemented readily and annually to facilitate responsive and effective actions for sage-grouse populations in the Great Basin. The rules of the framework can also be modified to identify populations responding positively to management action or demonstrating strong resilience to disturbance. 
Similar hierarchical approaches might be beneficial for other species occupying landscapes with heterogeneous disturbance and climatic regimes.
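The two flagging criteria above can be sketched as a simple rule on rates of population change. This is a hedged illustration: the threshold values are hypothetical, and the actual framework derives rates from a Bayesian analysis of lek counts rather than point estimates.

```python
# Hedged sketch of the two flagging criteria (threshold values are
# hypothetical; the actual analysis is Bayesian, based on lek counts).

def flag_population(local_rate, regional_rate,
                    destabilizing=-0.10, decoupling=-0.05):
    """Flag a nested unit if (1) its rate of change crosses a destabilizing
    threshold, or (2) it declines while decoupling from the larger scale."""
    crosses_destabilizing = local_rate < destabilizing
    decouples = (local_rate - regional_rate) < decoupling and local_rate < 0
    return crosses_destabilizing or decouples

print(flag_population(-0.12, -0.11))  # destabilizing threshold crossed: True
print(flag_population(-0.04, 0.03))   # declining while region grows: True
print(flag_population(-0.02, -0.03))  # tracking the regional trend: False
```

The second criterion is what separates local disturbances (a lek declining while its region is stable or growing) from region-wide drivers such as broad-scale wildfire or drought.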

  12. Chronic Glucose Exposure Systematically Shifts the Oscillatory Threshold of Mouse Islets: Experimental Evidence for an Early Intrinsic Mechanism of Compensation for Hyperglycemia

    PubMed Central

    Glynn, Eric; Thompson, Benjamin; Vadrevu, Suryakiran; Lu, Shusheng; Kennedy, Robert T.; Ha, Joon; Sherman, Arthur

    2016-01-01

Mouse islets exhibit glucose-dependent oscillations in electrical activity, intracellular Ca2+ and insulin secretion. We developed a mathematical model in which a left shift in glucose threshold helps compensate for insulin resistance. To test this experimentally, we exposed isolated mouse islets to varying glucose concentrations overnight and monitored their glucose sensitivity the next day by measuring intracellular Ca2+, electrical activity, and insulin secretion. Glucose sensitivity of all oscillation modes was increased when overnight glucose was greater than 2.8 mM. To determine whether threshold shifts were a direct effect of glucose or involved secreted insulin, the KATP opener diazoxide (Dz) was coapplied with glucose to inhibit insulin secretion. The addition of Dz or the insulin receptor antagonist s961 increased islet glucose sensitivity, whereas the KATP blocker tolbutamide tended to reduce it. This suggests insulin and glucose have opposing actions on the islet glucose threshold. To test the hypothesis that the threshold shifts were due to changes in plasma membrane KATP channels, we measured cell KATP conductance, which was confirmed to be reduced by high glucose pretreatment and further reduced by Dz. Finally, treatment of INS-1 cells with glucose and Dz overnight reduced high affinity sulfonylurea receptor (SUR1) trafficking to the plasma membrane vs glucose alone, consistent with insulin increasing KATP conductance by altering channel number. The results support a role for metabolically regulated KATP channels in the maintenance of glucose homeostasis. PMID:26697721

  13. A Cyber-Physical System for Girder Hoisting Monitoring Based on Smartphones.

    PubMed

    Han, Ruicong; Zhao, Xuefeng; Yu, Yan; Guan, Quanhua; Hu, Weitong; Li, Mingchu

    2016-07-07

Offshore design and construction is much more difficult than land-based design and construction, particularly due to hoisting operations. Real-time monitoring of the orientation and movement of a hoisted structure is thus required for operators' safety. In recent years, the rapid development of the smartphone commercial market has made it possible for everyone to carry a mini personal computer integrated with sensors, an operating system and a communication system, which can act as an effective aid for cyber-physical systems (CPS) research. In this paper, a CPS for hoisting monitoring using smartphones is proposed, consisting of a phone collector, a controller and a server. The system uses smartphones equipped with internal sensors to obtain girder movement information, which is uploaded to a server and then returned to controller users. An alarm is raised on the controller phone once the returned data exceed a threshold. The proposed monitoring system was used to monitor the movement and orientation of a girder in real time during hoisting on a cross-sea bridge. The results show the convenience and feasibility of the proposed system.

  14. Monitoring and Managing Codling Moth Clearly and Precisely

    USDA-ARS?s Scientific Manuscript database

    Studies were conducted in two ‘Comice’ pear orchards treated with sex pheromone in southern Oregon to implement the use of site-specific management practices for codling moth. The density of monitoring traps was increased and insecticide sprays were applied based on moth catch thresholds. Only porti...

  15. An Ethnographic Observational Study to Evaluate and Optimize the Use of Respiratory Acoustic Monitoring in Children Receiving Postoperative Opioid Infusions.

    PubMed

    Görges, Matthias; West, Nicholas C; Christopher, Nancy A; Koch, Jennifer L; Brodie, Sonia M; Lowlaavar, Nasim; Lauder, Gillian R; Ansermino, J Mark

    2016-04-01

Respiratory depression in children receiving postoperative opioid infusions is a significant risk because of the interindividual variability in analgesic requirement. Detection of respiratory depression (or apnea) in these children may be improved with the introduction of automated acoustic respiratory rate (RR) monitoring. However, early detection of adverse events must be balanced against the risk of alarm fatigue. Our objective was to evaluate the use of acoustic RR monitoring in children receiving opioid infusions on a postsurgical ward and to identify the causes of false alarms and optimal alarm thresholds. A video ethnographic study was performed using an observational, mixed-methods approach. After surgery, an acoustic RR sensor was placed on the participant's neck and attached to a Rad87 monitor. The monitor was networked with paging for alarms. Vital signs data and paging notification logs were obtained from the central monitoring system. Webcam videos of the participant, infusion pump, and Rad87 monitor were recorded, stored on a secure server, and subsequently analyzed by 2 research nurses to identify the cause of each alarm, the response, and its effectiveness. Alarms occurring within a 90-second window were grouped into a single alarm response opportunity. Data from 49 patients (30 females) with median age 14 (range, 4.4-18.8) years were analyzed. The 896 bedside vital sign threshold alarms resulted in 160 alarm response opportunities (44 low RR, 74 high RR, and 42 low SpO2). In the 141 periods (88% of the total) for which video was available, 65% of alarms were deemed effective (followed by an alarm-related action within 10 minutes). Nurses were the sole responders in 55% of effective alarms and the patient or parent in 20%. Episodes of desaturation (SpO2 < 90%) were observed in 9 patients: at the time of the SpO2 paging trigger, the RR was >10 bpm in 6 of 9 patients.
Based on all RR samples observed, the default alarm thresholds, to serve as a starting point for each patient, would be a low RR of 6 (>10 years of age) and 10 (4-9 years of age). In this study, the use of RR monitoring did not improve the detection of respiratory depression. An RR threshold, which would have been predictive of desaturations, would have resulted in an unacceptably high false alarm rate. Future research using a combination of variables (e.g., SpO2 and RR), or the measurement of tidal volumes, may be needed to improve patient safety in the postoperative ward.
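The 90-second grouping step used to turn raw alarms into response opportunities can be sketched directly. The window-anchoring rule (each window starts at the first alarm in the group) is an assumption; the paper does not specify whether windows are anchored or rolling.

```python
# Sketch of the 90-second alarm-grouping step described above: alarms
# within a 90 s window count as one response opportunity. Anchoring each
# window at its first alarm is an assumed detail.

def group_alarms(timestamps_s, window_s=90.0):
    """Collapse sorted alarm timestamps into response opportunities: a new
    opportunity starts when an alarm falls outside the current window.
    Returns [window_start, alarm_count] pairs."""
    opportunities = []
    for t in sorted(timestamps_s):
        if not opportunities or t - opportunities[-1][0] > window_s:
            opportunities.append([t, 1])
        else:
            opportunities[-1][1] += 1
    return opportunities

groups = group_alarms([0, 30, 85, 200, 260, 400])
print(len(groups))  # 3 response opportunities from 6 raw alarms
```

Grouping like this is what reduces the 896 raw threshold alarms to 160 response opportunities in the study's analysis.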

  16. No-Impact Threshold Values for NRAP's Reduced Order Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Last, George V.; Murray, Christopher J.; Brown, Christopher F.

    2013-02-01

The purpose of this study was to develop methodologies for establishing baseline datasets and statistical protocols for determining statistically significant changes between background concentrations and predicted concentrations that would be used to represent a contamination plume in the Gen II models being developed by NRAP’s Groundwater Protection team. The initial effort examined selected portions of two aquifer systems: the urban shallow-unconfined aquifer system of the Edwards-Trinity Aquifer System (being used to develop the reduced order model (ROM) for carbonate-rock aquifers) and a portion of the High Plains Aquifer (an unconsolidated and semi-consolidated sand and gravel aquifer, being used to develop the ROM for sandstone aquifers). Threshold values were determined for Cd, Pb, As, pH, and TDS that could be used to identify contamination due to predicted impacts from carbon sequestration storage reservoirs, based on recommendations found in the EPA’s "Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities" (US Environmental Protection Agency 2009). Results from this effort can be used to inform a "no change" scenario with respect to groundwater impacts, rather than the use of an MCL that could be significantly higher than existing concentrations in the aquifer.

  17. Identification of inactivity behavior in smart home.

    PubMed

    Poujaud, J; Noury, N; Lundy, J-E

    2008-01-01

To help elderly people live independently at home, the TIMC-IMAG laboratory developed Health Smart Homes called 'HIS'. These smart homes are equipped with several sensors to monitor the activities of daily living of the patients. Volunteers agreed to be monitored for 2 years in their own flats. For one year, we carried out our survey on one elderly patient. Through this experiment, we gained access to relevant physiological, environmental and activity information. This paper focuses on daily living activity. We introduce an original data-splitting method based on the relationship between the time frame and the location in the flat. We then present two different methods to determine a threshold of critical inactivity, and finally discuss their possible uses.

  18. Multi-parameter vital sign database to assist in alarm optimization for general care units.

    PubMed

    Welch, James; Kanter, Benjamin; Skora, Brooke; McCombie, Scott; Henry, Isaac; McCombie, Devin; Kennedy, Rosemary; Soller, Babs

    2016-12-01

    Continual vital sign assessment on the general care, medical-surgical floor is expected to provide early indication of patient deterioration and increase the effectiveness of rapid response teams. However, there is concern that continual, multi-parameter vital sign monitoring will produce alarm fatigue. The objective of this study was to develop a methodology to help care teams optimize alarm settings. An on-body wireless monitoring system was used to continually assess heart rate, respiratory rate, SpO2, and noninvasive blood pressure in the general ward of ten hospitals between April 1, 2014 and January 19, 2015. These data (94,575 h for 3,430 patients) are contained in a large database, accessible with cloud computing tools. Simulation scenarios assessed the total alarm rate as a function of threshold and annunciation delay (s). The total alarm rate of ten alarms/patient/day predicted from the cloud-hosted database was the same as the total alarm rate for a 10-day evaluation (1,550 h for 36 patients) in an independent hospital. Plots of vital sign distributions in the cloud-hosted database were similar to other large databases published by different authors. The cloud-hosted database can be used to run simulations for various alarm thresholds and annunciation delays to predict the total alarm burden experienced by nursing staff. This methodology might, in the future, be used to help reduce alarm fatigue without sacrificing the ability to continually monitor all vital signs.
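    The simulation idea (total alarm count as a function of threshold and annunciation delay) can be sketched as below; the persistence rule and the SpO2 trace are hypothetical, not the study's actual algorithm:

```python
def count_alarms(samples, low_threshold, delay_s, sample_interval_s=15):
    """Count low-limit alarm annunciations: a reading below low_threshold must
    persist for at least delay_s before one alarm fires, and the episode must
    clear before another alarm can fire (a common annunciation-delay rule)."""
    need = max(1, delay_s // sample_interval_s)
    alarms, run, fired = 0, 0, False
    for value in samples:
        if value < low_threshold:
            run += 1
            if run >= need and not fired:
                alarms += 1
                fired = True
        else:
            run, fired = 0, False
    return alarms

# Hypothetical SpO2 trace sampled every 15 s.
spo2 = [96, 95, 89, 88, 87, 93, 88, 89, 88, 87, 95, 96]
print(count_alarms(spo2, low_threshold=90, delay_s=30))  # -> 2
print(count_alarms(spo2, low_threshold=90, delay_s=60))  # -> 1
```

    Sweeping `low_threshold` and `delay_s` over a grid and summing counts across patients is the kind of computation that yields an alarms/patient/day surface like the one the study reports.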

  19. Quantitative head ultrasound measurements to determine thresholds for preterm neonates requiring interventional therapies following intraventricular hemorrhage

    NASA Astrophysics Data System (ADS)

    Kishimoto, Jessica; Fenster, Aaron; Salehi, Fateme; Romano, Walter; Lee, David S. C.; de Ribaupierre, Sandrine

    2016-04-01

    Dilation of the cerebral ventricles is a common condition in preterm neonates with intraventricular hemorrhage (IVH). This post-hemorrhagic ventricle dilation (PHVD) can lead to lifelong neurological impairment through ischemic injury due to increased intracranial pressure and, without treatment, can lead to death. Clinically, 2D ultrasound (US) images through the fontanelles ('soft spots') of the patients are serially acquired to monitor the progression of the ventricle dilation. These images are used to determine when interventional therapies, such as needle aspiration of the built-up cerebrospinal fluid (CSF) ('ventricle tap', VT), might be indicated for a patient; however, quantitative measurements of the growth of the ventricles are often not performed. There is no consensus on when a neonate with PHVD should have an intervention, and interventions are often performed after the potential for brain damage is already quite high. Previously, we developed and validated a 3D US system to monitor the progression of ventricle volumes (VV) in IVH patients. We will describe the potential utility of quantitative 2D and 3D US to monitor and manage PHVD in neonates. Specifically, we will look to determine image-based measurement thresholds for patients who will require VT, in comparison to patients with PHVD who resolve without intervention. Additionally, since many patients who have an initial VT will require subsequent interventions, we look at the potential for US to determine which PHVD patients will require additional VTs after the initial one has been performed.

  20. Water quality and bed sediment quality in the Albemarle Sound, North Carolina, 2012–14

    USGS Publications Warehouse

    Moorman, Michelle C.; Fitzgerald, Sharon A.; Gurley, Laura N.; Rhoni-Aref, Ahmed; Loftin, Keith A.

    2017-01-23

    The Albemarle Sound region was selected in 2012 as one of two demonstration sites in the Nation to test and improve the design of the National Water Quality Monitoring Council's National Monitoring Network (NMN) for U.S. Coastal Waters and Tributaries. The goal of the NMN for U.S. Coastal Waters and Tributaries is to provide information about the health of our oceans, coastal ecosystems, and inland influences on coastal waters for improved resource management. The NMN is an integrated, multidisciplinary, and multi-organizational program using multiple sources of data and information to augment current monitoring programs. This report presents and summarizes selected water-quality and bed-sediment-quality data collected as part of the demonstration project, which was conducted in two phases. The first phase was an occurrence and distribution study to assess nutrients, metals, pesticides, cyanotoxins, and phytoplankton communities during the summer of 2012 at 34 sites in Albemarle Sound, nearby sounds, and various tributaries. The second phase consisted of monthly sampling over a year (March 2013 through February 2014) to assess seasonality in a more limited set of constituents, including nutrients, cyanotoxins, and phytoplankton communities, at a subset (eight) of the sites sampled in the first phase. During the summer of 2012, few constituent concentrations exceeded published water-quality thresholds; however, elevated levels of chlorophyll a and pH were observed in the northern embayments and in Currituck Sound. Chlorophyll a and metals (copper, iron, and zinc) were detected above a water-quality threshold. The World Health Organization provisional guideline based on cyanobacterial density for high recreational risk was exceeded in approximately 50 percent of water samples collected during the summer of 2012. Cyanobacteria capable of producing toxins were present, but only low levels of cyanotoxins, below human health benchmarks, were detected.
Finally, 12 metals in surficial bed sediments were detected at levels above a published sediment-quality threshold; these metals included chromium, mercury, copper, lead, arsenic, nickel, and cadmium. Sites with several metal concentrations above the respective thresholds had relatively high concentrations of organic carbon or fine sediment (silt plus clay), or both, and were predominantly located in the western and northwestern parts of the Albemarle Sound. Results from the second phase were generally similar to those of the first in that relatively few constituents exceeded a water-quality threshold, both pH and chlorophyll a were detected above the respective water-quality thresholds, and many of these elevated concentrations occurred in the northern embayments and in Currituck Sound. In contrast to the results from phase one, the cyanotoxin microcystin was detected at more than 10 times the water-quality threshold during a phytoplankton bloom on the Chowan River at Mount Gould, North Carolina, in August 2013. This was the only cyanotoxin concentration measured during the entire study that exceeded a respective water-quality threshold. The information presented in this report can be used to improve understanding of water-quality conditions in the Albemarle Sound, particularly when evaluating causal and response variables that are indicators of eutrophication. In particular, this information can be used by State agencies to help develop water-quality criteria for nutrients and to understand factors, like cyanotoxins, that may affect fisheries and recreation in the Albemarle Sound region.

  1. Accuracy of intensity and inclinometer output of three activity monitors for identification of sedentary behavior and light-intensity activity.

    PubMed

    Carr, Lucas J; Mahar, Matthew T

    2012-01-01

    Purpose. To examine the accuracy of the intensity and inclinometer output of three physical activity monitors during various sedentary and light-intensity activities. Methods. Thirty-six participants wore three physical activity monitors (ActiGraph GT1M, ActiGraph GT3X+, and StepWatch) while completing sedentary (lying, sitting watching television, sitting using a computer, and standing still) and light-intensity (walking at 1.0 mph, pedaling at 7.0 mph, pedaling at 15.0 mph) activities under controlled settings. Accuracy for correctly categorizing intensity was assessed for each monitor and threshold. Accuracy of the GT3X+ inclinometer function (GT3X+Incl) for correctly identifying anatomical position was also assessed. Percentage agreement between direct observation and monitor-recorded time spent in sedentary behavior and light-intensity activity was examined. Results. All monitors, using all thresholds, accurately identified over 80% of sedentary behavior and 60% of light-intensity walking time based on intensity output. The StepWatch was the most accurate in detecting pedaling time but was unable to detect pedal workload. The GT3X+Incl accurately identified anatomical position during 70% of all activities but demonstrated limitations in discriminating between activities of differing intensity. Conclusions. Our findings suggest that all three monitors accurately measure most sedentary and light-intensity activities, although the choice of monitor should be based on study-specific needs.

  2. 75 FR 18607 - Mandatory Reporting of Greenhouse Gases: Petroleum and Natural Gas Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-12

    ... any of the following methods: Federal eRulemaking Portal: http://www.regulations.gov . Follow the... the Source Category D. Selection of Reporting Threshold E. Selection of Proposed Monitoring Methods F... rule and the monitoring methods proposed. This section then provides a brief summary of, and rationale...

  3. Advanced Pulse Oximetry System for Remote Monitoring and Management

    PubMed Central

    Pak, Ju Geon; Park, Kee Hyun

    2012-01-01

    Pulse oximetry data such as saturation of peripheral oxygen (SpO2) and pulse rate are vital signals for early diagnosis of heart disease. Therefore, various pulse oximeters have been developed continuously. However, some of the existing pulse oximeters are not equipped with communication capabilities, and consequently, the continuous monitoring of patient health is restricted. Moreover, even though certain oximeters have been built as network models, they focus on exchanging only pulse oximetry data, and they do not provide sufficient device management functions. In this paper, we propose an advanced pulse oximetry system for remote monitoring and management. The system consists of a networked pulse oximeter and a personal monitoring server. The proposed pulse oximeter measures a patient's pulse oximetry data and transmits the data to the personal monitoring server. The personal monitoring server then analyzes the received data and displays the results to the patient. Furthermore, for device management purposes, operational errors that occur in the pulse oximeter are reported to the personal monitoring server, and the system configurations of the pulse oximeter, such as thresholds and measurement targets, are modified by the server. We verify that the proposed pulse oximetry system operates efficiently and that it is appropriate for monitoring and managing a pulse oximeter in real time. PMID:22933841

  4. Advanced pulse oximetry system for remote monitoring and management.

    PubMed

    Pak, Ju Geon; Park, Kee Hyun

    2012-01-01

    Pulse oximetry data such as saturation of peripheral oxygen (SpO(2)) and pulse rate are vital signals for early diagnosis of heart disease. Therefore, various pulse oximeters have been developed continuously. However, some of the existing pulse oximeters are not equipped with communication capabilities, and consequently, the continuous monitoring of patient health is restricted. Moreover, even though certain oximeters have been built as network models, they focus on exchanging only pulse oximetry data, and they do not provide sufficient device management functions. In this paper, we propose an advanced pulse oximetry system for remote monitoring and management. The system consists of a networked pulse oximeter and a personal monitoring server. The proposed pulse oximeter measures a patient's pulse oximetry data and transmits the data to the personal monitoring server. The personal monitoring server then analyzes the received data and displays the results to the patient. Furthermore, for device management purposes, operational errors that occur in the pulse oximeter are reported to the personal monitoring server, and the system configurations of the pulse oximeter, such as thresholds and measurement targets, are modified by the server. We verify that the proposed pulse oximetry system operates efficiently and that it is appropriate for monitoring and managing a pulse oximeter in real time.

  5. Linear servomotor probe drive system with real-time self-adaptive position control for the Alcator C-Mod tokamak

    NASA Astrophysics Data System (ADS)

    Brunner, D.; Kuang, A. Q.; LaBombard, B.; Burke, W.

    2017-07-01

    A new servomotor drive system has been developed for the horizontal reciprocating probe on the Alcator C-Mod tokamak. Real-time measurements of plasma temperature and density—through use of a mirror Langmuir probe bias system—combined with a commercial linear servomotor and controller enable self-adaptive position control. Probe surface temperature and its rate of change are computed in real time and used to control probe insertion depth. It is found that a universal trigger threshold can be defined in terms of these two parameters; if the probe is triggered to retract when crossing the trigger threshold, it will reach the same ultimate surface temperature, independent of velocity, acceleration, or scrape-off layer heat flux scale length. In addition to controlling the probe motion, the controller is used to monitor and control all aspects of the integrated probe drive system.
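    The abstract defines the trigger in the (temperature, heating-rate) plane without giving its exact form. One plausible shape (retracting when the temperature extrapolated a short lead time ahead would exceed a limit) is sketched below with made-up numbers; this is an illustration of the idea, not the C-Mod system's actual criterion:

```python
def should_retract(temp_c, dtemp_dt_c_per_s, limit_c=1600.0, lead_time_s=0.05):
    """Hypothetical retract criterion in the (T, dT/dt) plane: retract when the
    probe surface temperature extrapolated lead_time_s ahead would reach
    limit_c. The limit and lead time here are invented for illustration."""
    return temp_c + lead_time_s * dtemp_dt_c_per_s >= limit_c

# A fast-heating probe trips the trigger at a lower instantaneous temperature
# than a slowly heating one, which is what makes such a threshold insensitive
# to velocity, acceleration, or heat-flux scale length.
print(should_retract(1500.0, 2500.0))  # -> True  (1500 + 125 >= 1600)
print(should_retract(1500.0, 1000.0))  # -> False (1500 + 50 < 1600)
```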

  6. Fundamental Design Principles for Transcription-Factor-Based Metabolite Biosensors.

    PubMed

    Mannan, Ahmad A; Liu, Di; Zhang, Fuzhong; Oyarzún, Diego A

    2017-10-20

    Metabolite biosensors are central to current efforts toward precision engineering of metabolism. Although most research has focused on building new biosensors, their tunability remains poorly understood and is fundamental for their broad applicability. Here we asked how genetic modifications shape the dose-response curve of biosensors based on metabolite-responsive transcription factors. Using the lac system in Escherichia coli as a model system, we built promoter libraries with variable operator sites that reveal interdependencies between biosensor dynamic range and response threshold. We developed a phenomenological theory to quantify such design constraints in biosensors with various architectures and tunable parameters. Our theory reveals a maximal achievable dynamic range and exposes tunable parameters for orthogonal control of dynamic range and response threshold. Our work sheds light on fundamental limits of synthetic biology designs and provides quantitative guidelines for biosensor design in applications such as dynamic pathway control, strain optimization, and real-time monitoring of metabolism.
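    For a concrete (hypothetical) picture of the two tunable quantities, here is a Hill-type dose-response with its dynamic range (fold-change between zero and saturating inducer) and response threshold (half-maximal inducer level); the parameter values are illustrative, not measurements from the lac promoter library:

```python
def dose_response(m, basal, maximal, K, n):
    """Hill-type biosensor dose-response: promoter output as a function of
    metabolite (inducer) concentration m."""
    x = (m / K) ** n
    return basal + (maximal - basal) * x / (1 + x)

basal, maximal, K, n = 10.0, 800.0, 50.0, 2.0  # hypothetical parameters
dynamic_range = maximal / basal  # fold-change between saturating and no inducer
threshold = K                    # inducer level giving the half-maximal response
half_maximal_output = dose_response(K, basal, maximal, K, n)
print(dynamic_range, half_maximal_output)  # -> 80.0 405.0
```

    In such a model, operator-site changes that move `basal` or `maximal` shift the dynamic range, while changes to `K` shift the response threshold; the interdependence between the two is the design constraint the paper quantifies.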

  7. Cost-effective binomial sequential sampling of western bean cutworm, Striacosta albicosta (Lepidoptera: Noctuidae), egg masses in corn.

    PubMed

    Paula-Moraes, S; Burkness, E C; Hunt, T E; Wright, R J; Hein, G L; Hutchison, W D

    2011-12-01

    Striacosta albicosta (Smith) (Lepidoptera: Noctuidae), is a native pest of dry beans (Phaseolus vulgaris L.) and corn (Zea mays L.). As a result of larval feeding damage on corn ears, S. albicosta has a narrow treatment window; thus, early detection of the pest in the field is essential, and egg mass sampling has become a popular monitoring tool. Three action thresholds for field and sweet corn currently are used by crop consultants, including 4% of plants infested with egg masses on sweet corn in the silking-tasseling stage, 8% of plants infested with egg masses on field corn with approximately 95% tasseled, and 20% of plants infested with egg masses on field corn during mid-milk-stage corn. The current monitoring recommendation is to sample 20 plants at each of five locations per field (100 plants total). In an effort to develop a more cost-effective sampling plan for S. albicosta egg masses, several alternative binomial sampling plans were developed using Wald's sequential probability ratio test, and validated using Resampling for Validation of Sampling Plans (RVSP) software. The benefit-cost ratio also was calculated and used to determine the final selection of sampling plans. Based on final sampling plans selected for each action threshold, the average sample number required to reach a treat or no-treat decision ranged from 38 to 41 plants per field. This represents a significant savings in sampling cost over the current recommendation of 100 plants.
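    Wald's SPRT for binomial (infested / not infested) counts reduces to two parallel decision lines. A sketch with hypothetical design parameters bracketing the 8% action threshold (not the published plan's exact values) follows:

```python
from math import log

def sprt_lines(n, p0, p1, alpha=0.1, beta=0.1):
    """Wald SPRT decision lines for presence/absence sampling: after n plants,
    stop and declare 'no treat' if the cumulative infested count falls at or
    below the lower line, 'treat' if at or above the upper line, otherwise
    keep sampling. p0/p1 are the no-treat/treat infestation proportions."""
    s = log((1 - p0) / (1 - p1))
    denom = log(p1 / p0) + s
    slope = s / denom
    lower_intercept = log(beta / (1 - alpha)) / denom
    upper_intercept = log((1 - beta) / alpha) / denom
    return slope * n + lower_intercept, slope * n + upper_intercept

# Hypothetical bounds bracketing an 8% infestation action threshold.
lower, upper = sprt_lines(40, p0=0.06, p1=0.10)
```

    Sampling stops as soon as the running count crosses either line, which is how sequential plans cut the average sample number (here, roughly 40 plants) well below the fixed 100-plant recommendation.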

  8. Continuous glucose monitoring: quality of hypoglycaemia detection.

    PubMed

    Zijlstra, E; Heise, T; Nosek, L; Heinemann, L; Heckermann, S

    2013-02-01

    To evaluate the accuracy of a (widely used) continuous glucose monitoring (CGM)-system and its ability to detect hypoglycaemic events. A total of 18 patients with type 1 diabetes mellitus used continuous glucose monitoring (Guardian REAL-Time CGMS) during two 9-day in-house periods. A hypoglycaemic threshold alarm alerted patients to sensor readings <70 mg/dl. Continuous glucose monitoring sensor readings were compared to laboratory reference measurements taken every 4 h and in case of a hypoglycaemic alarm. A total of 2317 paired data points were evaluated. Overall, the mean absolute relative difference (MARD) was 16.7%. The percentage of data points in the clinically accurate or acceptable Clarke Error Grid zones A + B was 94.6%. In the hypoglycaemic range, accuracy worsened (MARD 38.8%) leading to a failure to detect more than half of the true hypoglycaemic events (sensitivity 37.5%). Furthermore, more than half of the alarms that warn patients for hypoglycaemia were false (false alert rate 53.3%). Above the low alert threshold, the sensor confirmed 2077 of 2182 reference values (specificity 95.2%). Patients using continuous glucose monitoring should be aware of its limitation to accurately detect hypoglycaemia. © 2012 Blackwell Publishing Ltd.
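    The reported accuracy metrics can be reproduced from paired sensor/reference readings; the sketch below uses a tiny hypothetical dataset, not the study's 2,317 pairs:

```python
def cgm_metrics(pairs, hypo_mg_dl=70):
    """MARD plus hypoglycaemia-detection metrics from (sensor, reference)
    glucose pairs in mg/dl, treating 'sensor < hypo_mg_dl' as an alarm."""
    mard = 100 * sum(abs(s - r) / r for s, r in pairs) / len(pairs)
    tp = sum(s < hypo_mg_dl and r < hypo_mg_dl for s, r in pairs)
    fn = sum(s >= hypo_mg_dl and r < hypo_mg_dl for s, r in pairs)
    fp = sum(s < hypo_mg_dl and r >= hypo_mg_dl for s, r in pairs)
    tn = sum(s >= hypo_mg_dl and r >= hypo_mg_dl for s, r in pairs)
    sensitivity = tp / (tp + fn)       # share of true hypos that alarmed
    specificity = tn / (tn + fp)       # share of non-hypos with no alarm
    false_alert_rate = fp / (tp + fp)  # share of alarms that were false
    return mard, sensitivity, specificity, false_alert_rate

# Hypothetical (sensor, reference) pairs in mg/dl.
pairs = [(65, 60), (80, 62), (66, 90), (100, 105), (98, 100)]
mard, sens, spec, far = cgm_metrics(pairs)
```

    Note that sensitivity and the false alert rate answer different questions (missed events vs. wasted alarms), which is why the study reports both.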

  9. POIS, a Low Cost Tilt and Position Sensor: Design and First Tests

    PubMed Central

    Artese, Giuseppe; Perrelli, Michele; Artese, Serena; Meduri, Sebastiano; Brogno, Natale

    2015-01-01

    An integrated sensor for the measurement and monitoring of position and inclination, characterized by low cost, small size, and low weight, has been designed, realized, and calibrated at the Geomatics Lab of the University of Calabria. The design of the prototype, devoted to the monitoring of landslides and structures, aimed to realize a fully automated monitoring instrument able to send the acquired data to a control center periodically or upon request, through a bidirectional transmission protocol. The sensor can be supplied with different accuracies and measurement ranges by choosing bubble vials with different characteristics. The instrument is provided with a computer, which can be programmed to independently process the data collected by a single sensor or by a sensor network and, consequently, to transmit alert signals if the thresholds determined by the monitoring center are exceeded. The bidirectional transmission also allows users to vary the monitoring parameters (time of acquisition, duration of satellite acquisitions, thresholds for the observed data). In the paper, the hardware and software of the sensor are described, along with the calibration, the results of laboratory tests, and the first in-field acquisitions. PMID:25961381

  10. Monitoring and modeling conditions for regional shallow landslide initiation in the San Francisco Bay area, California

    NASA Astrophysics Data System (ADS)

    Collins, B. D.; Stock, J. D.; Godt, J. W.

    2012-12-01

    Intense winter storms in the San Francisco Bay area (SFBA) of California often trigger widespread landsliding, including debris flows that originate as shallow (<3 m) landslides. The strongest storms result in the loss of lives and millions of dollars in damage. Whereas precipitation-based rainfall intensity-duration landslide initiation thresholds are available for the SFBA, antecedent soil moisture conditions also play a major role in determining the likelihood for landslide generation from a given storm. Previous research has demonstrated that antecedent triggering conditions can be obtained using pre-storm precipitation thresholds (e.g., 250-400 mm of seasonal pre-storm rainfall). However, these types of thresholds do not account for the often cyclic pattern of wetting and drying that can occur early in the winter storm season (i.e. October - December), and which may skew the applicability of precipitation-only based thresholds. To account for these cyclic and constantly evolving soil moisture conditions, we have pursued methods to measure soil moisture directly and integrate these measurements into predictive analyses. During the past three years, the USGS installed a series of four subsurface hydrology monitoring stations in shallow landslide-prone locations of the SFBA to establish a soil-moisture-based antecedent threshold. In addition to soil moisture sensors, the monitoring stations are each equipped with piezometers to record positive pore water pressure that is likely required for shallow landslide initiation and a rain gauge to compare storm intensities with existing precipitation-based thresholds. Each monitoring station is located on a natural, grassy hillslope typically composed of silty sands, underlain by sandstone, sloping at approximately 30°, and with a depth to bedrock of approximately 1 meter - conditions typical of debris flow generation in the SFBA. 
Our observations reveal that various locations respond differently to seasonal precipitation, with some areas (e.g., Marin County) remaining at higher levels of saturation for longer periods of time during the winter compared to other areas (e.g., the East Bay Hills). In general, this coincides directly with relative precipitation totals in each region (i.e., Marin county typically receives more rainfall over a longer period of time than the East Bay). In those areas that are saturated for longer periods, the shallow landslide hazard is prolonged because these conditions are first needed for storm-related precipitation to subsequently generate positive pore pressure on the failure plane. Both piezometric field measurements and limit equilibrium slope stability analyses indicate that positive pore pressure is required for most shallow landslide failures to occur in the study regions. Based on measurements from two of the sites, our analyses further indicate that at least 2 kPa of pressure is required to trigger shallow landsliding. We measured this pressure at one of our sites in 2011, where more than 30 landslides, including several that mobilized into debris flows, occurred. Additional monitoring at these sites will be used to further constrain and refine antecedent moisture-based thresholds for shallow landslide initiation.

  11. Why Waveform Correlation Sometimes Fails

    NASA Astrophysics Data System (ADS)

    Carmichael, J.

    2015-12-01

    Waveform correlation detectors used in explosion monitoring scan noisy geophysical data to test two competing hypotheses: either (1) an amplitude-scaled version of a template waveform is present, or (2) no signal is present at all. In reality, geophysical wavefields that are monitored for explosion signatures include waveforms produced by non-target sources that are partially correlated with the waveform template. Such signals can falsely trigger correlation detectors, particularly at the low thresholds required to monitor for smaller target explosions. This challenge is particularly formidable when monitoring known test sites for seismic disturbances, since uncatalogued natural seismicity is (generally) more prevalent at lower magnitudes and could be mistaken for small explosions. To address these challenges, we identify real examples in which correlation detectors targeting explosions falsely trigger on both site-proximal earthquakes (Figure 1, below) and microseismic "noise". Motivated by these examples, we quantify performance loss when applying these detectors and re-evaluate the correlation detector's hypothesis test. We thereby derive new detectors from more general hypotheses that admit unknown background seismicity, and apply these to real data. From our treatment, we derive "rules of thumb" for proper template and threshold selection in heavily cluttered signal environments. Last, we answer the question "what is the probability of falsely detecting an earthquake collocated at a test site?" using correlation detectors that include explosion-triggered templates. Figure 1. Top: An eight-channel data stream (black) recorded from an earthquake near a mine. Red markers indicate a detection. Middle: The correlation statistic computed by scanning the template against the data stream at top.
The red line indicates the threshold for event declaration, determined by a false-alarm-on-noise probability constraint, as computed from the signal-absent distribution using the Neyman-Pearson criterion. Bottom: The histogram of the correlation statistic time series (gray) superimposed on the theoretical null distribution (black curve). The line shows the threshold, consistent with a right-tail probability computed from the black curve.
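    The correlation statistic in the figure is a normalized cross-correlation of the template against the data stream. A minimal single-channel version (with toy waveforms, not the authors' data) is:

```python
from math import sqrt

def correlation_trace(stream, template):
    """Normalized cross-correlation of a waveform template against a data
    stream; a value near 1 means an amplitude-scaled copy of the template."""
    m = len(template)
    t_mean = sum(template) / m
    t0 = [t - t_mean for t in template]
    t_norm = sqrt(sum(t * t for t in t0))
    trace = []
    for i in range(len(stream) - m + 1):
        window = stream[i:i + m]
        w_mean = sum(window) / m
        w0 = [x - w_mean for x in window]
        w_norm = sqrt(sum(x * x for x in w0))
        trace.append(sum(a * b for a, b in zip(w0, t0)) / (w_norm * t_norm)
                     if w_norm else 0.0)
    return trace

template = [0.0, 1.0, -1.0, 0.5]                    # toy waveform
stream = [0.0, 0.0, 0.0, 2.0, -2.0, 1.0, 0.0, 0.0]  # contains 2x the template
cc_trace = correlation_trace(stream, template)
detections = [i for i, c in enumerate(cc_trace) if c > 0.9]
print(detections)  # -> [2]
```

    Lowering the 0.9 threshold to catch smaller target events is exactly what admits partially correlated non-target signals, the failure mode this talk examines.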

  12. Effects of Soil Moisture Thresholds in Runoff Generation in two nested gauged basins

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.; Margiotta, M. R.; Onorati, B.; Rivelli, A. R.; Sole, A.

    2009-04-01

    Regarding catchment response to intense storm events, while the relevance of antecedent soil moisture conditions is generally recognized, the role and the quantification of runoff thresholds are still uncertain. Among others, Grayson et al. (1997) argue that above a wetness threshold a substantial portion of a small basin acts in unison and contributes to runoff production. Investigations were conducted through an experimental approach, in particular by exploiting the hydrological data monitored on the "Fiumarella of Corleto" catchment (Southern Italy). The field instrumentation ensures continuous monitoring of all fundamental hydrological variables: climate forcing, streamflow, and soil moisture. The experimental basin is equipped with two water-level installations used to measure the hydrological response of the entire basin (with an area of 32 km2) and of a subcatchment of 0.65 km2. The aim of the present research is to better understand the dynamics of soil moisture and runoff generation during flood events by comparing the data recorded along a transect with the runoff at the two different scales. Particular attention was paid to the influence of soil moisture content on runoff activation mechanisms. We found that the threshold value responsible for runoff activation is equal, or nearly equal, to field capacity. In fact, we observed a rapid change in the subcatchment response when the mean soil moisture reaches a value close to the range of variability of the field capacity measured along a monitored transect of the small subcatchment. During dry periods the runoff coefficient is almost zero for each of the events recorded. During wet periods, however, it is rather variable and depends almost entirely on the total rainfall. 
Moving from the small scale (0.65 km2) up to the medium scale (represented by the basin of 32 km2), the threshold mechanism in runoff production is less detectable because it is masked by the increased spatial heterogeneity of the vegetation cover and soil texture.

  13. How to select a proper early warning threshold to detect infectious disease outbreaks based on the China infectious disease automated alert and response system (CIDARS).

    PubMed

    Wang, Ruiping; Jiang, Yonggen; Michael, Engelgau; Zhao, Genming

    2017-06-12

    The Chinese Center for Disease Control and Prevention (China CDC) developed the China Infectious Disease Automated Alert and Response System (CIDARS) in 2005. The CIDARS is used to strengthen infectious disease surveillance and aid in the early warning of outbreaks, and it has been integrated into the routine outbreak monitoring efforts of CDCs at all levels in China. The early warning threshold is crucial for outbreak detection in the CIDARS, but CDCs at all levels currently use thresholds recommended by the China CDC, and these recommended thresholds have recognized limitations. Our study therefore seeks to explore an operational method for selecting the proper early warning threshold according to the epidemic features of local infectious diseases. The data used in this study were extracted from the web-based Nationwide Notifiable Infectious Diseases Reporting Information System (NIDRIS), and data for infectious disease cases were organized by calendar week (1-52) and year (2009-2015) in Excel format. Px was calculated using a percentile-based moving window (moving window [5 week*5 year], x), where x represents one of 12 centiles (0.40, 0.45, 0.50, ..., 0.95). Outbreak signals for the 12 Px were calculated using the moving percentile method (MPM) based on data from the CIDARS. When the outbreak signals generated by the 'mean + 2SD' gold standard were in line with the Px-generated outbreak signals for each week during the year 2014, that Px was defined as the proper threshold for the infectious disease. Finally, the performance of the newly selected thresholds for each infectious disease was evaluated against simulated outbreak signals based on 2015 data. Six infectious diseases were selected for this study (chickenpox, mumps, hand, foot and mouth disease (HFMD), scarlet fever, influenza, and rubella). Proper thresholds for chickenpox (P75), mumps (P80), influenza (P75), rubella (P45), HFMD (P75), and scarlet fever (P80) were identified. 
The selected proper thresholds for these 6 infectious diseases could detect almost all simulated outbreaks within a shorter time period compared to thresholds recommended by the China CDC. It is beneficial to select the proper early warning threshold to detect infectious disease aberrations based on characteristics and epidemic features of local diseases in the CIDARS.
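    A minimal version of the moving percentile method can be sketched as below, assuming `history` maps (year, week) to a weekly case count; the data layout, nearest-rank percentile choice, and counts are hypothetical:

```python
def mpm_threshold(history, year, week, x, n_years=5, half_width=2):
    """Moving-percentile threshold Px: the x-th percentile of case counts in
    a [5-week x 5-year] window centred on the same calendar week of the
    preceding n_years (nearest-rank percentile; week wrap-around omitted)."""
    window = sorted(
        history[(y, w)]
        for y in range(year - n_years, year)
        for w in range(week - half_width, week + half_width + 1)
        if (y, w) in history
    )
    k = min(len(window) - 1, round(x * (len(window) - 1)))
    return window[k]

# Hypothetical weekly counts: 25 cells covering 2009-2013, weeks 10-14.
history, count = {}, 0
for y in range(2009, 2014):
    for w in range(10, 15):
        count += 1
        history[(y, w)] = count

threshold = mpm_threshold(history, year=2014, week=12, x=0.80)  # P80
current_count = 23                  # hypothetical current-week case count
signal = current_count > threshold  # outbreak signal if above the threshold
print(threshold, signal)  # -> 20 True
```

    Raising x (e.g., P80 instead of P45) raises the threshold and so trades sensitivity for fewer false signals, which is the per-disease tuning the study performs.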

  14. Landslide monitoring and early warning systems in Lower Austria - current situation and new developments

    NASA Astrophysics Data System (ADS)

    Thiebes, Benni; Glade, Thomas; Schweigl, Joachim; Jäger, Stefan; Canli, Ekrem

    2014-05-01

    Landslides represent significant hazards in the mountainous areas of Austria. The Regional Geological Surveys are responsible to inform and protect the population, and to mitigate damage to infrastructure. Efforts of the Regional Geological Survey of Lower Austria include detailed site investigations, the planning and installation of protective structures (e.g. rock fall nets) as well as preventive measures such as regional scale landslide susceptibility assessments. For potentially endangered areas, where protection works are not feasible or would simply be too costly, monitoring systems have been installed. However, these systems are dominantly not automatic and require regular field visits to take measurements. Therefore, it is difficult to establish any relation between initiating and controlling factors, thus to fully understand the underlying process mechanism which is essential for any early warning system. Consequently, the implementation of new state-of-the-art monitoring and early warning systems has been started. In this presentation, the design of four landslide monitoring and early warning systems is introduced. The investigated landslide process types include a deep-seated landslide, a rock fall site, a complex earth flow, and a debris flow catchment. The monitoring equipment was chosen depending on the landslide processes and their activity. It aims to allow for a detailed investigation of process mechanisms in relation to its triggers and for reliable prediction of future landslide activities. The deep-seated landslide will be investigated by manual and automatic inclinometers to get detailed insights into subsurface displacements. In addition, TDR sensors and a weather station will be employed to get a better understanding on the influence of rainfall on sub-surface hydrology. For the rockfall site, a wireless sensor network will be installed to get real-time information on acceleration and inclination of potentially unstable blocks. 
The movement of the earth flow site will be monitored by differential GPS to obtain high-precision information on the displacement of marked points. Photogrammetry based on octocopter surveys will provide spatial information on movement patterns. A similar approach will be followed for the debris flow catchment; here, the focus lies on monitoring the landslide failures in the source area that supply material for subsequent debris flow transport. In addition to the methods already mentioned, repeated terrestrial laser scanning campaigns will be used to monitor geomorphological changes at all sites. All important data, whether single measurements, episodic or continuous monitoring data for a given point (e.g. rainfall, inclination), or data of a spatial character (e.g. LiDAR measurements), are collected and analysed on an external server. Automatic data analysis methods, such as progressive failure analysis, are applied to the field measurements. The data and results from all monitoring sites are visualised on a web-based platform that enables registered users to analyse the respective information in near-real time. Moreover, thresholds can be defined that trigger automated warning messages to the scientists involved when field measurements exceed them. The described system will enable scientists and decision-makers to access the latest data from the monitoring systems, and automatic alarms will inform them about potentially hazardous changes. Thereby, a more efficient hazard management and early warning can be achieved. Keywords: landslide, rockfall, debris flow, earth flow, monitoring, early warning system.
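The threshold-triggered warning step described above can be sketched as follows. The sensor names and threshold values here are purely hypothetical illustrations, not the actual configuration of the Lower Austria systems:

```python
# Hypothetical thresholds for two monitored quantities; the real systems
# would derive site-specific values from the collected monitoring data.
THRESHOLDS = {
    "inclinometer_mm_per_day": 2.0,   # subsurface displacement rate
    "rainfall_mm_per_hour": 25.0,     # triggering rainfall intensity
}

def check_measurements(readings):
    """Return warning messages for readings that exceed their thresholds."""
    warnings = []
    for sensor, value in readings.items():
        limit = THRESHOLDS.get(sensor)
        if limit is not None and value > limit:
            warnings.append(f"ALERT: {sensor} = {value} exceeds threshold {limit}")
    return warnings

print(check_measurements({"inclinometer_mm_per_day": 3.1,
                          "rainfall_mm_per_hour": 10.0}))
```

In a deployed system, a function like this would run server-side on each incoming measurement and dispatch the resulting messages to registered users.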

  15. Using Supervised Machine Learning to Classify Real Alerts and Artifact in Online Multi-signal Vital Sign Monitoring Data

    PubMed Central

    Chen, Lujie; Dubrawski, Artur; Wang, Donghan; Fiterau, Madalina; Guillame-Bert, Mathieu; Bose, Eliezer; Kaynar, Ata M.; Wallace, David J.; Guttendorf, Jane; Clermont, Gilles; Pinsky, Michael R.; Hravnak, Marilyn

    2015-01-01

    OBJECTIVE Use machine-learning (ML) algorithms to classify alerts as real or artifacts in online noninvasive vital sign (VS) data streams to reduce alarm fatigue and missed true instability. METHODS Using a 24-bed trauma step-down unit's noninvasive VS monitoring data (heart rate [HR], respiratory rate [RR], peripheral oximetry [SpO2]) recorded at 1/20 Hz, with noninvasive oscillometric blood pressure [BP] recorded less frequently, we partitioned data into training/validation (294 admissions; 22,980 monitoring hours) and test sets (2,057 admissions; 156,177 monitoring hours). Alerts were VS deviations beyond stability thresholds. A four-member expert committee annotated a subset of alerts (576 in the training/validation set, 397 in the test set), selected by active learning, as real or artifact, upon which we trained ML algorithms. The best model was evaluated on alerts in the test set to enact online alert classification as signals evolve over time. MAIN RESULTS The Random Forest model discriminated between real and artifact as the alerts evolved online in the test set, with area under the curve (AUC) performance of 0.79 (95% CI 0.67-0.93) for SpO2 at the instant the VS first crossed threshold, increasing to 0.87 (95% CI 0.71-0.95) at 3 minutes into the alerting period. BP AUC started at 0.77 (95% CI 0.64-0.95) and increased to 0.87 (95% CI 0.71-0.98), while RR AUC started at 0.85 (95% CI 0.77-0.95) and increased to 0.97 (95% CI 0.94-1.00). HR alerts were too few for model development. CONCLUSIONS ML models can discern clinically relevant SpO2, BP and RR alerts from artifacts in an online monitoring dataset (AUC > 0.87). PMID:26992068

  16. Using Supervised Machine Learning to Classify Real Alerts and Artifact in Online Multisignal Vital Sign Monitoring Data.

    PubMed

    Chen, Lujie; Dubrawski, Artur; Wang, Donghan; Fiterau, Madalina; Guillame-Bert, Mathieu; Bose, Eliezer; Kaynar, Ata M; Wallace, David J; Guttendorf, Jane; Clermont, Gilles; Pinsky, Michael R; Hravnak, Marilyn

    2016-07-01

    The use of machine-learning algorithms to classify alerts as real or artifacts in online noninvasive vital sign data streams to reduce alarm fatigue and missed true instability. Observational cohort study. Twenty-four-bed trauma step-down unit. Two thousand one hundred fifty-three patients. Noninvasive vital sign monitoring data (heart rate, respiratory rate, peripheral oximetry) recorded on all admissions at 1/20 Hz, and noninvasive blood pressure less frequently, and partitioned data into training/validation (294 admissions; 22,980 monitoring hours) and test sets (2,057 admissions; 156,177 monitoring hours). Alerts were vital sign deviations beyond stability thresholds. A four-member expert committee annotated a subset of alerts (576 in training/validation set, 397 in test set) as real or artifact selected by active learning, upon which we trained machine-learning algorithms. The best model was evaluated on test set alerts to enact online alert classification over time. The Random Forest model discriminated between real and artifact as the alerts evolved online in the test set with area under the curve performance of 0.79 (95% CI, 0.67-0.93) for peripheral oximetry at the instant the vital sign first crossed threshold and increased to 0.87 (95% CI, 0.71-0.95) at 3 minutes into the alerting period. Blood pressure area under the curve started at 0.77 (95% CI, 0.64-0.95) and increased to 0.87 (95% CI, 0.71-0.98), whereas respiratory rate area under the curve started at 0.85 (95% CI, 0.77-0.95) and increased to 0.97 (95% CI, 0.94-1.00). Heart rate alerts were too few for model development. Machine-learning models can discern clinically relevant peripheral oximetry, blood pressure, and respiratory rate alerts from artifacts in an online monitoring dataset (area under the curve > 0.87).
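As a rough illustration of the modeling approach in records 15-16 (not the authors' actual pipeline, features, or data), a Random Forest can be trained on per-alert summary features and scored by AUC. Everything below is synthetic; the three feature columns are hypothetical stand-ins for quantities such as signal variance, time above threshold, and trend slope:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
# Three hypothetical per-alert features.
X = rng.normal(size=(n, 3))
# Synthetic label: "real" alerts tend to have larger values of feature 1.
y = (X[:, 1] + 0.5 * rng.normal(size=n) > 0).astype(int)

# Simple train/test split.
X_train, X_test, y_train, y_test = X[:700], X[700:], y[:700], y[700:]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Discrimination between real alerts and artifacts, as in the study's
# reported AUC values.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"AUC = {auc:.2f}")
```

The study's key twist is that features are recomputed as the alert evolves, so the AUC improves with time into the alerting period; that online aspect is not reproduced in this static sketch.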

  17. AgroClimate: Simulating and Monitoring the Risk of Extreme Weather Events from a Crop Phenology Perspective

    NASA Astrophysics Data System (ADS)

    Fraisse, C.; Pequeno, D.; Staub, C. G.; Perry, C.

    2016-12-01

    Climate variability, particularly the occurrence of extreme weather conditions such as dry spells and heat stress during sensitive crop developmental phases, can substantially increase the prospect of reduced crop yields. The risk of yield losses or crop failure due to stressful weather conditions varies mainly with the severity of the stress and the time and duration of exposure. The magnitude of stress effects is also crop specific, differing in thresholds and adaptation to environmental conditions. To help producers in the Southeast USA mitigate and monitor the risk of crop losses due to extreme weather events, we developed a web-based tool that evaluates the risk of extreme weather events during the season, taking into account the crop development stages. Producers can enter their plans for the upcoming season in a given field (e.g. crop, variety, planting date, acreage), optionally select a specific El Niño Southern Oscillation (ENSO) phase, and are presented with the probabilities (0-100%) of extreme weather events occurring during sensitive phases of the growing season for the selected conditions. The phenology components of the DSSAT CERES-Maize, CROPGRO-Soybean, CROPGRO-Cotton, and N-Wheat models have been translated from FORTRAN into standalone versions in the R language. These models were tested in collaboration with Extension faculty and producers during the 2016 season, and their usefulness for risk mitigation and monitoring was evaluated. A companion AgroClimate app was also developed to help producers track and monitor phenological development during the cropping season.
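The kind of probability such a tool reports can be illustrated with a toy frequency calculation over synthetic historical weather. The 35°C heat-stress threshold, the 20-day flowering window, and the data below are all hypothetical, not AgroClimate parameters:

```python
import random

random.seed(42)
# 30 synthetic "years" of daily maximum temperature (deg C) for a
# hypothetical 20-day flowering window.
years = [[28 + random.gauss(0, 4) for _ in range(20)] for _ in range(30)]

HEAT_STRESS_C = 35.0  # hypothetical crop-specific stress threshold

# A year counts as a heat-stress year if any day in the window exceeds
# the threshold; the risk is the historical frequency of such years.
hits = sum(1 for window in years if max(window) > HEAT_STRESS_C)
probability = 100.0 * hits / len(years)
print(f"Heat-stress risk during flowering: {probability:.0f}%")
```

A production tool would additionally condition the historical sample on the selected ENSO phase and shift the window using the simulated phenology for the chosen variety and planting date.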

  18. Long term real-time monitoring of large alpine rockslides by GB-InSAR: mechanisms, triggers, scenario assessment and Early Warning

    NASA Astrophysics Data System (ADS)

    Crosta, G. B.; Agliardi, F.; Sosio, R.; Rivolta, C.; Leva, D.; Dei Cas, L.

    2012-04-01

    Large rockslides in alpine valleys can undergo catastrophic evolution, posing extraordinary risks to settlements, lives and critical infrastructure. These phenomena are controlled by a complex interplay of lithological, structural, hydrological and meteo-climatic factors, which results in complex triggering mechanisms and kinematics, highly variable activity, and regressive to progressive trends with superimposed acceleration and deceleration periods related to rainfall and snowmelt. Managing large rockslide risk remains challenging due to the high uncertainty in their geological model and dynamics. In this context, the most promising approach to constrain rockslide kinematics, establish correlations with triggering factors, and predict future displacements, velocities, accelerations, and eventually possible final collapse is based on the analysis and modelling of long-term series of monitoring data. Beyond traditional monitoring activities, remote sensing represents an important tool for describing local rockslide displacements and kinematics, distinguishing rates of activity, and providing real-time data suitable for early warning. We analyze a long-term monitoring dataset collected for a deep-seated rockslide (Ruinon, Lombardy, Italy), actively monitored since 1997 through an in situ monitoring network (topographic and GPS, wire extensometers and distometer baselines) and since 2006 by ground-based radar (GB-InSAR). Monitoring allowed us to set up and update the geological model, identify the rockslide extent and geometry, and analyze its sensitivity to seasonal changes and their impact on the reliability and EW potential of monitoring data. GB-InSAR data allowed us to identify sub-areas with different behaviors associated with outcropping bedrock and thick debris cover, and to set up a "virtual monitoring network" by a posteriori selection of critical locations. 
The resulting displacement time series provide a large amount of information even in debris-covered areas, where traditional monitoring fails. This spatially distributed, improved information, validated by selected ground-based measurements, allowed us to establish new velocity thresholds for EW purposes. Relationships between rainfall and displacement rates allowed us to identify different possible failure mechanisms and to constrain the applicability of rainfall EW thresholds. Comparison with temperature and snowmelt time series clarified the sensitivity of the rockslide movement to these controlling factors. Finally, the recognition of the sensitivity to all these factors allowed us to accomplish a more complete hazard assessment by defining different failure scenarios and the associated triggering thresholds.

  19. Active Duty - U.S. Army Noise Induced Hearing Injury Surveillance Calendar Years 2009-2013

    DTIC Science & Technology

    2014-06-01

    rates for sensorineural hearing loss, significant threshold shift, tinnitus, and noise-induced hearing loss. The intention is to monitor the morbidity...surveillance. These code groups include sensorineural hearing loss (SNHL), significant threshold shift (STS), noise-induced hearing loss (NIHL) and tinnitus... Tinnitus) was analyzed using a regression model to determine the trend of incidence rates from 2007 to the current year. Statistical significance of a

  20. Comparison of Heart Rate Response to Tennis Activity between Persons with and without Spinal Cord Injuries: Implications for a Training Threshold

    ERIC Educational Resources Information Center

    Barfield, J. P.; Malone, Laurie A.; Coleman, Tristica A.

    2009-01-01

    The purpose of this study was to evaluate the ability of individuals with spinal cord injury (SCI) to reach a training threshold during on-court sport activity. Monitors collected heart rate (HR) data every 5 s for 11 wheelchair tennis players (WCT) with low paraplegia and 11 able-bodied controls matched on experience and skill level (ABT).…

  1. A computer-controlled color vision test for children based on the Cambridge Colour Test.

    PubMed

    Goulart, Paulo R K; Bandeira, Marcio L; Tsubota, Daniela; Oiwa, Nestor N; Costa, Marcelo F; Ventura, Dora F

    2008-01-01

    The present study aimed at providing conditions for the assessment of color discrimination in children using a modified version of the Cambridge Colour Test (CCT, Cambridge Research Systems Ltd., Rochester, UK). Since the task of indicating the gap of the Landolt C used in that test proved counterintuitive and/or difficult for young children to understand, we changed the target stimulus to a patch of color approximately the size of the Landolt C gap (about 7 degrees of visual angle at 50 cm from the monitor). The modifications were performed for the CCT Trivector test, which measures color discrimination along the protan, deutan and tritan confusion lines. Experiment 1 sought to evaluate the correspondence between the CCT and the child-friendly adaptation with adult subjects (n = 29) with normal color vision. Results showed good agreement between the two test versions. Experiment 2 tested the child-friendly software with children 2 to 7 years old (n = 25), using operant training techniques for establishing and maintaining the subjects' performance. Color discrimination thresholds were progressively lower as age increased within the age range tested (2 to 30 years old), and the data, including those obtained for children, fell within the range of thresholds previously obtained for adults with the CCT. The protan and deutan thresholds were consistently lower than tritan thresholds, a pattern repeatedly observed in adults tested with the CCT. The results demonstrate that the test is fit for assessment of color discrimination in young children and may be a useful tool for the establishment of color vision thresholds during development.

  2. Monitoring the Hearing Handicap and the Recognition Threshold of Sentences of a Patient with Unilateral Auditory Neuropathy Spectrum Disorder with Use of a Hearing Aid.

    PubMed

    Lima, Aline Patrícia; Mantello, Erika Barioni; Anastasio, Adriana Ribeiro Tavares

    2016-04-01

    Introduction Treatment for auditory neuropathy spectrum disorder (ANSD) is not yet well established, including the use of hearing aids (HAs). Not all patients diagnosed with ANSD have access to HAs, and in some cases HAs are even contraindicated. Objective To monitor the hearing handicap and the recognition threshold of sentences in silence and in noise in a patient with ANSD using an HA. Resumed Report A 47-year-old woman reported moderate sensorineural hearing loss in the right ear and high-frequency loss of 4 kHz in the left ear, with bilateral otoacoustic emissions. Auditory brainstem response suggested changes in the functioning of the auditory pathway (up to the inferior colliculus) on the right. An HA was indicated on the right. The patient was tested within a 3-month period before the HA fitting with respect to recognition threshold of sentences in quiet and in noise and for handicap determination. After HA use, she showed a 2.1-dB improvement in the recognition threshold of sentences in silence, a 6.0-dB improvement for recognition threshold of sentences in noise, and a rapid improvement of the signal-to-noise ratio from +3.66 to -2.4 dB when compared with the same tests before the fitting of the HA. Conclusion There was a reduction of the auditory handicap, although speech perception continued to be severely limited. There was a significant improvement of the recognition threshold of sentences in silence and in noise and of the signal-to-noise ratio after 3 months of HA use.

  3. Sequence of Changes in Maize Responding to Soil Water Deficit and Related Critical Thresholds

    PubMed Central

    Ma, Xueyan; He, Qijin; Zhou, Guangsheng

    2018-01-01

    The sequence of changes in crops responding to soil water deficit and the related critical thresholds are essential for better drought damage classification and drought monitoring indicators. This study aimed to investigate the critical thresholds of maize growth and physiological characteristics responding to changing soil water, and to reveal the sequence of changes in maize responding to soil water deficit in both the seedling and jointing stages, based on a 2-year maize field experiment with six initial soil water statuses conducted in 2013 and 2014. Normal distribution tolerance limits were newly adopted to identify critical thresholds of maize growth and physiological characteristics over a wide range of soil water statuses. The results showed that in both stages maize growth characteristics related to plant water status [stem moisture content (SMC) and leaf moisture content (LMC)], leaf gas exchange [net photosynthetic rate (Pn), transpiration rate (Tr), and stomatal conductance (Gs)], and leaf area were sensitive to soil water deficit, while biomass-related characteristics were less sensitive. Under the concurrent weather conditions and agronomic managements, the critical soil water thresholds, in terms of relative soil moisture of 0–30 cm depth (RSM), for maize SMC, LMC, Pn, Tr, Gs, and leaf area were 72, 65, 62, 60, 58, and 46%, respectively, in the seedling stage, and 64, 64, 51, 53, 48, and 46%, respectively, in the jointing stage. This indicates a sequence of changes in maize responding to soil water deficit, i.e., as soil water deficit intensified: SMC ≥ LMC > leaf gas exchange > leaf area in both stages. This sequence of changes and the related critical thresholds may be better indicators for damage classification and drought monitoring. PMID:29765381

  4. Monitoring the Hearing Handicap and the Recognition Threshold of Sentences of a Patient with Unilateral Auditory Neuropathy Spectrum Disorder with Use of a Hearing Aid

    PubMed Central

    Lima, Aline Patrícia; Mantello, Erika Barioni; Anastasio, Adriana Ribeiro Tavares

    2015-01-01

    Introduction Treatment for auditory neuropathy spectrum disorder (ANSD) is not yet well established, including the use of hearing aids (HAs). Not all patients diagnosed with ANSD have access to HAs, and in some cases HAs are even contraindicated. Objective To monitor the hearing handicap and the recognition threshold of sentences in silence and in noise in a patient with ANSD using an HA. Resumed Report A 47-year-old woman reported moderate sensorineural hearing loss in the right ear and high-frequency loss of 4 kHz in the left ear, with bilateral otoacoustic emissions. Auditory brainstem response suggested changes in the functioning of the auditory pathway (up to the inferior colliculus) on the right. An HA was indicated on the right. The patient was tested within a 3-month period before the HA fitting with respect to recognition threshold of sentences in quiet and in noise and for handicap determination. After HA use, she showed a 2.1-dB improvement in the recognition threshold of sentences in silence, a 6.0-dB improvement for recognition threshold of sentences in noise, and a rapid improvement of the signal-to-noise ratio from +3.66 to −2.4 dB when compared with the same tests before the fitting of the HA. Conclusion There was a reduction of the auditory handicap, although speech perception continued to be severely limited. There was a significant improvement of the recognition threshold of sentences in silence and in noise and of the signal-to-noise ratio after 3 months of HA use. PMID:27096026

  5. Chronic effects of environmentally-relevant concentrations of lead in Pelophylax nigromaculata tadpoles: Threshold dose and adverse effects.

    PubMed

    Huang, Min-Yi; Duan, Ren-Yan; Ji, Xiang

    2014-06-01

    Lead (Pb) is a common heavy metal in the natural environment, but its concentration has been increasing alongside widespread industrial and agricultural development in China. The dark-spotted frog Pelophylax (formerly Rana) nigromaculata (Anura: Ranidae) is distributed across East Asia and inhabits anthropogenic habitats such as farmland. Here, P. nigromaculata tadpoles (Gosner stage 19-46) were exposed to Pb at different concentrations (0, 40, 80, 160, 320, 640 and 1280 µg/L), and Pb-induced survival, metamorphosis time, development, malformations, mobility and gonad structure were monitored. The results showed that above the threshold concentration of Pb, adverse effects were obvious. As the concentration of Pb increased, the adverse effects on different traits followed different patterns: the effects on hindlimb length, survival rate, metamorphosis rate, total malformation rate, swimming speed and jumping speed largely exhibited a linear pattern; the effects on snout-vent length, body mass and forelimb length largely exhibited a bimodal pattern. Sex ratio and gonadal histology were not affected by Pb, suggesting that Pb is not strongly estrogenic in P. nigromaculata. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Low energy analysis techniques for CUORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alduino, C.; Alfonso, K.; Artusa, D. R.

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  7. Low energy analysis techniques for CUORE

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2017-12-12

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  8. Examination of nanosecond laser melting thresholds in refractory metals by shear wave acoustics

    NASA Astrophysics Data System (ADS)

    Abdullaev, A.; Muminov, B.; Rakhymzhanov, A.; Mynbayev, N.; Utegulov, Z. N.

    2017-07-01

    Nanosecond laser pulse-induced melting thresholds in refractory (Nb, Mo, Ta and W) metals are measured using detected laser-generated acoustic shear waves. The obtained melting threshold values were found to scale with the corresponding melting point temperatures of the investigated materials, which displayed dissimilar shearing behavior. The experiments were conducted with motorized control of the incident laser pulse energies, with small and uniform energy increments to reach high measurement accuracy, and with real-time monitoring of the epicentral acoustic waveforms from the opposite side of the irradiated sample plates. The measured results were found to be in good agreement with a numerical finite element model solving the coupled elastodynamic and thermal conduction governing equations on a structured quadrilateral mesh. The solid-melt phase transition was handled by means of the apparent heat capacity method. The onset of melting was attributed to the vanishing shear modulus and rapid radial molten pool propagation within the laser-heated metal, leading to preferential generation of transverse acoustic waves from sources surrounding the molten mass and resulting in a delay of shear wave transit times. The developed laser-based technique aims at applications involving remote examination of rapid melting processes of materials in harsh environments (e.g. spent nuclear fuels) with high spatio-temporal resolution.

  9. Definition of Verifiable School IPM

    EPA Pesticide Factsheets

    EPA is promoting the use of verifiable school IPM, an activity that includes several documented elements: pest identification, action thresholds, monitoring, and effective pest control.

  10. Antipsychotic drug poisoning monitoring of clozapine in urine by using coffee ring effect based surface-enhanced Raman spectroscopy.

    PubMed

    Zhu, Qingxia; Yu, Xiaoyan; Wu, Zebing; Lu, Feng; Yuan, Yongfang

    2018-07-19

    Antipsychotics are the drugs most often involved in drug poisoning cases, and therefore therapeutic drug monitoring (TDM) is necessary for their safe and effective administration. In this study, a coffee ring effect-based surface-enhanced Raman spectroscopy (CRE-SERS) method was developed and used for the first time to monitor antipsychotic poisoning in urine samples. The established method exhibited excellent SERS performance because more hot spots were obtained in the "coffee ring". Using the optimized CRE-SERS method, the sensitivity was improved by an order of magnitude over the conventional method, with reasonable reproducibility. The antipsychotic drug clozapine (CLO) spiked into urine samples at 0.5-50 μg/mL was quantitatively detected at concentrations above the thresholds for toxicity. The CRE-SERS method allowed CLO and its metabolites to be distinguished in real poisoning urine samples. The coffee ring effect should provide more opportunities for practical applications of SERS-based methods, and the frequent occurrence of drug poisoning may open a new area for the application of the CRE-SERS method. It is anticipated that the developed method will also have great potential for monitoring other drug poisonings. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Chip-scale sensor system integration for portable health monitoring.

    PubMed

    Jokerst, Nan M; Brooke, Martin A; Cho, Sang-Yeon; Shang, Allan B

    2007-12-01

    The revolution in integrated circuits over the past 50 years has produced inexpensive computing and communications systems that are powerful and portable. The technologies for integrated chip-scale sensing systems, which will be miniature, lightweight, and portable, are emerging from the integration of sensors with electronics, optical systems, micromachines, and microfluidics, and from the integration of chemical and biological materials (soft/wet materials integrated with traditional dry/hard semiconductor materials). Hence, we stand at a threshold for health monitoring technology that promises wearable biochemical sensing systems that are comfortable, unobtrusive, wireless, and battery-operated, yet continuously monitor health status and can transmit compressed data signals at regular intervals, or alarm conditions immediately. In this paper, we explore recent results in chip-scale sensor integration technology for health monitoring. The development of inexpensive chip-scale biochemical optical sensors, such as microresonators, that are customizable for high sensitivity and amenable to rapid prototyping is discussed. Groundbreaking work in the integration of chip-scale optical systems to support these optical sensors is highlighted, along with the development of inexpensive Si complementary metal-oxide semiconductor circuitry (which makes up the vast majority of computational systems today) for signal processing and wireless communication with local receivers that lie directly on the chip-scale sensor head itself.

  12. Distributed Monitoring of the R^2 Statistic for Linear Regression

    NASA Technical Reports Server (NTRS)

    Bhaduri, Kanishka; Das, Kamalika; Giannella, Chris R.

    2011-01-01

    The problem of monitoring a multivariate linear regression model is relevant in studying the evolving relationship between a set of input variables (features) and one or more dependent target variables. This problem becomes challenging for large-scale data in a distributed computing environment when only a subset of instances is available at individual nodes and the local data changes frequently. Data centralization and periodic model recomputation can add high overhead to tasks like anomaly detection in such dynamic settings. Therefore, the goal is to develop techniques for monitoring and updating the model over the union of all nodes' data in a communication-efficient fashion. Correctness guarantees on such techniques are also often highly desirable, especially in safety-critical application scenarios. In this paper we develop DReMo, a distributed algorithm with very low resource overhead, for monitoring the quality of a regression model in terms of its coefficient of determination (R^2 statistic). When the nodes collectively determine that R^2 has dropped below a fixed threshold, the linear regression model is recomputed via a network-wide convergecast and the updated model is broadcast back to all nodes. We show empirically, using both synthetic and real data, that our proposed method is highly communication-efficient and scalable, and also provide theoretical guarantees on correctness.
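A single-node sketch of the monitored quantity and trigger condition can clarify the idea; the distributed convergecast/broadcast machinery of DReMo is not reproduced here, and the data and threshold value are illustrative:

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true + 0.1 * rng.normal(size=200)

# Fit a linear model on the first batch of data.
w_fit, *_ = np.linalg.lstsq(X[:100], y[:100], rcond=None)

# Monitor R^2 of the fixed model on later batches; a drop below the
# threshold would trigger model recomputation.
THRESHOLD = 0.8
for start in (100, 150):
    batch_X, batch_y = X[start:start + 50], y[start:start + 50]
    r2 = r_squared(batch_y, batch_X @ w_fit)
    if r2 < THRESHOLD:
        print(f"batch at {start}: R^2 = {r2:.3f} -> recompute model")
    else:
        print(f"batch at {start}: R^2 = {r2:.3f} -> model still valid")
```

The contribution of the paper lies in evaluating this condition over the union of all nodes' data without centralizing it, which the local check above does not capture.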

  13. STIMULUS AND TRANSDUCER EFFECTS ON THRESHOLD

    PubMed Central

    Flamme, Gregory A.; Geda, Kyle; McGregor, Kara; Wyllys, Krista; Deiters, Kristy K.; Murphy, William J.; Stephenson, Mark R.

    2015-01-01

    Objective This study examined differences in thresholds obtained under Sennheiser HDA200 circumaural earphones using pure tone, equivalent rectangular noise bands, and 1/3 octave noise bands relative to thresholds obtained using Telephonics TDH-39P supra-aural earphones. Design Thresholds were obtained via each transducer and stimulus condition six times within a 10-day period. Study Sample Forty-nine adults were selected from a prior study to represent low, moderate, and high threshold reliability. Results The results suggested that (1) only small adjustments were needed to reach equivalent TDH-39P thresholds, (2) pure-tone thresholds obtained with HDA200 circumaural earphones had reliability equal to or better than those obtained using TDH-39P earphones, (3) the reliability of noise-band thresholds improved with broader stimulus bandwidth and was either equal to or better than pure-tone thresholds, and (4) frequency-specificity declined with stimulus bandwidths greater than one Equivalent Rectangular Band, which could complicate early detection of hearing changes that occur within a narrow frequency range. Conclusions These data suggest that circumaural earphones such as the HDA200 headphones provide better reliability for audiometric testing as compared to the TDH-39P earphones. These data support the use of noise bands, preferably ERB noises, as stimuli for audiometric monitoring. PMID:25549164

  14. Operational feasibility of lot quality assurance sampling (LQAS) as a tool in routine process monitoring of filariasis control programmes.

    PubMed

    Vanamail, P; Subramanian, S; Srividya, A; Ravi, R; Krishnamoorthy, K; Das, P K

    2006-08-01

    Lot quality assurance sampling (LQAS) with a two-stage sampling plan was applied for rapid monitoring of coverage after every round of mass drug administration (MDA). A Primary Health Centre (PHC) comprising 29 villages in Thiruvannamalai district, Tamil Nadu was selected as the study area. Two threshold levels of coverage were used: threshold A (maximum: 60%; minimum: 40%) and threshold B (maximum: 80%; minimum: 60%). Based on these thresholds, one sampling plan each for A and B was derived with the necessary sample size and the number of allowable defectives (defectives being those who have not received the drug). Using data generated through simple random sampling (SRSI) of 1,750 individuals in the study area, LQAS was validated with the above two sampling plans for its diagnostic and field applicability. Simultaneously, a household survey (SRSH) was conducted for validation and cost-effectiveness analysis. Based on the SRSH survey, the estimated coverage was 93.5% (CI: 91.7-95.3%). LQAS with threshold A revealed that by sampling a maximum of 14 individuals and allowing four defectives, the coverage was ≥60% in >90% of villages at the first stage. Similarly, with threshold B, by sampling a maximum of nine individuals and allowing four defectives, the coverage was ≥80% in >90% of villages at the first stage. These analyses suggest that the sampling plan (14, 4, 52, 25) of threshold A may be adopted in MDA to assess whether a minimum coverage of 60% has been achieved. However, to achieve the goal of elimination, the sampling plan (9, 4, 42, 29) of threshold B can identify villages in which the coverage is <80% so that remedial measures can be taken. Cost-effectiveness analysis showed that both options of LQAS are more cost-effective than SRSH in detecting a village with a given level of coverage. The cost per village was US$76.18 under SRSH, compared with US$65.81 and US$55.63 per village for LQAS thresholds A and B, respectively. The total financial cost of classifying a village correctly with the given threshold level of LQAS could thus be reduced by 14% and 26% relative to the conventional SRSH method.
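
    The first-stage decision rule described above can be sketched as follows. The sample size of 14 and allowance of 4 defectives are taken from the threshold-A plan in the abstract, but the full two-stage (14, 4, 52, 25) protocol is not reproduced here; the village data are hypothetical:

    ```python
    def lqas_first_stage(sample, allowed_defectives=4):
        """First-stage LQAS decision for one village.

        `sample` is a list of booleans, True if the individual received the
        drug. If the number of defectives (non-receivers) is within the
        allowable limit, the village is classified as meeting the coverage
        threshold; otherwise it is flagged for follow-up.
        """
        defectives = sum(1 for received in sample if not received)
        return defectives <= allowed_defectives

    # Hypothetical village sample of 14 individuals with 3 non-receivers:
    village = [True] * 11 + [False] * 3
    meets_coverage = lqas_first_stage(village)  # 3 defectives <= 4 allowed
    ```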

  15. Implementation of a landslide early warning system based on near-real-time monitoring, multisensor mapping and geophysical measurements

    NASA Astrophysics Data System (ADS)

    Teza, Giordano; Galgaro, Antonio; Francese, Roberto; Ninfo, Andrea; Mariani, Rocco

    2017-04-01

    An early warning system has been implemented to monitor the Perarolo di Cadore landslide (north-eastern Italian Alps), a slump whose induced risk is fairly high because a slope collapse could form a temporary dam on the underlying torrent and therefore directly threaten the nearby village. A robotic total station (RTS) measures, with a 6-h return time, the positions of 23 retro-reflectors placed on the landslide's upper and middle sectors. The landslide's kinematic behavior derived from these near-real-time (NRT) surface displacements is interpreted on the basis of available geomorphological and geological information, geometrical data provided by laser scanning and photogrammetric surveys, and a landslide model obtained by means of 3D electrical resistivity tomography (3D ERT) measurements. In this way, an analysis of the multi-year time series provided by the RTS and a pluviometer allows the definition of pre-alert and alert kinematic and rainfall thresholds. These thresholds, together with the corresponding operational recommendations, are currently used for early warning purposes by the authorities involved in risk management for the Perarolo landslide. Notably, as new RTS and pluviometric data become available, the thresholds can be updated, allowing a fine tuning of the early warning system to improve its performance. Although the proposed approach was implemented for a particular case, it can be used to develop an early warning system based on NRT data at any site where a landslide threatens infrastructure and/or villages that cannot be relocated.

  16. Prioritizing CD4 Count Monitoring in Response to ART in Resource-Constrained Settings: A Retrospective Application of Prediction-Based Classification

    PubMed Central

    Liu, Yan; Li, Xiaohong; Johnson, Margaret; Smith, Collette; Kamarulzaman, Adeeba bte; Montaner, Julio; Mounzer, Karam; Saag, Michael; Cahn, Pedro; Cesar, Carina; Krolewiecki, Alejandro; Sanne, Ian; Montaner, Luis J.

    2012-01-01

    Background Global programs of anti-HIV treatment depend on sustained laboratory capacity to assess treatment initiation thresholds and treatment response over time. Currently, there is no valid alternative to CD4 count testing for monitoring immunologic responses to treatment, but laboratory cost and capacity limit access to CD4 testing in resource-constrained settings. Thus, methods to prioritize patients for CD4 count testing could improve treatment monitoring by optimizing resource allocation. Methods and Findings Using a prospective cohort of HIV-infected patients (n = 1,956) monitored upon antiretroviral therapy initiation in seven clinical sites with distinct geographical and socio-economic settings, we retrospectively apply a novel prediction-based classification (PBC) modeling method. The model uses repeatedly measured biomarkers (white blood cell count and lymphocyte percent) to predict CD4+ T cell outcome through first-stage modeling and subsequent classification based on clinically relevant thresholds (CD4+ T cell count of 200 or 350 cells/µl). The algorithm correctly classified 90% (cross-validation estimate = 91.5%, standard deviation [SD] = 4.5%) of CD4 count measurements <200 cells/µl in the first year of follow-up; if laboratory testing is applied only to patients predicted to be below the 200-cells/µl threshold, we estimate a potential savings of 54.3% (SD = 4.2%) in CD4 testing capacity. A capacity savings of 34% (SD = 3.9%) is predicted using a CD4 threshold of 350 cells/µl. Similar results were obtained over the 3 y of follow-up available (n = 619). Limitations include a need for future economic healthcare outcome analysis, a need to assess extensibility beyond the 3-y observation time, and the need to assign a false-positive threshold. Conclusions Our results support the use of PBC modeling as a triage point at the laboratory, lessening the need for laboratory-based CD4+ T cell count testing; implementation of this tool could help optimize the use of laboratory resources, directing CD4 testing towards higher-risk patients. However, further prospective studies and economic analyses are needed to demonstrate that the PBC model can be effectively applied in clinical settings. PMID:22529752

  17. Remote humidity and temperature real time monitoring system for studying seed biology

    NASA Astrophysics Data System (ADS)

    Balachandran, Thiruparan

    This thesis discusses the design, prototyping, and testing of a remote monitoring system used to study the biology of seeds under various controlled conditions. Seed scientists use air-tight boxes to maintain relative humidity, which influences seed longevity and seed dormancy break. The common practice is to use super-saturated solutions, either of different chemicals or of different concentrations of LiCl, to create various relative humidities. Until now, no known system has been developed to remotely monitor the environmental conditions inside these boxes in real time. This thesis discusses the development of a remote monitoring system that can accurately monitor and measure the relative humidity and temperature inside sealed boxes for the study of seed biology. The system allows remote, real-time monitoring of these two parameters in five boxes with different conditions. It functions as a client connected to the internet using Wireless Fidelity (Wi-Fi) technology, while a Google spreadsheet serves as the server for uploading and plotting the data. The system connects directly to the Google server through Wi-Fi and uploads the sensors' values to a Google spreadsheet. Application-specific software was created so that the user can monitor the data in real time and/or download the data into Excel for further analyses. Using the Google Drive app, the data can be viewed on a smartphone or tablet. Furthermore, an electronic mail (e-mail) alert is integrated into the system: whenever measured values go beyond the threshold values, the user receives an e-mail alert.
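
    The threshold check behind such an alert can be sketched as follows. This is a minimal illustration, not the thesis implementation; the box names, reading format, and range limits are all hypothetical:

    ```python
    def out_of_range(readings, rh_range=(10.0, 90.0), temp_range=(15.0, 35.0)):
        """Return alerts for boxes whose readings fall outside the thresholds.

        `readings` maps a box id to a (relative_humidity_%, temperature_C)
        pair; the returned list would drive the e-mail notification.
        """
        alerts = []
        for box, (rh, temp) in readings.items():
            if not (rh_range[0] <= rh <= rh_range[1]):
                alerts.append((box, "humidity", rh))
            if not (temp_range[0] <= temp <= temp_range[1]):
                alerts.append((box, "temperature", temp))
        return alerts

    # Hypothetical sensor snapshot for three boxes:
    readings = {"box1": (45.0, 22.0), "box2": (95.0, 22.0), "box3": (50.0, 40.0)}
    alerts = out_of_range(readings)  # box2 humidity high, box3 temperature high
    ```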

  18. A NEW DIFFERENTIAL AND ERRANT BEAM CURRENT MONITOR FOR THE SNS* ACCELERATOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blokland, Willem; Peters, Charles C

    2013-01-01

    A new Differential and errant Beam Current Monitor (DBCM) is being implemented for both the Spallation Neutron Source's Medium Energy Beam Transport (MEBT) and the Superconducting Linac (SCL) accelerator sections. These new current monitors will abort the beam when the difference between two toroidal pickups exceeds a threshold. The MEBT DBCM will protect the MEBT chopper target, while the SCL DBCM will abort beam to minimize fast beam losses in the SCL cavities. The new DBCM will also record instances of errant beam, such as beam dropouts, to assist in further optimization of the SNS accelerator. A software Errant Beam Monitor was implemented on the regular BCM hardware to study errant beam pulses. The new system will take over this functionality and will also be able to abort beam on pulse-to-pulse variations. Because the system is based on the FlexRIO hardware and programmed in LabVIEW FPGA, it will be able to abort beam in about 5 µs. This paper describes the development, implementation, and initial test results of the DBCM, as well as errant beam examples.

  19. Fog-Based Two-Phase Event Monitoring and Data Gathering in Vehicular Sensor Networks

    PubMed Central

    Yang, Fan; Su, Jinsong; Zhou, Qifeng; Wang, Tian; Zhang, Lu; Xu, Yifan

    2017-01-01

    Vehicular nodes are equipped with more and more sensing units, and a large amount of sensing data is generated. Recently, more and more research considers cooperative urban sensing as the heart of intelligent and green city traffic management. The key components of the platform will be a combination of a pervasive vehicular sensing system, as well as a central control and analysis system, where data-gathering is a fundamental component. However, the data-gathering and monitoring are also challenging issues in vehicular sensor networks because of the large amount of data and the dynamic nature of the network. In this paper, we propose an efficient continuous event-monitoring and data-gathering framework based on fog nodes in vehicular sensor networks. A fog-based two-level threshold strategy is adopted to suppress unnecessary data upload and transmissions. In the monitoring phase, nodes sense the environment in low cost sensing mode and generate sensed data. When the probability of the event is high and exceeds some threshold, nodes transfer to the event-checking phase, and some nodes would be selected to transfer to the deep sensing mode to generate more accurate data of the environment. Furthermore, it adaptively adjusts the threshold to upload a suitable amount of data for decision making, while at the same time suppressing unnecessary message transmissions. Simulation results showed that the proposed scheme could reduce more than 84 percent of the data transmissions compared with other existing algorithms, while it detects the events and gathers the event data. PMID:29286320
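
    The two-level threshold rule for switching a node between sensing modes can be sketched as follows. The threshold values and mode names are illustrative assumptions, not taken from the paper:

    ```python
    def next_mode(mode, event_probability, t_enter=0.7, t_exit=0.3):
        """Two-level threshold rule for one vehicular node.

        In the low-cost "monitor" mode a node switches to the "deep"
        sensing mode only when the estimated event probability exceeds the
        upper threshold; it drops back once the probability falls below the
        lower one. The gap between the two thresholds suppresses mode
        oscillation and unnecessary data uploads.
        """
        if mode == "monitor" and event_probability > t_enter:
            return "deep"
        if mode == "deep" and event_probability < t_exit:
            return "monitor"
        return mode
    ```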

  20. Processing circuitry for single channel radiation detector

    NASA Technical Reports Server (NTRS)

    Holland, Samuel D. (Inventor); Delaune, Paul B. (Inventor); Turner, Kathryn M. (Inventor)

    2009-01-01

    Processing circuitry is provided for a high voltage operated radiation detector. An event detector utilizes a comparator configured to produce an event signal based on a leading edge threshold value. A preferred event detector does not produce another event signal until a trailing edge threshold value is satisfied. The event signal can be utilized for counting the number of particle hits and also for controlling data collection operation for a peak detect circuit and timer. The leading edge threshold value is programmable such that it can be reprogrammed by a remote computer. A digital high voltage control is preferably operable to monitor and adjust high voltage for the detector.
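
    The leading-edge/trailing-edge arrangement described above is a hysteresis comparator: a new event cannot fire until the signal has fallen back through the trailing-edge threshold. A software sketch of that counting logic (threshold values hypothetical, and of course the real device does this in circuitry, not code):

    ```python
    def count_events(samples, leading=0.5, trailing=0.2):
        """Count particle hits in a sampled signal with hysteresis.

        An event fires when the signal crosses the leading-edge threshold,
        and the detector is re-armed only after the signal drops to or
        below the trailing-edge threshold.
        """
        armed = True
        events = 0
        for s in samples:
            if armed and s >= leading:
                events += 1
                armed = False   # ignore further excursions of the same pulse
            elif not armed and s <= trailing:
                armed = True    # trailing edge satisfied; re-arm
        return events

    # The dip to 0.4 does not re-arm the detector, so only two hits count:
    hits = count_events([0.0, 0.6, 0.4, 0.6, 0.1, 0.7])  # -> 2
    ```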

  1. Using 3D Printing for Rapid Prototyping of Characterization Tools for Investigating Powder Blend Behavior.

    PubMed

    Hirschberg, Cosima; Boetker, Johan P; Rantanen, Jukka; Pein-Hackelbusch, Miriam

    2018-02-01

    There is an increasing need to provide more detailed insight into the behavior of particulate systems. Current powder characterization tools were developed empirically, and in many cases modification of existing equipment is difficult. More flexible tools are needed to provide understanding of complex powder behavior, such as the mixing process and segregation phenomena. An approach based on the fast prototyping of new powder handling geometries and interfacing solutions for process analytical tools is reported. This study utilized 3D printing for rapid prototyping of customized geometries; the overall goal was to assess the mixing process of powder blends at small scale with a combination of spectroscopic and mechanical monitoring. As part of the segregation evaluation studies, the flowability of three different paracetamol/filler blends at different ratios was investigated, inter alia to define the percolation thresholds. Blends with a paracetamol wt% above the percolation threshold were subsequently investigated with regard to their segregation behavior. Rapid prototyping using 3D printing allowed the design of two funnels with tailored flow behavior (funnel flow) of model formulations, which could be monitored with an in-line near-infrared (NIR) spectrometer. Calculating the root mean square (RMS) of the scores of the first two principal components of the NIR spectra visualized spectral variation as a function of process time. In the same setup, mechanical properties (basic flow energy) of the powder blend were monitored during blending. Rapid prototyping allowed fast modification of powder testing geometries and easy interfacing with process analytical tools, opening new possibilities for more detailed powder characterization.
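
    The RMS-of-scores statistic mentioned above can be computed as follows. This is a generic sketch, assuming the PCA scores have already been obtained; the score matrix here is hypothetical:

    ```python
    import numpy as np

    def rms_first_two(scores):
        """Root mean square of the first two principal-component scores for
        each spectrum; plotted against process time, it visualizes spectral
        variation during blending."""
        scores = np.asarray(scores)
        return np.sqrt(np.mean(scores[:, :2] ** 2, axis=1))

    # Hypothetical score matrix: rows are spectra, columns are PCs.
    scores = np.array([[3.0, 4.0, 0.2],
                       [0.1, 0.1, 0.9]])
    rms = rms_first_two(scores)  # first row: sqrt((9 + 16) / 2)
    ```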

  2. 24 CFR 594.7 - Other threshold requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    Title 24 (Housing and Urban Development), § 594.7 Other threshold requirements. In addition, an applicant must meet the following threshold requirements...

  3. ToTCompute: A Novel EEG-Based TimeOnTask Threshold Computation Mechanism for Engagement Modelling and Monitoring

    ERIC Educational Resources Information Center

    Ghergulescu, Ioana; Muntean, Cristina Hava

    2016-01-01

    Engagement influences participation, progression and retention in game-based e-learning (GBeL). Therefore, GBeL systems should engage the players in order to support them to maximize their learning outcomes, and provide the players with adequate feedback to maintain their motivation. Innovative engagement monitoring solutions based on players'…

  4. A custom acoustic emission monitoring system for harsh environments: application to freezing-induced damage in alpine rock-walls

    NASA Astrophysics Data System (ADS)

    Girard, L.; Beutel, J.; Gruber, S.; Hunziker, J.; Lim, R.; Weber, S.

    2012-06-01

    We present a custom acoustic emission (AE) monitoring system designed to perform long-term measurements on high-alpine rock-walls. AE monitoring is a common technique for characterizing damage evolution in solid materials. The system is based on a two-channel AE sensor node (AE-node) integrated into a Wireless Sensor Network (WSN) customized for operation in harsh environments. This wireless architecture offers flexibility in the deployment of AE-nodes at any position of the rock-wall that needs to be monitored, within a range of a few hundred meters from a core station connected to the internet. The system achieves near real-time data delivery and allows the user to remotely control the AE detection threshold. In order to protect AE sensors and capture acoustic signals from specific depths of the rock-wall, a special casing was developed. The monitoring system is completed by two probes that measure rock temperature and liquid water content, both probes being also integrated into the WSN. We report a first deployment of the monitoring system on a rock-wall at Jungfraujoch, 3500 m a.s.l., Switzerland. While this first deployment of the monitoring system aims to support fundamental research on processes that damage rock under cold climate, the system could serve a number of other applications, including rock-fall hazard surveillance or structural monitoring of concrete structures.

  5. A custom acoustic emission monitoring system for harsh environments: application to freezing-induced damage in alpine rock walls

    NASA Astrophysics Data System (ADS)

    Girard, L.; Beutel, J.; Gruber, S.; Hunziker, J.; Lim, R.; Weber, S.

    2012-11-01

    We present a custom acoustic emission (AE) monitoring system designed to perform long-term measurements on high-alpine rock walls. AE monitoring is a common technique for characterizing damage evolution in solid materials. The system is based on a two-channel AE sensor node (AE-node) integrated into a wireless sensor network (WSN) customized for operation in harsh environments. This wireless architecture offers flexibility in the deployment of AE-nodes at any position of the rock wall that needs to be monitored, within a range of a few hundred meters from a core station connected to the internet. The system achieves near real-time data delivery and allows the user to remotely control the AE detection threshold. In order to protect AE sensors and capture acoustic signals from specific depths of the rock wall, a special casing was developed. The monitoring system is completed by two probes that measure rock temperature and liquid water content, both probes being also integrated into the WSN. We report a first deployment of the monitoring system on a rock wall at Jungfraujoch, 3500 m a.s.l., Switzerland. While this first deployment of the monitoring system aims to support fundamental research on processes that damage rock under cold climate, the system could serve a number of other applications, including rock fall hazard surveillance or structural monitoring of concrete structures.

  6. Developmental and hormonal regulation of thermosensitive neuron potential activity in rat brain.

    PubMed

    Belugin, S; Akino, K; Takamura, N; Mine, M; Romanovsky, D; Fedoseev, V; Kubarko, A; Kosaka, M; Yamashita, S

    1999-08-01

    To understand the involvement of thyroid hormone in the postnatal development of hypothalamic thermosensitive neurons, we focused on the analysis of thermosensitive neuronal activity in the preoptic and anterior hypothalamic (PO/AH) regions of developing rats with and without hypothyroidism. In euthyroid rats, the distribution of thermosensitive neurons in PO/AH showed that in 3-week-old rats (46 neurons tested), 19.5% were warm-sensitive and 80.5% were nonsensitive. In 5- to 12-week-old euthyroid rats (122 neurons), 33.6% were warm-sensitive and 66.4% were nonsensitive. In 5- to 12-week-old hypothyroid rats (108 neurons), however, 18.5% were warm-sensitive and 81.5% were nonsensitive. Temperature thresholds of warm-sensitive neurons were lower in 12-week-old euthyroid rats (36.4+/-0.2 degrees C, n = 15, p<0.01) than in 3-week-old and 5-week-old euthyroid rats (38.5+/-0.5 degrees C, n = 9 and 38.0+/-0.3 degrees C, n = 15, respectively). The temperature thresholds of warm-sensitive neurons in 12-week-old hypothyroid rats (39.5+/-0.3 degrees C, n = 8) were similar to those of warm-sensitive neurons in 3-week-old rats (euthyroid and hypothyroid). In contrast, there was no difference in the thresholds of warm-sensitive neurons between hypothyroid and euthyroid rats at the age of 3-5 weeks. In conclusion, electrophysiological monitoring of thermosensitive neuronal activity demonstrated that thyroid hormone regulates the maturation of warm-sensitive hypothalamic neurons in the developing rat brain.

  7. Setting limits: Using air pollution thresholds to protect and restore U.S. ecosystems

    USGS Publications Warehouse

    Fenn, M.E.; Lambert, K.F.; Blett, T.F.; Burns, Douglas A.; Pardo, L.H.; Lovett, Gary M.; Haeuber, R. A.; Evers, D.C.; Driscoll, C.T.; Jeffries, D.S.

    2011-01-01

    More than four decades of research provide unequivocal evidence that sulfur, nitrogen, and mercury pollution have altered, and will continue to alter, our nation's lands and waters. The emission and deposition of air pollutants harm native plants and animals, degrade water quality, affect forest productivity, and damage human health. Many air quality policies limit emissions at the source, but these control measures do not always consider ecosystem impacts. Air pollution thresholds at which ecological effects are observed, such as critical loads, are effective tools for assessing the impacts of air pollution on essential ecosystem services and for informing public policy. U.S. ecosystems can be more effectively protected and restored by using a combination of emissions-based approaches and science-based thresholds of ecosystem damage. Based on the results of a comprehensive review of air pollution thresholds, we conclude: • Ecosystem services such as air and water purification, decomposition and detoxification of waste materials, climate regulation, regeneration of soil fertility, production and biodiversity maintenance, as well as crop, timber and fish supplies, are impacted by deposition of nitrogen, sulfur, mercury and other pollutants. The consequences of these changes may be difficult or impossible to reverse as impacts cascade throughout affected ecosystems. • The effects of too much nitrogen are common across the U.S. and include altered plant and lichen communities, enhanced growth of invasive species, eutrophication and acidification of lands and waters, and habitat deterioration for native species, including endangered species. • Lake, stream and soil acidification is widespread across the eastern United States. Up to 65% of lakes within sensitive areas receive acid deposition that exceeds critical loads. • Mercury contamination adversely affects fish in many inland and coastal waters. Fish consumption advisories for mercury exist in all 50 states and on many tribal lands. High concentrations of mercury in wildlife are also widespread and have multiple adverse effects. • Air quality programs, such as those stemming from the 1990 Clean Air Act Amendments, have helped decrease air pollution even as population and energy demand have increased. Yet they do not adequately protect ecosystems from long-term damage, and they do not address ammonia emissions. • A stronger ecosystem basis for air pollutant policies could be established through adoption of science-based thresholds. Existing monitoring programs track vital information needed to measure the response to policies, and could be expanded to include appropriate chemical and biological indicators for terrestrial and aquatic ecosystems and the establishment of a national ecosystem monitoring network for mercury. The development and use of air pollution thresholds for ecosystem protection and management is increasing in the United States, yet threshold approaches remain underutilized. Ecological thresholds for air pollution, such as critical loads for nitrogen and sulfur deposition, are not currently included in the formal regulatory process for emissions controls in the United States, although they are now considered in local management decisions by the National Park Service and U.S. Forest Service. Ecological thresholds offer a scientifically sound approach to protecting and restoring U.S. ecosystems and an important tool for natural resource management and policy. © The Ecological Society of America.

  8. Health Monitoring Survey of Bell 412EP Transmissions

    NASA Technical Reports Server (NTRS)

    Tucker, Brian E.; Dempsey, Paula J.

    2016-01-01

    Health and usage monitoring systems (HUMS) use vibration-based condition indicators (CIs) to assess the health of helicopter powertrain components. A fault is detected when a CI exceeds its threshold value. The effectiveness of fault detection can be judged by assessing the condition of actual components from fleet aircraft. The Bell 412 HUMS-equipped helicopter was chosen for such an evaluation. A sample of 20 aircraft included 12 aircraft with confirmed transmission and gearbox faults (detected by CIs) and eight aircraft with no known faults. The associated CI data were classified into "healthy" and "faulted" populations based on actual condition, and these populations were compared against their CI thresholds to quantify the probability of false alarm and the probability of missed detection. Receiver operating characteristic (ROC) analysis was used to optimize thresholds. Based on the results of the analysis, shortcomings in the classification method are identified for slow-moving CI trends. Recommendations for improving classification using time-dependent ROC methods are put forth. Finally, lessons learned regarding OEM-operator communication are presented.
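
    The two error rates quantified above follow directly from comparing each population against the threshold. A minimal sketch with hypothetical CI values (the study's actual data and thresholds are not reproduced here):

    ```python
    def error_rates(healthy_ci, faulted_ci, threshold):
        """Empirical probability of false alarm and of missed detection for
        a condition-indicator threshold: a fault is declared when the CI
        exceeds the threshold."""
        pfa = sum(ci > threshold for ci in healthy_ci) / len(healthy_ci)
        pmd = sum(ci <= threshold for ci in faulted_ci) / len(faulted_ci)
        return pfa, pmd

    # Hypothetical CI populations from "healthy" and "faulted" components:
    healthy = [1.0, 2.0, 3.0, 4.0]
    faulted = [3.0, 5.0, 6.0, 7.0]
    pfa, pmd = error_rates(healthy, faulted, threshold=3.5)
    ```

    Sweeping the threshold and plotting (pfa, 1 - pmd) pairs traces out the ROC curve used to pick an operating point.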

  9. Dynamic thresholds and a summary ROC curve: Assessing prognostic accuracy of longitudinal markers.

    PubMed

    Saha-Chaudhuri, P; Heagerty, P J

    2018-04-19

    Cancer patients, chronic kidney disease patients, and subjects infected with HIV are routinely monitored over time using biomarkers that represent key health status indicators. Furthermore, biomarkers are frequently used to guide initiation of new treatments or to inform changes in intervention strategies. Since key medical decisions can be made on the basis of a longitudinal biomarker, it is important to evaluate the potential accuracy associated with longitudinal monitoring. To characterize the overall accuracy of a time-dependent marker, we introduce a summary ROC curve that displays the overall sensitivity associated with a time-dependent threshold that controls time-varying specificity. The proposed statistical methods are similar to concepts considered in disease screening, yet our methods are novel in choosing a potentially time-dependent threshold to define a positive test, and our methods allow time-specific control of the false-positive rate. The proposed summary ROC curve is a natural averaging of time-dependent incident/dynamic ROC curves and therefore provides a single summary of net error rates that can be achieved in the longitudinal setting. Copyright © 2018 John Wiley & Sons, Ltd.

  10. I: Biomarker quantification in fish exposed to crude oil as input to species sensitivity distributions and threshold values for environmental monitoring.

    PubMed

    Sanni, Steinar; Björkblom, Carina; Jonsson, Henrik; Godal, Brit F; Liewenborg, Birgitta; Lyng, Emily; Pampanin, Daniela M

    2017-04-01

    The aim of this study was to determine a suitable set of biomarker based methods for environmental monitoring in sub-arctic and temperate offshore areas using scientific knowledge on the sensitivity of fish species to dispersed crude oil. Threshold values for environmental monitoring and risk assessment were obtained based on a quantitative comparison of biomarker responses. Turbot, halibut, salmon and sprat were exposed for up to 8 weeks to five different sub-lethal concentrations of dispersed crude oil. Biomarkers assessing PAH metabolites, oxidative stress, detoxification system I activity, genotoxicity, immunotoxicity, endocrine disruption, general cellular stress and histological changes were measured. Results showed that PAH metabolites, CYP1A/EROD, DNA adducts and histopathology rendered the most robust results across the different fish species, both in terms of sensitivity and dose-responsiveness. The reported results contributed to forming links between biomonitoring and risk assessment procedures by using biomarker species sensitivity distributions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. 20 years of long-term atrazine monitoring in a shallow aquifer in western Germany.

    PubMed

    Vonberg, David; Vanderborght, Jan; Cremer, Nils; Pütz, Thomas; Herbst, Michael; Vereecken, Harry

    2014-03-01

    Atrazine was banned in Germany in 1991 due to findings of atrazine concentrations in ground- and drinking waters exceeding threshold values. Monitoring of atrazine concentrations in the groundwater since then provides information about the resilience of groundwater quality to changing agricultural practices. In this study, we present results of a monitoring campaign of atrazine concentrations in the Zwischenscholle aquifer. This phreatic aquifer is exposed to intensive agricultural land use and is susceptible to contaminants due to a shallow water table. In total, 60 observation wells (OWs) have been monitored since 1991, of which 15 are sampled monthly today. Descriptive statistics of the monitoring data were derived using the "regression on order statistics" (ROS) data-censoring approach, which estimates values for nondetects. The monitoring data show that even 20 years after the ban of atrazine, the groundwater concentrations of sampled OWs remain at a level close to the threshold value of 0.1 μg l(-1) without any considerable decrease. The spatial distribution of atrazine concentrations is highly heterogeneous, with OWs permanently exhibiting concentrations above the regulatory threshold on the one hand and OWs where concentrations are mostly below the limit of quantification (LOQ) on the other. A deethylatrazine-to-atrazine ratio (DAR) was used to distinguish between diffuse- and point-source contamination, with a global mean value of 0.84 indicating mainly diffuse contamination. Principal Component Analysis (PCA) of the monitoring dataset demonstrated relationships between the metabolite desisopropylatrazine, which was found to be associated exclusively with the parent compound simazine but not with atrazine, and between deethylatrazine, atrazine, nitrate, and the specific electrical conductivity. These parameters indicate agricultural impacts on groundwater quality. The findings presented in this study point to the difficulty of estimating mean contamination concentrations for entire aquifers and of evaluating groundwater quality based on average parameters. However, analytical data from monthly sampled single observation wells provide adequate information to characterize local contamination and evolutionary trends in pollutant concentration. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. An adaptive, comprehensive monitoring strategy for chemicals of emerging concern (CECs) in California's Aquatic Ecosystems.

    PubMed

    Maruya, Keith A; Schlenk, Daniel; Anderson, Paul D; Denslow, Nancy D; Drewes, Jörg E; Olivieri, Adam W; Scott, Geoffrey I; Snyder, Shane A

    2014-01-01

    A scientific advisory panel was convened by the State of California to recommend monitoring for chemicals of emerging concern (CECs) in aquatic systems that receive discharge of municipal wastewater treatment plant (WWTP) effluent and stormwater runoff. The panel developed a risk-based screening framework that considered environmental sources and fate of CECs observed in receiving waters across the State. Using existing occurrence and risk threshold data in water, sediment, and biological tissue, the panel applied the framework to identify a priority list of CECs for initial monitoring in three representative receiving water scenarios. The initial screening list of 16 CECs identified by the panel included consumer and commercial chemicals, flame retardants, pesticides, pharmaceuticals and personal care products, and natural hormones. The panel designed an iterative, phased strategy with interpretive guidelines that direct and update management actions commensurate with potential risk identified using the risk-based framework and monitoring data. Because of the ever-changing nature of chemical use, technology, and management practices, the panel offered recommendations to improve CEC monitoring, including development of bioanalytical screening methods whose responses integrate exposure to complex mixtures and that can be linked to higher-order effects; development or refinement of models that predict the input, fate, and effects of future chemicals; and filling of key data gaps on CEC occurrence and toxicity. Finally, the panel stressed the need for adaptive management, allowing for future review of, and if warranted, modifications to the strategy to incorporate the latest science available to the water resources community. © 2013 SETAC.
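The screening step described above — comparing occurrence data against risk-based thresholds to produce a priority list — can be illustrated with a minimal sketch. The quotient logic mirrors the risk-based framework, but the compound names, concentrations, and trigger value below are hypothetical:

```python
def screen_cecs(occurrence, thresholds, ratio_trigger=1.0):
    """Rank CECs by the quotient of measured environmental concentration
    over a risk-based threshold; compounds at or above ratio_trigger are
    flagged for initial monitoring, sorted from highest to lowest quotient."""
    flagged = {}
    for cec, conc in occurrence.items():
        if cec in thresholds and thresholds[cec] > 0:
            quotient = conc / thresholds[cec]
            if quotient >= ratio_trigger:
                flagged[cec] = round(quotient, 2)
    return dict(sorted(flagged.items(), key=lambda kv: -kv[1]))
```

In an adaptive strategy, the same comparison would be rerun as new occurrence and toxicity data fill the identified gaps.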

  13. Can triggered electromyography monitoring throughout retraction predict postoperative symptomatic neuropraxia after XLIF? Results from a prospective multicenter trial.

    PubMed

    Uribe, Juan S; Isaacs, Robert E; Youssef, Jim A; Khajavi, Kaveh; Balzer, Jeffrey R; Kanter, Adam S; Küelling, Fabrice A; Peterson, Mark D

    2015-04-01

    This multicenter study aims to evaluate the utility of triggered electromyography (t-EMG) recorded throughout psoas retraction during lateral transpsoas interbody fusion to predict postoperative changes in motor function. Three hundred and twenty-three patients undergoing L4-5 minimally invasive lateral interbody fusion at 21 sites were enrolled. Intraoperative data collection included initial t-EMG thresholds in response to posterior retractor blade stimulation and subsequent t-EMG threshold values collected every 5 min throughout retraction. Additional data collection included dimensions and duration of retraction as well as pre- and postoperative lower extremity neurologic exams. Prior to expanding the retractor, the lowest t-EMG threshold was identified posterior to the retractor in 94 % of cases. Postoperatively, 13 (4.5 %) patients had a new motor weakness consistent with symptomatic neuropraxia (SN) of lumbar plexus nerves on the approach side. There were no significant differences between patients with or without a corresponding postoperative SN with respect to initial posterior blade reading (p = 0.600) or retraction dimensions (p > 0.05). Retraction time was significantly longer in patients with SN than in those without (p = 0.031). Stepwise logistic regression showed a significant positive relationship between the presence of new postoperative SN and total retraction time (p < 0.001), as well as change in t-EMG thresholds over time (p < 0.001), although false-positive rates (increased threshold in patients with no new SN) remained high regardless of the absolute increase in threshold used to define an alarm criterion. Prolonged retraction time and coincident increases in t-EMG thresholds are predictors of declining nerve integrity. Increasing t-EMG thresholds, while predictive of injury, were also observed in a large number of patients without iatrogenic injury, with greater predictive value in cases of extended duration. 
    In addition to a careful approach with minimal muscle retraction and consistent directional retraction relative to the lumbar plexus, the incidence of postoperative motor neuropraxia may be reduced by limiting retraction time and by using t-EMG throughout retraction, with the understanding that the specificity of this monitoring technique is low during initial retraction and increases with longer retraction duration.
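The reported trade-off — larger t-EMG threshold increases predict injury, yet false-positive rates remain high — can be made concrete by scoring a candidate alarm criterion against case data. This is an illustrative sketch; the tuple format and the example cases are hypothetical, not the trial's dataset:

```python
def alarm_performance(cases, delta_alarm_ma):
    """For a candidate alarm criterion (an absolute increase in t-EMG
    threshold, in mA), compute sensitivity and false-positive rate.
    Each case is (baseline_mA, max_mA_during_retraction, had_neuropraxia)."""
    tp = fp = pos = neg = 0
    for base, peak, sn in cases:
        alarmed = (peak - base) >= delta_alarm_ma
        if sn:
            pos += 1
            tp += alarmed
        else:
            neg += 1
            fp += alarmed
    sensitivity = tp / pos if pos else 0.0
    false_positive_rate = fp / neg if neg else 0.0
    return sensitivity, false_positive_rate
```

Sweeping delta_alarm_ma over a range would reproduce the kind of analysis that showed no criterion achieves a low false-positive rate.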

  14. Subsidence monitoring network: an Italian example aimed at a sustainable hydrocarbon E&P activity

    NASA Astrophysics Data System (ADS)

    Dacome, M. C.; Miandro, R.; Vettorel, M.; Roncari, G.

    2015-11-01

    Under Italian law, before any new hydrocarbon exploitation activity can start, an Environmental Impact Assessment study must be presented, including a monitoring plan designed to foresee, measure, and analyze in real time any possible impact of the project on coastal areas and on nearby inland areas. Subsidence, which can be partly related to hydrocarbon production both onshore and offshore, generates great concern in areas where it may affect the local environment. Following the recommendations of the international scientific community on the matter, ENI has implemented a cutting-edge monitoring network since the early 1990s, aimed at preventing, mitigating, and controlling geodynamic phenomena in its activity areas, with particular attention to conserving and protecting the environmental and territorial equilibrium, in line with what is known as "sustainable development". The monitoring surveys currently implemented by ENI can be divided into: - Shallow monitoring: spirit levelling surveys, continuous GPS surveys at permanent stations, SAR surveys, extensometer (assestimeter) subsurface compaction monitoring, groundwater level monitoring, LiDAR surveys, and bathymetric surveys. - Deep monitoring: reservoir compaction at depth through radioactive markers, and reservoir static (bottom-hole) pressure monitoring. The information gathered through the monitoring network allows: 1. verifying whether the produced subsidence is evolving in accordance with the simulated forecast; 2. providing data to revise and adjust the compaction prediction models; 3. putting remedial actions in place if the impact exceeds the threshold magnitude originally agreed among the involved parties. 
    ENI's plan to measure and monitor the subsidence process, during field production and after field closure, is therefore intended to support sustainable field development and an acceptable exploitation programme in which the actual risk connected with field production is evaluated in advance, shared, and agreed among all the involved parties: the oil company, stakeholders, and the local community with interests in the affected area.
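Points 1 and 3 of the workflow above — verifying measured subsidence against the simulated forecast and triggering remedial action when the agreed threshold is exceeded — can be sketched as follows (benchmark names, units, and values are illustrative):

```python
def check_subsidence(measured_mm, forecast_mm, agreed_threshold_mm):
    """Compare measured cumulative subsidence at each benchmark with the
    simulated forecast and with the threshold agreed among the parties.
    Returns, per benchmark, whether the model is exceeded and whether
    remedial action is required."""
    report = {}
    for bm, observed in measured_mm.items():
        predicted = forecast_mm.get(bm, 0.0)
        report[bm] = {
            "deviates_from_model": observed > predicted,
            "remedial_action": observed > agreed_threshold_mm,
        }
    return report
```

A benchmark deviating from the model but still below the agreed threshold would feed point 2, the revision of the compaction prediction models.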

  15. Sensor Alerting Capability

    NASA Astrophysics Data System (ADS)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

    There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to users through standardized services, language, and data models is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners, and managers to ensure the reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. Alerts can be configured at the service level and at the sensor data level; for example, the system can alert on irregular data-delivery events or on a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability, decision makers can monitor their assets and data streams, correct failures, or be alerted to an approaching phenomenon.
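A data-level alert of the kind described — a statistic of sensor data crossing a preset threshold — might be sketched as below, here using a rolling mean and a notification callback standing in for email/RSS delivery; the class is purely illustrative, not IAI's implementation:

```python
from collections import deque

class ThresholdAlerter:
    """Raise an alert when the rolling mean of a sensor stream first
    crosses a preset threshold; re-arm once the statistic drops back.
    `notify` is any callable (e.g. an email or RSS publisher stub)."""

    def __init__(self, window, threshold, notify):
        self.buf = deque(maxlen=window)   # last `window` observations
        self.threshold = threshold
        self.notify = notify
        self.active = False               # suppresses repeated alerts

    def push(self, value):
        self.buf.append(value)
        rolling = sum(self.buf) / len(self.buf)
        crossed = rolling > self.threshold
        if crossed and not self.active:
            self.notify(f"rolling mean {rolling:.2f} exceeds {self.threshold}")
        self.active = crossed
        return rolling
```

The same pattern extends to service-level alerts, e.g. treating gaps between data deliveries as the monitored quantity.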

  16. Monitoring of Tumor Growth with [(18)F]-FET PET in a Mouse Model of Glioblastoma: SUV Measurements and Volumetric Approaches.

    PubMed

    Holzgreve, Adrien; Brendel, Matthias; Gu, Song; Carlsen, Janette; Mille, Erik; Böning, Guido; Mastrella, Giorgia; Unterrainer, Marcus; Gildehaus, Franz J; Rominger, Axel; Bartenstein, Peter; Kälin, Roland E; Glass, Rainer; Albert, Nathalie L

    2016-01-01

    Noninvasive tumor growth monitoring is of particular interest for the evaluation of experimental glioma therapies. This study investigates the potential of positron emission tomography (PET) using O-(2-(18)F-fluoroethyl)-L-tyrosine ([(18)F]-FET) to determine tumor growth in a murine glioblastoma (GBM) model, including estimation of the biological tumor volume (BTV), which has hitherto not been investigated in the preclinical context. Fifteen GBM-bearing mice (GL261) and six control mice (shams) were investigated over 5 weeks by PET, followed by autoradiographic and histological assessments. [(18)F]-FET PET was quantified by calculating maximum and mean standardized uptake values within a universal volume-of-interest (VOI) corrected for healthy background (SUVmax/BG, SUVmean/BG). A partial volume effect correction (PVEC) was applied in comparison to ex vivo autoradiography. BTVs obtained by predefined thresholds for VOI definition (SUV/BG: ≥1.4; ≥1.6; ≥1.8; ≥2.0) were compared to the histologically assessed tumor volume (n = 8). Finally, individual "optimal" thresholds for BTV definition best reflecting the histology were determined. In GBM mice, SUVmax/BG and SUVmean/BG clearly increased with time, although with high inter-animal variability. No relevant [(18)F]-FET uptake was observed in shams. PVEC recovered the signal loss of the SUVmean/BG assessment relative to autoradiography. BTVs estimated with the predefined thresholds differed strongly from the histology volume. Strikingly, the individual "optimal" thresholds for BTV assessment correlated highly with SUVmax/BG (ρ = 0.97, p < 0.001), allowing SUVmax/BG-based calculation of individual thresholds. The method was verified in a subsequent validation study (n = 15, ρ = 0.88, p < 0.01), yielding far better agreement of BTV estimates with histology than the predefined thresholds. [(18)F]-FET PET with standard SUV measurements is feasible for glioma imaging in the GBM mouse model. 
    PVEC is beneficial for improving the accuracy of [(18)F]-FET PET SUV quantification. Although SUVmax/BG and SUVmean/BG increase during the disease course, these parameters do not correlate with the respective tumor size. For the first time, we propose a histology-verified method allowing appropriate individual BTV estimation for volumetric in vivo monitoring of tumor growth with [(18)F]-FET PET, and we show that standardized thresholds from routine clinical practice appear inappropriate for BTV estimation in the GBM mouse model.
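The proposed BTV estimation — deriving an individual threshold from SUVmax/BG and counting the voxels above it — can be sketched as follows. The linear form thr = a·SUVmax/BG + b follows the idea of the study, but the coefficients used here are placeholders, not the histology-fitted values:

```python
import numpy as np

def estimate_btv(suv_bg, voxel_volume_ml, a=0.52, b=0.3):
    """Estimate biological tumor volume (BTV) from a background-normalised
    SUV image: derive an individual threshold from SUVmax/BG via a linear
    relation (coefficients a, b are illustrative placeholders), then count
    voxels at or above that threshold."""
    thr = a * suv_bg.max() + b
    mask = suv_bg >= thr
    return mask.sum() * voxel_volume_ml, thr
```

In practice suv_bg would be the 3-D PET volume inside the universal VOI, already divided by the healthy-background mean.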

  17. Results of hydrologic monitoring on landslide-prone coastal bluffs near Mukilteo, Washington

    USGS Publications Warehouse

    Smith, Joel B.; Baum, Rex L.; Mirus, Benjamin B.; Michel, Abigail R.; Stark, Ben

    2017-08-31

    A hydrologic monitoring network was installed to investigate landslide hazards affecting the railway corridor along the eastern shore of Puget Sound between Seattle and Everett, near Mukilteo, Washington. During the summer of 2015, the U.S. Geological Survey instrumented four sites to measure rainfall and air temperature every 15 minutes. Two of the four sites are on contrasting coastal bluffs, one landslide scarred and one vegetated. At these two sites, volumetric water content, pore pressure, soil suction, soil temperature, and barometric pressure were also measured every 15 minutes. The instrumentation was designed to supplement landslide-rainfall thresholds developed by the U.S. Geological Survey, with a long-term goal of advancing understanding of the relationship between landslide potential and hydrologic forcing along the coastal bluffs. Additionally, the system was designed to function as a prototype monitoring system to evaluate criteria for site selection, instrument selection, and instrument placement. The purpose of this report is to describe the monitoring system, present the data collected since installation, and describe significant events represented within the dataset, which is published as a separate data release. The findings provide insight for building and configuring larger, modular monitoring networks.

  18. Retrospective application of the "guidelines for monitoring mining subsurface activities for hydrocarbons exploitation, re-injection and storage activities (ILG)": insights from the analysis of 2012-2013 Emilia seismic sequence at the Cavone oilfield pilot site (Italy)

    NASA Astrophysics Data System (ADS)

    Buttinelli, M.; Chiarabba, C.; Anselmi, M.; Pezzo, G.; Improta, L.; Antoncecchi, I.

    2017-12-01

    In recent years, the debate on the interactions between wastewater disposal and induced seismicity has increasingly drawn the attention of the scientific community, since injection through high-rate wells has been directly associated with the occurrence of even large seismic events. In February 2014, the Italian Ministry of Economic Development (MiSE), within the Commission on Hydrocarbon and Mining Resources (CIRM), issued the "guidelines for monitoring mining subsurface activities for hydrocarbons exploitation, re-injection and storage activities (ILG)". The ILG represent the first action in Italy aimed at maintaining safety standards, primarily in areas where the exploitation of underground resources can induce seismicity, ground deformation, and pore-pressure changes in the reservoirs. The guidelines also launched a "traffic light" operational system, defining for the first time threshold values and activation levels for these monitored parameters. To test the implications of the ILG (in particular of the traffic-light system), we selected the Cavone oilfield (northern Italy) as a test case, since this area was affected by the 2012-2013 Emilia seismic sequence. Moreover, the potential role of Cavone oilfield activities in triggering the 2012 earthquakes was long debated in both scientific and public contexts, highlighting the importance of seismic monitoring in hydrocarbon exploitation, re-injection, and storage areas. In this work we apply the ILG retrospectively to the Cavone oilfield and surrounding areas, for the seismicity parameter only (pore pressure and ground deformation were not taken into account because they fall outside the traffic-light system). Since each seismicity catalogue available for the 2012 sequence represents a different monitoring-system configuration, we carefully analyzed how the choice of catalogue affects whether the thresholds imposed by the ILG are exceeded. 
    In particular, we focus on the use of 1D and 3D velocity models, developed ad hoc for the investigated area or not. Results show that the different approaches strongly affect the locations of seismic events, generating correct or spurious warnings when the traffic-light system is applied. Our analysis also highlights the importance of accounting for local geological complexity in the seismicity location strategy.
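A traffic-light rule of the kind introduced by the ILG can be sketched as a simple mapping from a located event magnitude to an activation level; the threshold values below are placeholders, not those fixed by the guidelines:

```python
def traffic_light(magnitude, ml_attention=1.5, ml_alert=2.2, ml_stop=3.0):
    """Illustrative 'traffic light' decision for induced seismicity:
    map the magnitude of an event located inside the monitoring domain
    to an activation level. Thresholds are illustrative placeholders."""
    if magnitude >= ml_stop:
        return "red: suspend operations"
    if magnitude >= ml_alert:
        return "orange: reduce operations"
    if magnitude >= ml_attention:
        return "yellow: heightened monitoring"
    return "green: ordinary activity"
```

The study's point is visible here: if mislocation moves an event into or out of the monitoring domain, or biases its magnitude, the returned level changes, producing correct or spurious warnings.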

  19. On the importance of a correct divulgation of monitoring results for an efficient management of landslide emergencies

    NASA Astrophysics Data System (ADS)

    Giordan, Daniele; Manconi, Andrea; Allasia, Paolo

    2015-04-01

    In the last decades, technological evolution has strongly increased the number of instruments that can be used to monitor landslide phenomena. Robotized total stations, GB-InSAR, and GPS are only a few examples of the systems that can be used to monitor topographic changes due to landslide activity. These monitoring systems are often merged into a complex network aimed at controlling the most important physical parameters influencing the evolution of landslide activity. The technological level reached by these systems allows us to use them for early warning purposes. Critical thresholds are identified and, when exceeded, trigger emergency actions to protect the population living in areas potentially affected by a landslide failure. Such early warning systems can be very useful for decision makers, who have to manage emergency conditions due to a landslide acceleration that is a likely precursor of collapse. At this stage, every instrument has its own management system, and the dataset obtained is often not compatible with the results of the other systems. The level of complexity increases with the number of monitoring systems and can generate a paradox: the data sources are so numerous and difficult to interpret that a full understanding of the phenomenon may be hampered. Nowadays, correctly disseminating the recent evolution of a landslide that is potentially dangerous for the population is very important. The Geohazard Monitoring Group of CNR IRPI developed a communication strategy to disseminate monitoring network results based on both a dedicated web page (for near-real-time publication of the latest updates) and periodic bulletins (for a deeper analysis of the available dataset). 
    To manage the near-real-time application, we developed a system called ADVICE (ADVanced dIsplaCement monitoring system for Early warning) that collects all the available data of a monitoring network and creates user-friendly representations of the recent landslide evolution. The system is also able to manage early warnings based on predefined thresholds (usually related to the analysis of displacement and/or velocity), sending emails and SMS. Starting from the same dataset, the representations differ depending on whether the information is delivered to the population or to the technicians involved in the landslide emergency. Our communication strategy considers three different levels of representation of the acquired dataset, so that results can be communicated to the different stakeholders potentially involved in the emergency. This communication scheme has been refined over time, thanks to the experience acquired during the management of monitoring networks for different case studies, such as the Mt. de La Saxe landslide (Aosta Valley, NW Italy), the Ripoli landslide (Emilia Romagna region, central Italy), and the Montaguto landslide (Campania region, south Italy). Here we present how correct and user-friendly communication of monitoring results has been an important added value in supporting decision makers and the population during emergency scenarios.

  20. Use of Modal Acoustic Emission to Monitor Damage Progression in Carbon Fiber/Epoxy Tows and Implications for Composite Structures

    NASA Technical Reports Server (NTRS)

    Waller, Jess M.; Saulsberry, Regor L.; Nichols, Charles T.; Wentzel, Daniel J.

    2010-01-01

    This slide presentation reviews the use of modal acoustic emission to monitor damage progression in carbon fiber/epoxy tows. There is a risk of catastrophic burst-before-leak (BBL) stress rupture (SR) failure of carbon/epoxy (C/Ep) composite overwrapped pressure vessels (COPVs). A lack of quantitative nondestructive evaluation (NDE) is causing problems in current and future spacecraft designs. It is therefore important to develop and demonstrate critical NDE that can be implemented during stages of the design process, since the observed rupture can occur with little or no advance warning. A program was therefore required to develop quantitative acoustic emission (AE) procedures specific to C/Ep overwraps, but which also have utility for monitoring damage accumulation in composite structures in general, and to lay the groundwork for establishing critical thresholds for accumulated damage in composite structures, such as COPVs, so that precautionary or preemptive engineering steps can be implemented to minimize or obviate the risk of catastrophic failure. A computed Felicity Ratio (FR) coupled with fast Fourier transform (FFT) frequency analysis shows promise as an analytical pass/fail criterion. The FR, waveform, and FFT analyses are reviewed.
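The Felicity Ratio at the core of the proposed pass/fail criterion is simple to compute once the AE onset load on reloading is known (a minimal sketch):

```python
def felicity_ratio(previous_max_load, ae_onset_load):
    """Felicity Ratio (FR): the load at which significant acoustic
    emission resumes on reloading, divided by the previous maximum load.
    FR < 1 indicates a Felicity effect (accumulated damage); FR near 1
    indicates Kaiser behaviour (little new damage)."""
    if previous_max_load <= 0:
        raise ValueError("previous maximum load must be positive")
    return ae_onset_load / previous_max_load
```

A pass/fail rule would then compare the computed FR against a critical threshold established for the structure class, which is exactly the groundwork the program above set out to lay.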

  1. Properties of perimetric threshold estimates from Full Threshold, SITA Standard, and SITA Fast strategies.

    PubMed

    Artes, Paul H; Iwase, Aiko; Ohno, Yuko; Kitazawa, Yoshiaki; Chauhan, Balwantray C

    2002-08-01

    To investigate the distributions of threshold estimates with the Swedish Interactive Threshold Algorithms (SITA) Standard, SITA Fast, and the Full Threshold algorithm (Humphrey Field Analyzer; Zeiss-Humphrey Instruments, Dublin, CA) and to compare the pointwise test-retest variability of these strategies. One eye of 49 patients (mean age, 61.6 years; range, 22-81) with glaucoma (Mean Deviation mean, -7.13 dB; range, +1.8 to -23.9 dB) was examined four times with each of the three strategies. The mean and median SITA Standard and SITA Fast threshold estimates were compared with a "best available" estimate of sensitivity (mean results of three Full Threshold tests). Pointwise 90% retest limits (5th and 95th percentiles of retest thresholds) were derived to assess the reproducibility of individual threshold estimates. The differences between the threshold estimates of the SITA and Full Threshold strategies were largest (approximately 3 dB) for midrange sensitivities (approximately 15 dB). The threshold distributions of SITA were considerably different from those of the Full Threshold strategy. The differences remained of similar magnitude when the analysis was repeated on a subset of 20 locations that are examined early during the course of a Full Threshold examination. With sensitivities above 25 dB, both SITA strategies exhibited lower test-retest variability than the Full Threshold strategy. Below 25 dB, the retest intervals of SITA Standard were slightly smaller than those of the Full Threshold strategy, whereas those of SITA Fast were larger. SITA Standard may be superior to the Full Threshold strategy for monitoring patients with visual field loss. The greater test-retest variability of SITA Fast in areas of low sensitivity is likely to offset the benefit of even shorter test durations with this strategy. The sensitivity differences between the SITA and Full Threshold strategies may relate to factors other than reduced fatigue. 
They are, however, small in comparison to the test-retest variability.
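The pointwise 90% retest limits — 5th and 95th percentiles of retest thresholds, grouped by the baseline estimate — can be sketched as follows, using a simple nearest-rank percentile rather than the study's exact interpolation method:

```python
from collections import defaultdict

def retest_limits(baseline_db, retest_db, low=5, high=95):
    """Pointwise retest limits: group retest thresholds by the baseline
    estimate at the same location, then take the low/high percentiles of
    each group (nearest-rank on the sorted values)."""
    groups = defaultdict(list)
    for b, r in zip(baseline_db, retest_db):
        groups[b].append(r)

    def pct(vals, p):
        s = sorted(vals)
        k = max(0, min(len(s) - 1, round(p / 100 * (len(s) - 1))))
        return s[k]

    return {b: (pct(v, low), pct(v, high)) for b, v in groups.items()}
```

Narrower intervals at a given baseline sensitivity correspond to the lower test-retest variability reported for SITA Standard above 25 dB.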

  2. Diagnostic pure-tone audiometry in schools: mobile testing without a sound-treated environment.

    PubMed

    Swanepoel, De Wet; Maclennan-Smith, Felicity; Hall, James W

    2013-01-01

    To validate diagnostic pure-tone audiometry in schools without a sound-treated environment using an audiometer that incorporates insert earphones covered by circumaural earcups and real-time environmental noise monitoring. A within-subject repeated measures design was employed to compare air (250 to 8000 Hz) and bone (250 to 4000 Hz) conduction pure-tone thresholds measured in natural school environments with thresholds measured in a sound-treated booth. 149 children (54% female) with an average age of 6.9 yr (SD = 0.6; range = 5-8). Average difference between the booth and natural environment thresholds was 0.0 dB (SD = 3.6) for air conduction and 0.1 dB (SD = 3.1) for bone conduction. Average absolute difference between the booth and natural environment was 2.1 dB (SD = 2.9) for air conduction and 1.6 dB (SD = 2.7) for bone conduction. Almost all air- (96%) and bone-conduction (97%) threshold comparisons between the natural and booth test environments were within 0 to 5 dB. No statistically significant differences between thresholds recorded in the natural and booth environments for air- and bone-conduction audiometry were found (p > 0.01). Diagnostic air- and bone-conduction audiometry in schools, without a sound-treated room, is possible with sufficient earphone attenuation and real-time monitoring of environmental noise. Audiological diagnosis on-site for school screening may address concerns of false-positive referrals and poor follow-up compliance and allow for direct referral to audiological and/or medical intervention. American Academy of Audiology.

  3. Groundwater-Quality Data in the Colorado River Study Unit, 2007: Results from the California GAMA Program

    USGS Publications Warehouse

    Goldrath, Dara A.; Wright, Michael T.; Belitz, Kenneth

    2010-01-01

    Groundwater quality in the 188-square-mile Colorado River study unit (COLOR) was investigated October through December 2007 as part of the Priority Basin Project of the California State Water Resources Control Board (SWRCB) Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and the U.S. Geological Survey (USGS) is the technical project lead. The Colorado River study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within COLOR, and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 28 wells in three study areas in San Bernardino, Riverside, and Imperial Counties. Twenty wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study unit; these wells are termed 'grid wells'. Eight additional wells were selected to evaluate specific water-quality issues in the study area; these wells are termed 'understanding wells'. The groundwater samples were analyzed for organic constituents (volatile organic compounds [VOC], gasoline oxygenates and degradates, pesticides and pesticide degradates, pharmaceutical compounds), constituents of special interest (perchlorate, 1,4-dioxane, and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), and radioactive constituents. Concentrations of naturally occurring isotopes (tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water), and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, approximately 220 constituents and water-quality indicators were investigated. 
Quality-control samples (blanks, replicates, and matrix spikes) were collected at approximately 30 percent of the wells, and the results were used to evaluate the quality of the data obtained from the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a significant source of bias in the data. Differences between replicate samples were within acceptable ranges and matrix-spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw groundwater typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared to regulatory and nonregulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH) and to thresholds established for aesthetic concerns by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and do not indicate compliance or noncompliance with those thresholds. The concentrations of most constituents detected in groundwater samples were below drinking-water thresholds. Volatile organic compounds (VOC) were detected in approximately 35 percent of grid well samples; all concentrations were below health-based thresholds. Pesticides and pesticide degradates were detected in about 20 percent of all samples; detections were below health-based thresholds. No concentrations of constituents of special interest or nutrients were detected above health-based thresholds. 
Most of the major and minor ion constituents sampled do not have health-based thresholds; the exception is chloride. Concentrations of chloride, sulfate, and total dis

  4. Drop casting of stiffness gradients for chip integration into stretchable substrates

    NASA Astrophysics Data System (ADS)

    Naserifar, Naser; LeDuc, Philip R.; Fedder, Gary K.

    2017-04-01

    Stretchable electronics have demonstrated promise within unobtrusive wearable systems in areas such as health monitoring and medical therapy. One significant question is whether it is more advantageous to develop holistic stretchable electronics or to integrate mature CMOS into stretchable electronic substrates where the CMOS process is separated from the mechanical processing steps. A major limitation with integrating CMOS is the dissimilar interface between the soft stretchable and hard CMOS materials. To address this, we developed an approach to pattern an elastomeric polymer layer with spatially varying mechanical properties around CMOS electronics to create a controllable material stiffness gradient. Our experimental approach reveals that modifying the interfaces can increase the strain failure threshold up to 30% and subsequently decreases delamination. The stiffness gradient in the polymer layer provides a safe region for electronic chips to function under a substrate tensile strain up to 150%. These results will have impacts in diverse applications including skin sensors and wearable health monitoring systems.

  5. Pervasive monitoring--an intelligent sensor pod approach for standardised measurement infrastructures.

    PubMed

    Resch, Bernd; Mittlboeck, Manfred; Lippautz, Michael

    2010-01-01

    Geo-sensor networks have traditionally been built up as closed monolithic systems, thus limiting trans-domain usage of real-time measurements. This paper presents the technical infrastructure of a standardised embedded sensing device, which has been developed in the course of the Live Geography approach. The sensor pod implements data provision standards of the Sensor Web Enablement initiative, including an event-based alerting mechanism and location-aware Complex Event Processing functionality for the detection of threshold transgressions and for quality assurance. The goal of this research is that the resulting highly flexible sensing architecture will bring sensor network applications one step closer to realising the vision of a "digital skin for planet earth". The developed infrastructure can potentially have far-reaching impacts on sensor-based monitoring systems through the deployment of ubiquitous and fine-grained sensor networks. This in turn allows the straightforward use of live sensor data in existing spatial decision support systems to enable better-informed decision-making.

  6. Whole body vibration training improves vibration perception threshold in healthy young adults: A randomized clinical trial pilot study.

    PubMed

    Hernandez-Mocholi, M A; Dominguez-Muñoz, F J; Corzo, H; Silva, S Cs; Adsuar, J C; Gusi, N

    2016-03-01

    Loss of foot sensitivity is a relevant parameter to assess and prevent in several diseases. It is crucial to determine the vibro-tactile sensitivity threshold response to acute conditions in order to explore innovative monitoring tools and interventions to prevent and treat this challenge. The aims were: 1) to analyze the acute effects of a single whole body vibration session (4 min, 18 Hz, 4 mm) on the vibro-tactile perception threshold in healthy young adults; 2) to analyze the effects, 48 hours later, of 3 whole body vibration sessions on the vibro-tactile perception threshold in healthy young adults. A randomized controlled clinical trial over 3 sessions of whole body vibration (WBV) intervention or 3 sessions of placebo intervention was conducted. Twenty-eight healthy young adults were included: 11 in the experimental group and 12 in the placebo group. The experimental group performed 3 sessions of WBV while the placebo group performed 3 sessions of placebo intervention. The vibro-tactile threshold increased right after a single WBV session in comparison with placebo. Nevertheless, 48 hours after 3 whole body vibration sessions, the threshold decreased to values lower than the initial ones. The acute response of the vibro-tactile threshold to one whole body vibration session was an increase, but the 48-hour short-term response was a decrease in healthy young adults.

  7. Technical Note: An operational landslide early warning system at regional scale based on space-time-variable rainfall thresholds

    NASA Astrophysics Data System (ADS)

    Segoni, S.; Battistini, A.; Rossi, G.; Rosi, A.; Lagomarsino, D.; Catani, F.; Moretti, S.; Casagli, N.

    2015-04-01

    We set up an early warning system for rainfall-induced landslides in Tuscany (23 000 km2). The system is based on a set of state-of-the-art intensity-duration rainfall thresholds (Segoni et al., 2014b) and makes use of LAMI (Limited Area Model Italy) rainfall forecasts and real-time rainfall data provided by an automated network of more than 300 rain gauges. The system was implemented in a WebGIS to ease operational use in civil protection procedures: it is simple and intuitive to consult, and it provides different outputs. When switching among different views, the system is able to focus both on monitoring of real-time data and on forecasting at different lead times up to 48 h. Moreover, the system can switch between a basic data view, where a synoptic scenario of the hazard can be shown over the whole region, and a more in-depth view, where the rainfall path of each rain gauge can be displayed and constantly compared with rainfall thresholds. To better account for the variability of the geomorphological and meteorological settings encountered in Tuscany, the region is subdivided into 25 alert zones, each provided with a specific threshold. The warning system reflects this subdivision: using a network of more than 300 rain gauges, it allows for the monitoring of each alert zone separately so that warnings can be issued independently. An important feature of the warning system is that the visualization of the thresholds in the WebGIS interface may vary in time depending on when the starting time of the rainfall event is set. The starting time of the rainfall event is treated as a variable by the early warning system: whenever new rainfall data are available, a recursive algorithm identifies the starting time for which the rainfall path is closest to or exceeds the threshold. This is considered the most hazardous condition, and it is displayed by the WebGIS interface. 
The early warning system is used to forecast and monitor the landslide hazard in the whole region, providing specific alert levels for 25 distinct alert zones. In addition, the system can be used to gather, analyze, display, explore, interpret and store rainfall data, thus representing a potential support to both decision makers and scientists.
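The search for the most hazardous starting time can be sketched as follows (written iteratively for clarity; the threshold parameters `a` and `b` are illustrative placeholders, whereas the operational system uses a calibrated threshold per alert zone):

```python
def most_hazardous_start(hourly_rain_mm, a=20.0, b=-0.6):
    """Scan every candidate starting hour, compare the mean intensity since
    that hour against an intensity-duration threshold I = a * D**b, and
    return the start whose intensity/threshold ratio is highest."""
    best_start, best_ratio = None, 0.0
    n = len(hourly_rain_mm)
    for start in range(n):
        duration = n - start                                  # hours up to "now"
        intensity = sum(hourly_rain_mm[start:]) / duration    # mm/h
        ratio = intensity / (a * duration ** b)
        if ratio > best_ratio:
            best_start, best_ratio = start, ratio
    return best_start, best_ratio  # ratio >= 1 means the threshold is exceeded
```

For example, with two dry hours followed by two intense hours, the search correctly selects the onset of the intense burst rather than the full record as the most hazardous starting time.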

  8. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    PubMed

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limits; that is, they show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need to have a more predictable process prompted the need to control variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data, compared to the historical average, and, in addition, the variation was reduced. The model is subject to limitations: For example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.

  9. Noninvasive screening for intracranial hypertension in children with acute, severe traumatic brain injury.

    PubMed

    O'Brien, Nicole F; Maa, Tensing; Reuter-Rice, Karin

    2015-10-01

    The aim of this study was to determine the relationship between transcranial Doppler (TCD) derived pulsatility index (PI), end diastolic flow velocity (Vd), and intracranial pressure (ICP). The subjects in this study were 36 children admitted after severe traumatic brain injury (TBI) (postresuscitation Glasgow Coma Scale ≤ 8) undergoing invasive ICP monitoring. Subjects underwent a total of 148 TCD studies. TCD measurements of systolic flow velocity (Vs), Vd, and mean flow velocity (Vm) were performed on the middle cerebral artery (MCA) ipsilateral to the ICP monitor. The PI was calculated by the TCD software ([Vs − Vd]/Vm). ICP registrations were made in parallel with TCD measurements. Using a PI threshold of 1.3, postinjury Day 0-1 PI had 100% sensitivity and 82% specificity at predicting an ICP ≥ 20 mm Hg (n = 8). During this time frame, a moderately strong relationship was observed between the MCA PI and actual ICP (r = 0.611, p = 0.01). When using a threshold of < 25 cm/sec, postinjury Day 0-1 Vd had a 56% sensitivity to predict an ICP ≥ 20 mm Hg. Beyond the initial 24 hours from injury, the sensitivity of an MCA PI of 1.3 to detect an ICP ≥ 20 mm Hg was 47%, and a weak relationship between actual ICP values and MCA PI (r = 0.376, p = 0.01) and MCA Vd (r = -0.284, p = 0.01) was found. Postinjury Day 0-1 MCA PI > 1.3 has good sensitivity and specificity at predicting an ICP ≥ 20 mm Hg. In those children with TBI who initially do not meet clear criteria for invasive ICP monitoring but who are at risk for development of intracranial hypertension, TCD may be used as a noninvasive tool to screen for the development of elevated ICP in the first 24 hours following injury.
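The two screening quantities from the abstract are simple to compute. Below is a minimal sketch using the study's reported cutoffs (PI > 1.3, Vd < 25 cm/sec); note that combining the two screens with a logical OR is our illustrative choice, not the study's protocol:

```python
def pulsatility_index(vs, vd, vm):
    """Gosling pulsatility index: PI = (Vs - Vd) / Vm."""
    return (vs - vd) / vm

def flag_possible_icp(pi, vd, pi_limit=1.3, vd_limit=25.0):
    """Flag possible ICP >= 20 mm Hg on postinjury day 0-1 when the PI
    exceeds 1.3 or end-diastolic velocity Vd falls below 25 cm/sec."""
    return pi > pi_limit or vd < vd_limit
```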

  10. The Development of New Composite Metrics for the Comprehensive Analytic and Visual Assessment of Hypoglycemia Using the Hypo-Triad.

    PubMed

    Thomas, Andreas; Shin, John; Jiang, Boyi; McMahon, Chantal; Kolassa, Ralf; Vigersky, Robert A

    2018-01-01

    Quantifying hypoglycemia has traditionally been limited to using the frequency of hypoglycemic events during a given time interval using data from blood glucose (BG) testing. However, continuous glucose monitoring (CGM) captures three parameters-a Hypo-Triad-unavailable with BG monitoring that can be used to better characterize hypoglycemia: area under the curve (AUC), time (duration of hypoglycemia), and frequency of daily episodes below a specified threshold. We developed two new analytic metrics to enhance the traditional Hypo-Triad of CGM-derived data to more effectively capture the intensity of hypoglycemia (IntHypo) and overall hypoglycemic environment called the "hypoglycemia risk volume" (HypoRV). We reanalyzed the CGM data from the ASPIRE In-Home study, a randomized, controlled trial of a sensor-integrated pump system with a low glucose threshold suspend feature (SIP+TS), using these new metrics and compared them to standard metrics of hypoglycemia. IntHypo and HypoRV provide additional insights into the benefit of a SIP+TS system on glycemic exposure when compared to the standard reporting methods. In addition, the visual display of these parameters provides a unique and intuitive way to understand the impact of a diabetes intervention on a cohort of subjects as well as on individual patients. The IntHypo and HypoRV are new and enhanced ways of analyzing CGM-derived data in diabetes intervention studies which could lead to new insights in diabetes management. They require validation using existing, ongoing, or planned studies to determine whether they are superior to existing metrics.
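The three Hypo-Triad components can be computed directly from a CGM trace. The sketch below uses a hypothetical 70 mg/dL threshold and 5-minute sampling; the paper's exact definitions (e.g., any minimum episode duration) may differ:

```python
def hypo_triad(glucose_mg_dl, threshold=70.0, dt_min=5.0):
    """Compute the CGM-derived 'Hypo-Triad' below a threshold:
    AUC below threshold (mg/dL x min), time below threshold (min),
    and the number of contiguous hypoglycemic episodes."""
    auc = time_below = 0.0
    episodes, in_episode = 0, False
    for g in glucose_mg_dl:
        if g < threshold:
            auc += (threshold - g) * dt_min
            time_below += dt_min
            if not in_episode:
                episodes, in_episode = episodes + 1, True
        else:
            in_episode = False
    return auc, time_below, episodes
```

These three numbers are exactly the inputs that composite metrics like the intensity of hypoglycemia or a "risk volume" would combine.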

  11. Stakeholder Participation in Freshwater Monitoring and Evaluation Programs: Applying Thresholds of Potential Concern within Environmental Flows

    NASA Astrophysics Data System (ADS)

    Conallin, John; McLoughlin, Craig A.; Campbell, Josh; Knight, Roger; Bright, Troy; Fisher, Ian

    2018-03-01

    The complex nature of freshwater systems provides challenges for incorporating evidence-based techniques into management. This paper investigates the potential of participatory evidence-based techniques to involve local stakeholders and make decisions based on different "knowledge" sources within adaptive management programs. It focuses on the application of thresholds of potential concern (TPC) within strategic adaptive management (SAM) for facilitating inclusive decision-making. The study is based on the case of the Edward-Wakool (E-W) "Fish and Flows" SAM project in the Murray-Darling River Basin, Australia. We demonstrate the application of TPCs for improving collaborative decision-making within the E-W, associated with environmental watering requirements, and other natural resource management programs such as fish stocking. The development of TPCs in the E-W fish and flows SAM project helped improve stakeholder involvement and understanding of the system, and also the effectiveness of the implemented management interventions. TPCs ultimately helped inform environmental flow management activities. The TPC process complemented monitoring that was already occurring in the system and provided a mechanism for linking formal and informal knowledge to form explicit and measurable endpoints from objectives. The TPC process faced challenges due to the perceived reduction in scientific rigor within initial TPC development and use. However, TPCs must remain tangible to managers and other stakeholders, in order to aid in the implementation of adaptive management. Once accepted by stakeholders, over time TPCs should be reviewed and refined in order to increase their scientific rigor, as new information is generated.

  12. Identifying Threshold Concepts in the Careers of Educational Developers

    ERIC Educational Resources Information Center

    Timmermans, Julie A.

    2014-01-01

    The purpose of this multiple case study was to identify threshold concepts in the careers of educational developers. Twenty-one common threshold concepts emerged, with one threshold concept common among all participants: Facilitating a change process. The remaining 20 threshold concepts were captured in the following three categories: (1) Ways of…

  13. Randomness fault detection system

    NASA Technical Reports Server (NTRS)

    Russell, B. Don (Inventor); Aucoin, B. Michael (Inventor); Benner, Carl L. (Inventor)

    1996-01-01

    A method and apparatus are provided for detecting a fault on a power line carrying a line parameter such as a load current. The apparatus monitors and analyzes the load current to obtain an energy value. The energy value is compared to a threshold value stored in a buffer. If the energy value is greater than the threshold value, a first counter is incremented. If the energy value is greater than a high-value threshold or less than a low-value threshold, a second counter is incremented. If the difference between two subsequent energy values is greater than a constant, a third counter is incremented. A fault signal is issued if the first counter is greater than a counter limit value and either the second counter is greater than a second limit value or the third counter is greater than a third limit value.
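The counter scheme described in the abstract can be paraphrased in a few lines of code. This is a sketch, not the patented implementation: the parameter names are ours, and we assume the between-sample difference is taken as an absolute value, which the abstract does not specify.

```python
def detect_fault(energies, threshold, high, low, jump,
                 limit1, limit2, limit3):
    """Return True when the three-counter fault condition is met."""
    c1 = c2 = c3 = 0
    prev = None
    for e in energies:
        if e > threshold:                  # first counter: above threshold
            c1 += 1
        if e > high or e < low:            # second counter: outside the band
            c2 += 1
        if prev is not None and abs(e - prev) > jump:
            c3 += 1                        # third counter: abrupt change
        prev = e
    return c1 > limit1 and (c2 > limit2 or c3 > limit3)
```

The AND/OR structure means sustained high energy alone is not enough; it must be accompanied by either band violations or the erratic sample-to-sample jumps ("randomness") that give the system its name.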

  14. Intraoperative identification of the facial nerve by needle electromyography stimulation with a burr

    PubMed Central

    KHAMGUSHKEEVA, N.N.; ANIKIN, I.A.; KORNEYENKOV, A.A.

    2016-01-01

    The purpose of this research is to improve the safety of surgery for patients with a pathology of the middle and inner ear by preventing damage to the facial nerve by conducting intraoperative monitoring of the facial nerve by needle electromyography with continuous stimulation with a burr. Patients and Methods: The clinical part of the prospective study was carried out on 48 patients that were diagnosed with suppurative otitis media. After the surgery with intraoperative monitoring, the facial nerve with an intact bone wall was stimulated electrically in the potentially dangerous places of damage. Minimum (threshold) stimulation (mA) of the facial nerve with a threshold event of 100 μV was used to register EMG events. The anatomical part of the study was carried out on 30 unformalinized cadaver temporal bones from adult bodies. The statistical analysis of obtained data was carried out with parametric methods (Student’s t-test), non-parametric correlation (Spearman’s method) and regression analysis. Results: It was found that 1 mA of threshold amperage corresponded to 0.8 mm thickness of the bone wall of the facial canal. Values of transosseous threshold stimulation in potentially dangerous sections of the injury to the facial nerve were obtained. Conclusion: These data lower the risk of paresis (paralysis) of the facial muscles during otologic surgery. PMID:27142821

  15. The Borg scale as an important tool of self-monitoring and self-regulation of exercise prescription in heart failure patients during hydrotherapy. A randomized blinded controlled trial.

    PubMed

    Carvalho, Vitor Oliveira; Bocchi, Edimar Alcides; Guimarães, Guilherme Veiga

    2009-10-01

    The Borg Scale may be a useful tool for heart failure patients to self-monitor and self-regulate exercise on land or in water (hydrotherapy) by maintaining the heart rate (HR) between the anaerobic threshold and respiratory compensation point. Patients performed a cardiopulmonary exercise test to determine their anaerobic threshold/respiratory compensation points. The percentage of the mean HR during the exercise session in relation to the anaerobic threshold HR (%EHR-AT), in relation to the respiratory compensation point (%EHR-RCP), in relation to the peak HR by the exercise test (%EHR-Peak) and in relation to the maximum predicted HR (%EHR-Predicted) was calculated. Next, patients were randomized into the land or water exercise group. One blinded investigator instructed the patients in each group to exercise at a level between "relatively easy and slightly tiring". The mean HR throughout the 30-min exercise session was recorded. The %EHR-AT and %EHR-Predicted did not differ between the land and water exercise groups, but they differed in the %EHR-RCP (95 ± 7 to 86 ± 7, P<0.001) and in the %EHR-Peak (85 ± 8 to 78 ± 9, P=0.007). Exercise guided by the Borg scale maintains the patient's HR between the anaerobic threshold and respiratory compensation point (i.e., in the exercise training zone).

  16. Wireless battery management control and monitoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zumstein, James M.; Chang, John T.; Farmer, Joseph C.

    A battery management system uses a sensor inside the battery; the sensor enables monitoring and detection of various events in the battery and wireless transmission of a signal through the battery casing to a control and data acquisition module. The detection of threshold events in the battery enables remedial action to be taken to avoid catastrophic events.

  17. Aviation safety research and transportation/hazard avoidance and elimination

    NASA Technical Reports Server (NTRS)

    Sonnenschein, C. M.; Dimarzio, C.; Clippinger, D.; Toomey, D.

    1976-01-01

    Data collected by the Scanning Laser Doppler Velocimeter System (SLDVS) were analyzed to determine the feasibility of the SLDVS for monitoring aircraft wake vortices in an airport environment. Data were collected on atmospheric vortices and analyzed. Over 1600 landings were monitored at Kennedy International Airport, and by the end of the test period 95 percent of the runs with large aircraft were producing usable results in real time. Vortex transport was determined in real time and in post-analysis using algorithms that computed centroids of the highest-amplitude components in the thresholded spectrum. Using other parameters of the spectrum, vortex flow fields were studied along with the time histories of peak velocities and amplitudes. The post-analysis of the data was accomplished with a CDC-6700 computer using several programs developed for LDV data analysis.

  18. NetMOD version 1.0 user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion John

    2014-01-01

    NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic networks. Specifically, NetMOD simulates the detection capabilities of seismic monitoring networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probability of detection can be computed given a detection threshold. This manual describes how to configure and operate NetMOD to perform seismic detection simulations. In addition, NetMOD is distributed with a simulation dataset for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) International Monitoring System (IMS) seismic network for the purpose of demonstrating NetMOD's capabilities and providing user training. The tutorial sections of this manual use this dataset when describing how to perform the steps involved in running a simulation.
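A common way to turn an SNR into a detection probability, which conveys the idea without claiming to be NetMOD's actual internals, is to treat the measured SNR (in dB) as Gaussian around its predicted value and evaluate the probability that it exceeds the detection threshold; the spread `sigma_db` below is an assumed parameter.

```python
import math

def detection_probability(snr_db, threshold_db, sigma_db=3.0):
    """P(measured SNR > threshold) under a Gaussian model of the
    predicted SNR, via the standard normal CDF."""
    z = (snr_db - threshold_db) / sigma_db
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

An SNR exactly at the threshold yields a 50% detection probability, and the probability rises smoothly toward 1 as the predicted SNR climbs above it.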

  19. A prototype system for forecasting landslides in the Seattle, Washington, area

    USGS Publications Warehouse

    Chleborad, Alan F.; Baum, Rex L.; Godt, Jonathan W.; Powers, Philip S.

    2008-01-01

    Empirical rainfall thresholds and related information form the basis of a prototype system for forecasting landslides in the Seattle area. The forecasts are tied to four alert levels, and a decision tree guides the use of thresholds to determine the appropriate level. From analysis of historical landslide data, we developed a formula for a cumulative rainfall threshold (CT), P3 = 88.9 − 0.67P15, defined by rainfall amounts in millimeters during consecutive 3 d (72 h) periods, P3, and the 15 d (360 h) period before P3, P15. The variable CT captures more than 90% of historical events of three or more landslides in 1 d and 3 d periods recorded from 1978 to 2003. However, the low probability of landslide occurrence on a day when the CT is exceeded at one or more rain gauges (8.4%) justifies a low level of alert for possible landslide occurrence, but it does trigger more vigilant monitoring of rainfall and soil wetness. Exceedance of a rainfall intensity-duration threshold I = 82.73D^−1.13, for intensity, I (mm/hr), and duration, D (hr), corresponds to a higher probability of landslide occurrence (30%) and forms the basis for issuing warnings of impending, widespread occurrence of landslides. Information about the area of exceedance and soil wetness can be used to increase the certainty of landslide forecasts (probabilities as great as 71%). Automated analysis of real-time rainfall and subsurface water data and digital quantitative precipitation forecasts are needed to fully implement a warning system based on the two thresholds.
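Both thresholds from the abstract are one-line checks; a minimal sketch using the published coefficients:

```python
def ct_exceeded(p3_mm, p15_mm):
    """Cumulative threshold: possible landslide activity when 3-day
    rainfall P3 exceeds 88.9 - 0.67 * P15 (both in mm)."""
    return p3_mm > 88.9 - 0.67 * p15_mm

def id_exceeded(intensity_mm_hr, duration_hr):
    """Intensity-duration warning threshold: I = 82.73 * D**-1.13."""
    return intensity_mm_hr > 82.73 * duration_hr ** -1.13
```

Note how the CT formula lowers the 3-day rainfall needed to trigger an alert as the antecedent 15-day rainfall P15 grows, encoding the effect of pre-wetted soils.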

  20. [A Smart Low-Power-Consumption ECG Monitor Based on MSP430F5529 and CC2540].

    PubMed

    Gong, Yuan; Cao, Jin; Luo, Zehui; Zhou, Guohui

    2015-07-01

    A design of an ECG monitor is presented in this paper. It is based on the latest MCU and BLE 4.0 technologies and can interact with multi-platform smart devices with extra-low power consumption. In addition, a clinical expansion module provides functions including displaying the real-time ECG and heart-rate curves, reading abnormal ECG signals stored in the monitor, and setting the alarm threshold. These functions make the device suitable for follow-up use.

  1. 25m-resolution Global Mosaic and Forest/Non-Forest map using PALSAR-2 data set

    NASA Astrophysics Data System (ADS)

    Itoh, T.; Shimada, M.; Motooka, T.; Hayashi, M.; Tadono, T.; DAN, R.; Isoguchi, O.; Yamanokuchi, T.

    2017-12-01

    Continuous observation of forests provides information necessary for monitoring deforestation, climate change and environmental change, e.g., under Reducing Emissions from Deforestation and Forest Degradation in Developing Countries (REDD+). The Japan Aerospace Exploration Agency (JAXA) conducts continuous research on forest monitoring using satellite-based L-band Synthetic Aperture Radars (SARs). Using the FBD (Fine Beam Dual polarization) data of the Phased Array type L-band Synthetic Aperture Radar (PALSAR) onboard the Advanced Land Observing Satellite (ALOS), JAXA created the global 25 m-resolution mosaic images and the Forest/Non-Forest (FNF) map dataset for forest monitoring. SAR can monitor forest areas under all weather conditions, and L-band is highly sensitive to forests and their changes, making it suitable for forest observation. JAXA also created global 25 m mosaics and FNF maps using ALOS-2/PALSAR-2, launched in 2014 as a successor to ALOS. The FNF datasets from PALSAR and PALSAR-2 cover 2007-2010 and 2015-2016, respectively, making it possible to monitor forest changes over approximately 10 years. The classification method is a combination of object-based classification and thresholding of the HH and HV polarized images; the result was compared with the Forest Resource Assessment (FRA, developed by FAO), and the inconsistency is less than 10%. By comparison with optical imagery from Google Earth, the rate of agreement was 80% or more. We will continue to create PALSAR-2 global mosaics and FNF datasets to contribute to global forest monitoring.

  2. Under which conditions, additional monitoring data are worth gathering for improving decision making? Application of the VOI theory in the Bayesian Event Tree eruption forecasting framework

    NASA Astrophysics Data System (ADS)

    Loschetter, Annick; Rohmer, Jérémy

    2016-04-01

    Standard and new-generation monitoring observations provide, in near real time, important information about the evolution of a volcanic system. These observations are used to update the model and contribute to better hazard assessment and to support decision making concerning potential evacuation. The BET_EF framework (based on a Bayesian Event Tree) developed by INGV enables integrating information from monitoring with a view to decision making. Using this framework, the objectives of the present work are: i. to propose a method for assessing the added value of information from monitoring, within the Value Of Information (VOI) theory; ii. to perform sensitivity analysis on the different parameters that influence the VOI from monitoring. VOI consists in assessing the possible increase in expected value provided by gathering information, for instance through monitoring. Basically, the VOI is the difference between the value with information and the value without additional information in a cost-benefit approach. This theory is well suited to situations that can be represented as a decision tree, such as the BET_EF tool. Reference values and ranges of variation (for sensitivity analysis) were defined for input parameters, based on data from the MESIMEX exercise (performed at Vesuvio volcano in 2006). Complementary methods for sensitivity analysis were implemented: local, global using Sobol' indices, and regional using Contribution to Sample Mean and Variance plots. The results (specific to the case considered) obtained with the different techniques are in good agreement and enable answering the following questions: i. Which characteristics of monitoring are important for early warning (reliability)? ii. How do experts' opinions influence the hazard assessment and thus the decision? 
Concerning the characteristics of monitoring, the most influential parameters are the means rather than the variances for the case considered. Among the expert-set parameters, the weight attributed to monitoring measurements (ω), the mean of the thresholds, the economic context, and the setting of the decision threshold are highly influential. The value of applying the VOI theory (more precisely, the value of imperfect information) in the BET framework was demonstrated: it can support experts in configuring the monitoring system and help managers decide on the installation of additional monitoring systems. Acknowledgments: This work was carried out in the framework of the project MEDSUV, funded under the call FP7 ENV.2012.6.4-2: Long-term monitoring experiment in geologically active regions of Europe prone to natural hazards: the Supersite concept. Grant agreement n°308665.

  3. Digitise This! A Quick and Easy Remote Sensing Method to Monitor the Daily Extent of Dredge Plumes

    PubMed Central

    Evans, Richard D.; Murray, Kathy L.; Field, Stuart N.; Moore, James A. Y.; Shedrawi, George; Huntley, Barton G.; Fearns, Peter; Broomhall, Mark; McKinna, Lachlan I. W.; Marrable, Daniel

    2012-01-01

    Technological advancements in remote sensing and GIS have improved natural resource managers' abilities to monitor large-scale disturbances. At a time when many processes are heading towards automation, this study has returned to simple techniques to bridge a gap found in the advancement of technology. The near-daily monitoring of dredge plume extent is common practice using Moderate Resolution Imaging Spectroradiometer (MODIS) imagery and associated algorithms to predict the total suspended solids (TSS) concentration in surface waters originating from floods and dredge plumes. Unfortunately, these methods cannot distinguish between a dredge plume and benthic features in shallow, clear water. This case study at Barrow Island, Western Australia, uses hand digitising to demonstrate the ability of human interpretation to make this distinction with a level of confidence, and compares the method to contemporary TSS methods. Hand digitising was quick, cheap and required very little training of staff. Results of ANOSIM R statistics show that remote-sensing-derived TSS provided similar spatial results if thresholded to at least 3 mg L⁻¹. However, remote-sensing-derived TSS consistently provided false-positive readings of shallow benthic features as plume at thresholds up to a TSS of 6 mg L⁻¹, and began providing false negatives (excluding actual plume) at a threshold as low as 4 mg L⁻¹. Semi-automated processes that estimate plume concentration and distinguish between plumes and shallow benthic features without the arbitrary nature of human interpretation would be the preferred plume monitoring method. However, at this stage, the hand-digitising method is very useful, is more accurate at determining plume boundaries over shallow benthic features, and is accessible to all levels of management with basic training. PMID:23240055

  4. Rapid resolution of brain ischemic hypoxia after cerebral revascularization in moyamoya disease.

    PubMed

    Arikan, Fuat; Vilalta, Jordi; Torne, Ramon; Noguer, Montserrat; Lorenzo-Bosquet, Carles; Sahuquillo, Juan

    2015-03-01

    In moyamoya disease (MMD), cerebral revascularization is recommended in patients with recurrent or progressive ischemic events and associated reduced cerebral perfusion reserve. Low-flow bypass with or without indirect revascularization is generally the standard surgical treatment. Intraoperative monitoring of cerebral partial pressure of oxygen (PtiO2) with polarographic Clark-type probes in cerebral artery bypass surgery for MMD-induced chronic cerebral ischemia has not yet been described. To describe basal brain tissue oxygenation in MMD patients before revascularization as well as the immediate changes produced by the surgical procedure using intraoperative PtiO2 monitoring. Between October 2011 and January 2013, all patients with a diagnosis of MMD were intraoperatively monitored. Cerebral oxygenation status was analyzed based on the PtiO2/PaO2 ratio. Reference thresholds of PtiO2/PaO2 had been previously defined as below 0.1 for the lower reference threshold (hypoxia) and above 0.35 for the upper reference threshold (hyperoxia). Before STA-MCA bypass, all patients presented a situation of severe tissue hypoxia confirmed by a PtiO2/PaO2 ratio <0.1. After bypass, all patients showed a rapid and sustained increase in PtiO2, which reached normal values (PtiO2/PaO2 ratio between 0.1 and 0.35). One patient showed an initial PtiO2 improvement followed by a decrease due to bypass occlusion. After repeat anastomosis, the patient's PtiO2 increased again and stabilized. Direct anastomosis quickly improves cerebral oxygenation, immediately reducing the risk of ischemic stroke in both pediatric and adult patients. Intraoperative PtiO2 monitoring is a very reliable tool to verify the effectiveness of this revascularization procedure.

  5. Detection of critical cerebral desaturation thresholds by three regional oximeters during hypoxia: a pilot study in healthy volunteers.

    PubMed

    Tomlin, Kerry L; Neitenbach, Anna-Maria; Borg, Ulf

    2017-01-13

    Regional oximetry is increasingly used to monitor post-extraction oxygen status of the brain during surgical procedures where hemodynamic fluctuations are expected. Particularly in cardiac surgery, clinicians employ an interventional algorithm to restore baseline regional oxygen saturation (rSO 2 ) when a patient reaches a critical desaturation threshold. Evidence suggests that monitoring cardiac surgery patients and intervening to maintain rSO 2 can improve postoperative outcomes; however, evidence generated with one manufacturer's device may not be applicable to others. We hypothesized that regional oximeters from different manufacturers respond uniquely to changes in oxygen saturation in healthy volunteers. Three devices were tested: INVOS™ 5100C (Medtronic), EQUANOX™ 7600 (Nonin), and FORE-SIGHT™ (CASMED) monitors. We divided ten healthy subjects into two cohorts wearing a single sensor each from INVOS and EQUANOX (n = 6), or INVOS and FORE-SIGHT (n = 4). We induced and reversed hypoxia by adjusting the fraction of inspired oxygen. We calculated the magnitude of absolute rSO 2 change and rate of rSO 2 change during desaturation and resaturation, and determined if and when each device reached a critical interventional rSO 2 threshold during hypoxia. All devices responded to changes in oxygen directionally as expected. The median absolute rSO 2 change and the rate of rSO 2 change was significantly greater during desaturation and resaturation for INVOS compared with EQUANOX (P = 0.04). A similar but nonsignificant trend was observed for INVOS compared with FORE-SIGHT; our study was underpowered to definitively conclude there was no difference. A 10% relative decrease in rSO 2 during desaturation was detected by all three devices across the ten subjects. INVOS met a 20% relative decrease threshold in all subjects of both cohorts, compared to 1 with EQUANOX and 2 with FORE-SIGHT. 
Neither EQUANOX nor FORE-SIGHT reached a 50% absolute rSO2 threshold, compared with 4 and 3 subjects in each cohort with INVOS, respectively. Significant differences exist between the devices in how they respond to changes in oxygen saturation in healthy volunteers. We suggest caution when applying evidence generated with one manufacturer's device to all devices.
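The three interventional criteria reported above (a 10% or 20% relative decrease from baseline and an absolute rSO2 value of 50%) can be sketched as a simple check on a desaturation trace. The function and key names below are illustrative, not from the study:

```python
def threshold_flags(baseline, rso2_trace):
    """Which of the three interventional criteria a desaturation trace
    reaches: a 10% or 20% relative decrease from baseline, and an
    absolute rSO2 value of 50%."""
    lowest = min(rso2_trace)
    rel_drop = (baseline - lowest) / baseline  # fractional decrease
    return {
        "rel_10pct": rel_drop >= 0.10,
        "rel_20pct": rel_drop >= 0.20,
        "abs_50": lowest <= 50.0,
    }

# Baseline 70%, trace dips to 54%: a ~23% relative drop, so both
# relative criteria are met but not the absolute-50 criterion.
flags = threshold_flags(70.0, [70, 66, 60, 54, 58, 65])
```

A per-device comparison like the one in the study would simply run this check on each monitor's trace for the same hypoxic episode.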

  6. Establishment of a standard operating procedure for predicting the time of calving in cattle

    PubMed Central

    Sauter-Louis, Carola; Braunert, Anna; Lange, Dorothee; Weber, Frank; Zerbe, Holm

    2011-01-01

Precise calving monitoring is essential for minimizing the effects of dystocia in cows and calves. We conducted two studies in healthy cows that compared seven clinical signs (relaxation of the broad pelvic ligaments, vaginal secretion, udder hyperplasia, udder edema, teat filling, tail relaxation, and vulva edema) alone and in combination in order to predict the time of parturition. The relaxation of the broad pelvic ligaments combined with teat filling gave the best values for predicting either calving or no calving within 12 h. For the proposed parturition score (PS), a threshold of 4 PS points was identified below which calving within the next 12 h could be ruled out with a probability of 99.3% in cows (95.5% in heifers). Above this threshold, intermittent calving monitoring every 3 h and a progesterone rapid blood test (PRBT) would be recommended. By combining the PS and PRBT (if PS ≥ 4), the prediction of calving within the next 12 h improved from 14.9% to 53.1%, and the probability of ruling out calving was 96.8%. The PRBT was compared to the results of an enzyme immunoassay (sensitivity, 90.2%; specificity, 74.9%). The standard operating procedure developed in this study that combines the PS and PRBT will enable veterinarians to rule out or predict calving within a 12 h period in cows with high accuracy under field conditions. PMID:21586878
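The standard operating procedure described above reduces to a small decision rule. A minimal sketch, in which the 4-point cutoff comes from the study but the function name, return strings, and the way the PRBT result is passed are illustrative:

```python
def calving_recommendation(ps_points, prbt_positive=None):
    """Decision rule sketched from the SOP: below 4 PS points, calving
    within 12 h is ruled out (99.3% in cows, 95.5% in heifers); at or
    above 4 points, intermittent monitoring every 3 h plus a
    progesterone rapid blood test (PRBT) is recommended."""
    if ps_points < 4:
        return "calving unlikely within 12 h"
    if prbt_positive is None:
        return "monitor every 3 h and run PRBT"
    return ("calving likely within 12 h" if prbt_positive
            else "calving unlikely within 12 h")
```

For example, a cow scoring 5 PS points with a positive PRBT would be flagged as likely to calve within 12 h, while any score below 4 rules calving out without further testing.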

  7. Monitoring of the posterior cricoarytenoid muscle represents another option for neural monitoring during thyroid surgery: Normative vagal and recurrent laryngeal nerve posterior cricoarytenoid muscle electromyographic data.

    PubMed

    Liddy, Whitney; Barber, Samuel R; Lin, Brian M; Kamani, Dipti; Kyriazidis, Natalia; Lawson, Bradley; Randolph, Gregory W

    2018-01-01

    Intraoperative neural monitoring (IONM) of laryngeal nerves using electromyography (EMG) is routinely performed using endotracheal tube surface electrodes adjacent to the vocalis muscles. Other laryngeal muscles such as the posterior cricoarytenoid muscle (PCA) are indirectly monitored. The PCA may be directly and reliably monitored through an electrode placed in the postcricoid region. Herein, we describe the method and normative data for IONM using PCA EMG. Retrospective review. Data were reviewed retrospectively for thyroid and parathyroid surgery patients with IONM of laryngeal nerves from January to August 2016. Recordings of vocalis and PCA EMG amplitudes and latencies with stimulation of laryngeal nerves were obtained using endotracheal (ET) tube-based and postcricoid surface electrodes. Data comprised EMG responses in vocalis and PCA recording channels with stimulation of the vagus, recurrent laryngeal nerve (RLN), and external branch of the superior laryngeal nerve from 20 subjects (11 left, 9 right), as well as PCA EMG threshold data with RLN stimulation from 17 subjects. Mean EMG amplitude was 725.69 ± 108.58 microvolts (µV) for the ipsilateral vocalis and 329.44 ± 34.12 µV for the PCA with vagal stimulation, and 1,059.75 ± 140.40 µV for the ipsilateral vocalis and 563.88 ± 116.08 µV for the PCA with RLN stimulation. There were no statistically significant differences in mean latency. For threshold cutoffs of the PCA with RLN stimulation, mean minimum and maximum threshold intensities were 0.37 milliamperes (mA) and 0.84 mA, respectively. This study shows robust and reliable PCA EMG waveforms with direct nerve stimulation. Further studies will evaluate feasibility and application of the PCA electrode as a complementary quantitative tool in IONM. 4. Laryngoscope, 128:283-289, 2018. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  8. A zone-specific fish-based biotic index as a management tool for the Zeeschelde estuary (Belgium).

    PubMed

    Breine, Jan; Quataert, Paul; Stevens, Maarten; Ollevier, Frans; Volckaert, Filip A M; Van den Bergh, Ericia; Maes, Joachim

    2010-07-01

    Fish-based indices monitor changes in surface waters and are a valuable aid in communication by summarising complex information about the environment (Harrison and Whitfield, 2004). A zone-specific fish-based multimetric estuarine index of biotic integrity (Z-EBI) was developed based on a 13 year time series of fish surveys from the Zeeschelde estuary (Belgium). Sites were pre-classified using indicators of anthropogenic impact. Metrics showing a monotone response with pressure classes were selected for further analysis. Thresholds for the good ecological potential (GEP) were defined from references. A modified trisection was applied for the other thresholds. The Z-EBI is defined by the average of the metric scores calculated over a one year period and translated into an ecological quality ratio (EQR). The indices integrate structural and functional qualities of the estuarine fish communities. The Z-EBI performances were successfully validated for habitat degradation in the various habitat zones. Copyright 2010 Elsevier Ltd. All rights reserved.
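The aggregation step described above (average the metric scores over a year, then translate the index into an ecological quality ratio) can be sketched as follows. The scoring scale, the 0-1 rescaling, and the class boundaries below are placeholders, not the Z-EBI's published values:

```python
def eqr(metric_scores, max_score=5):
    """Illustrative multimetric aggregation: the index is the mean of
    the individual metric scores over a year of surveys, rescaled to an
    ecological quality ratio (EQR) between 0 and 1, then mapped to a
    quality class by trisection of the sub-GEP range."""
    index = sum(metric_scores) / len(metric_scores)
    ratio = index / max_score
    if ratio >= 0.75:
        quality = "good ecological potential"
    elif ratio >= 0.5:
        quality = "moderate"
    elif ratio >= 0.25:
        quality = "poor"
    else:
        quality = "bad"
    return ratio, quality
```

A site whose four metrics score 5, 4, 4, and 3 would get an EQR of 0.8 under this sketch and fall in the top class.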

  9. Investigation of Current Methods to Identify Helicopter Gear Health

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Lewicki, David G.; Le, Dy D.

    2007-01-01

This paper provides an overview of current vibration methods used to identify the health of helicopter transmission gears. The gears are critical to the transmission system that provides propulsion, lift and maneuvering of the helicopter. This paper reviews techniques used to process vibration data to calculate condition indicators (CIs), guidelines used by the government aviation authorities in developing and certifying the Health and Usage Monitoring System (HUMS), condition and health indicators used in commercial HUMS, and different methods used to set thresholds to detect damage. An initial assessment of a method to set thresholds for vibration-based condition indicators, applied to flight and test rig data by evaluating differences in distributions between comparable transmissions, is also discussed. Gear condition indicator FM4 values are compared on an OH58 helicopter during 14 maneuvers and an OH58 transmission test stand during crack propagation tests. Preliminary results show the distributions between healthy helicopter and rig data are comparable and distributions between healthy and damaged gears show significant differences.
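The FM4 indicator compared above is conventionally defined as the normalized kurtosis of the "difference signal" (the vibration signal with the regular shaft and gear-mesh components removed). A minimal sketch of that standard formula:

```python
def fm4(difference_signal):
    """FM4 condition indicator: normalized kurtosis of the difference
    signal, FM4 = N * sum(d - mean)^4 / (sum(d - mean)^2)^2. A healthy,
    roughly Gaussian signature gives values near 3; localized gear
    damage adds impulsive peaks that drive FM4 upward."""
    n = len(difference_signal)
    mean = sum(difference_signal) / n
    m2 = sum((x - mean) ** 2 for x in difference_signal)
    m4 = sum((x - mean) ** 4 for x in difference_signal)
    return n * m4 / (m2 ** 2)
```

Threshold-setting methods like the one assessed in the paper then compare the distribution of FM4 values from a healthy fleet against values observed in service.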

  11. Rainfall thresholds as a landslide indicator for engineered slopes on the Irish Rail network

    NASA Astrophysics Data System (ADS)

    Martinović, Karlo; Gavin, Kenneth; Reale, Cormac; Mangan, Cathal

    2018-04-01

Rainfall thresholds express the minimum levels of rainfall that need to be reached or exceeded in order for landslides to occur in a particular area. They are a common tool for expressing the temporal component of landslide hazard analysis. Numerous rainfall thresholds have been developed for different areas worldwide; however, none of these focus on landslides occurring on the engineered slopes of transport infrastructure networks. This paper uses an empirical method to develop rainfall thresholds for landslides on the Irish Rail network earthworks. For comparison, rainfall thresholds are also developed for natural terrain in Ireland. The results show that particular thresholds involving relatively low rainfall intensities are applicable for Ireland, owing to the specific climate. Furthermore, the comparison shows that rainfall thresholds for engineered slopes are lower than those for landslides occurring on natural terrain. This has serious implications, as it indicates that there is a significant risk involved in using generic weather alerts (developed largely for natural terrain) for infrastructure management, and it showcases the need to develop railway- and road-specific rainfall thresholds for landslides.
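Empirical rainfall thresholds of this kind are usually expressed as a power law in intensity and duration, I = α·D^β with β < 0. A minimal sketch of applying such a threshold; the α and β values below are placeholders, not the thresholds derived in the paper:

```python
def exceeds_threshold(duration_h, intensity_mm_h, alpha, beta):
    """Intensity-duration (ID) threshold check of the usual power-law
    form I = alpha * D**beta (beta < 0): the storm is flagged when its
    mean intensity meets or exceeds the threshold for its duration."""
    return intensity_mm_h >= alpha * duration_h ** beta

# A lower alpha for engineered slopes means the same storm can exceed
# the earthworks threshold while staying below the natural-terrain one
# (parameter values here are illustrative only).
storm = (24.0, 1.2)  # 24 h at a mean intensity of 1.2 mm/h
on_earthworks = exceeds_threshold(*storm, alpha=4.0, beta=-0.6)
on_natural = exceeds_threshold(*storm, alpha=9.0, beta=-0.6)
```

This is exactly the asymmetry the paper warns about: a generic (natural-terrain) alert would stay silent for a storm that already exceeds the earthworks threshold.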

  12. Derivation of groundwater threshold values for analysis of impacts predicted at potential carbon sequestration sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Last, G. V.; Murray, C. J.; Bott, Y.

    2016-06-01

The U.S. Department of Energy’s (DOE’s) National Risk Assessment Partnership (NRAP) Project is developing reduced-order models to evaluate potential impacts to groundwater quality due to carbon dioxide (CO2) or brine leakage, should it occur from deep CO2 storage reservoirs. These efforts targeted two classes of aquifer: an unconfined fractured carbonate aquifer based on the Edwards Aquifer in Texas, and a confined alluvium aquifer based on the High Plains Aquifer in Kansas. Hypothetical leakage scenarios focus on wellbores as the most likely conduits from the storage reservoir to an underground source of drinking water (USDW). To facilitate evaluation of potential degradation of the USDWs, threshold values, below which there would be no predicted impacts, were determined for each of these two aquifer systems. These threshold values were calculated using an interwell approach for determining background groundwater concentrations that is an adaptation of methods described in the U.S. Environmental Protection Agency’s Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities. Results demonstrate the importance of establishing baseline groundwater quality conditions that capture the spatial and temporal variability of the USDWs prior to CO2 injection and storage.
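Interwell background thresholds of the kind described above are commonly computed as one-sided upper prediction limits on pooled background data. A generic sketch in that spirit (the formula is the standard UPL; the function name, the example concentrations, and the tabulated t-value are illustrative, not values from this study):

```python
import statistics

def upper_prediction_limit(background, t_crit):
    """One-sided upper prediction limit used as a 'no-impact'
    threshold: UPL = mean + t * s * sqrt(1 + 1/n). t_crit must be the
    one-sided Student's t value for the chosen confidence level and
    n - 1 degrees of freedom (taken from a t-table)."""
    n = len(background)
    mean = statistics.fmean(background)
    s = statistics.stdev(background)
    return mean + t_crit * s * (1 + 1 / n) ** 0.5

# Illustrative: eight background TDS measurements (mg/L); 1.895 is the
# 95% one-sided t value for 7 degrees of freedom.
tds = [310, 295, 322, 301, 289, 315, 308, 298]
threshold = upper_prediction_limit(tds, t_crit=1.895)
```

A future monitoring measurement above this limit would then count as a predicted impact; anything below it is consistent with background.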

  13. A novel method to estimate safety factor of capture by a fetal micropacemaker.

    PubMed

Vest, Adriana Nicholson; Zhou, Li; Bar-Cohen, Yaniv; Loeb, Gerald Eli

    2016-07-01

    We have developed a rechargeable fetal micropacemaker in order to treat severe fetal bradycardia with comorbid hydrops fetalis, a life-threatening condition in pre-term non-viable fetuses for which there are no effective treatment options. The small size and minimally invasive form factor of our design limit the volume available for circuitry and a power source. The device employs a fixed-rate and fixed-amplitude relaxation oscillator and a tiny, rechargeable lithium ion power cell. For both research and clinical applications, it is valuable to monitor the electrode-myocardium interface in order to determine that adequate pacemaker output is being provided. This is typically accomplished by observing the minimal stimulus strength that achieves threshold for pacing capture. The output of our simple micropacemaker cannot be programmatically altered to determine this minimal capture threshold, but a safety factor can be inferred by determining the refractory period for ventricular capture at a given stimulus strength. This is done by measuring the minimal timing between naturally occurring QRS complexes and pacing stimuli that successfully generate a premature ventricular contraction. The method was tested in a pilot study in four fetal sheep and the data demonstrate that a relative measure of threshold is obtainable. This method provides valuable real-time information about the electrode-tissue interface.

  14. A novel method to estimate safety factor of capture by a fetal micropacemaker

    PubMed Central

    Vest, Adriana Nicholson; Zhou, Li; Bar-Cohen, Yaniv; Loeb, Gerald Eli

    2016-01-01

    We have developed a rechargeable fetal micropacemaker in order to treat severe fetal bradycardia with comorbid hydrops fetalis, a life-threatening condition in pre-term non-viable fetuses for which there are no effective treatment options. The small size and minimally invasive form factor of our design limit the volume available for circuitry and a power source. The device employs a fixed-rate and fixed-amplitude relaxation oscillator and a tiny, rechargeable lithium ion power cell. For both research and clinical applications, it is valuable to monitor the electrode-myocardium interface in order to determine that adequate pacemaker output is being provided. This is typically accomplished by observing the minimal stimulus strength that achieves threshold for pacing capture. The output of our simple micropacemaker cannot be programmatically altered to determine this minimal capture threshold, but a safety factor can be inferred by determining the refractory period for ventricular capture at a given stimulus strength. This is done by measuring the minimal timing between naturally occurring QRS complexes and pacing stimuli that successfully generate a premature ventricular contraction. The method was tested in a pilot study in 4 fetal sheep and the data demonstrate that a relative measure of threshold is obtainable. This method provides valuable real-time information about the electrode-tissue interface. PMID:27340134

  15. The excretion of theobromine in Thoroughbred racehorses after feeding compounded cubes containing cocoa husk--establishment of a threshold value in horse urine.

    PubMed

    Haywood, P E; Teale, P; Moss, M S

    1990-07-01

    Thoroughbred geldings were fed racehorse cubes containing a predetermined concentration of theobromine in the form of cocoa husk. They were offered 7 kg of cubes per day, divided between morning and evening feed, and food consumption was monitored. Urinary concentrations of theobromine were determined following the consumption of cubes containing 11.5, 6.6, 2.0 and 1.2 mg per kg of theobromine, to verify whether or not such concentrations would produce positive urine tests. Pre-dose urine samples were collected to verify the absence of theobromine before each experiment. It became apparent from the results of the first three administrations that the limit of detection of theobromine, using such procedures, would be reached at a feed level of about 1 mg per kg theobromine. Therefore the final administration, using cubes containing 1.2 mg per kg theobromine, was singled out for additional analytical work and quantitative procedures were developed to measure urinary concentrations of theobromine. It was anticipated that the results would form a basis for discussions relating to the establishment of a threshold value for theobromine in horse urine. The Stewards of the Jockey Club subsequently gave notice that they had established a threshold level for theobromine in urine of 2 micrograms/ml.

  16. Estimating soil moisture exceedance probability from antecedent rainfall

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Kalansky, J.; Stock, J. D.; Collins, B. D.

    2016-12-01

    The first storms of the rainy season in coastal California, USA, add moisture to soils but rarely trigger landslides. Previous workers proposed that antecedent rainfall, the cumulative seasonal rain from October 1 onwards, had to exceed specific amounts in order to trigger landsliding. Recent monitoring of soil moisture upslope of historic landslides in the San Francisco Bay Area shows that storms can cause positive pressure heads once soil moisture values exceed a threshold of volumetric water content (VWC). We propose that antecedent rainfall could be used to estimate the probability that VWC exceeds this threshold. A major challenge to estimating the probability of exceedance is that rain gauge records are frequently incomplete. We developed a stochastic model to impute (infill) missing hourly precipitation data. This model uses nearest neighbor-based conditional resampling of the gauge record using data from nearby rain gauges. Using co-located VWC measurements, imputed data can be used to estimate the probability that VWC exceeds a specific threshold for a given antecedent rainfall. The stochastic imputation model can also provide an estimate of uncertainty in the exceedance probability curve. Here we demonstrate the method using soil moisture and precipitation data from several sites located throughout Northern California. Results show a significant variability between sites in the sensitivity of VWC exceedance probability to antecedent rainfall.
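The final step described above (estimate the probability that VWC exceeds its threshold for a given antecedent rainfall) can be illustrated with a simple empirical estimator on paired observations. This binned stand-in, with invented example data, only sketches the idea; the paper's approach additionally propagates uncertainty from the stochastic imputation of missing rainfall:

```python
def exceedance_probability(pairs, rain_amount, vwc_threshold, halfwidth):
    """Empirical P(VWC > threshold) for antecedent rainfall near a
    given amount, from paired (antecedent_rain_mm, vwc) observations:
    take all observations within +/- halfwidth of the target rainfall
    and count the fraction whose VWC exceeds the threshold."""
    window = [v for r, v in pairs if abs(r - rain_amount) <= halfwidth]
    if not window:
        return None  # no observations near this rainfall amount
    return sum(v > vwc_threshold for v in window) / len(window)

# Illustrative (antecedent rainfall mm, volumetric water content) pairs.
obs = [(50, 0.18), (120, 0.22), (210, 0.27), (260, 0.31),
       (300, 0.34), (340, 0.33), (400, 0.38)]
p = exceedance_probability(obs, rain_amount=320, vwc_threshold=0.32,
                           halfwidth=60)
```

Repeating the estimate across many imputed rainfall records, as the paper does, yields a distribution of exceedance curves rather than a single value.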

  17. Ground-Water Quality Data in the Middle Sacramento Valley Study Unit, 2006 - Results from the California GAMA Program

    USGS Publications Warehouse

    Schmitt, Stephen J.; Fram, Miranda S.; Milby Dawson, Barbara J.; Belitz, Kenneth

    2008-01-01

    Ground-water quality in the approximately 3,340 square mile Middle Sacramento Valley study unit (MSACV) was investigated from June through September, 2006, as part of the California Groundwater Ambient Monitoring and Assessment (GAMA) program. The GAMA Priority Basin Assessment project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The Middle Sacramento Valley study was designed to provide a spatially unbiased assessment of raw ground-water quality within MSACV, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 108 wells in Butte, Colusa, Glenn, Sutter, Tehama, Yolo, and Yuba Counties. Seventy-one wells were selected using a randomized grid-based method to provide statistical representation of the study unit (grid wells), 15 wells were selected to evaluate changes in water chemistry along ground-water flow paths (flow-path wells), and 22 were shallow monitoring wells selected to assess the effects of rice agriculture, a major land use in the study unit, on ground-water chemistry (RICE wells). The ground-water samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], gasoline oxygenates and degradates, pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (perchlorate, N-nitrosodimethylamine [NDMA], and 1,2,3-trichloropropane [1,2,3-TCP]), inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, and carbon-14, and stable isotopes of hydrogen, oxygen, nitrogen, and carbon), and dissolved noble gases also were measured to help identify the sources and ages of the sampled ground water. 
Quality-control samples (blanks, replicates, laboratory matrix spikes) were collected at approximately 10 percent of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a noticeable source of bias in the data for the ground-water samples. Differences between replicate samples were within acceptable ranges, indicating acceptably low variability. Matrix spike recoveries were within acceptable ranges for most constituents. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and are not indicative of compliance or noncompliance with regulatory thresholds. Most constituents that were detected in ground-water samples were found at concentrations below drinking-water thresholds. VOCs were detected in less than one-third and pesticides and pesticide degradates in just over one-half of the grid wells, and all detections of these constituents in samples from all wells of the MSACV study unit were below health-based thresholds. 
All detections of trace elements in samples from MSACV grid wells were below health-based thresholds, with the exceptions of arsenic and boro

  18. Experimental on-stream elimination of resonant whirl in a large centrifugal compressor

    NASA Technical Reports Server (NTRS)

    Bhat, G. I.; Eierman, R. G.

    1984-01-01

    Resonant whirl condition during operation of a multi-stage centrifugal compressor at higher than anticipated speeds and loads was reported. The condition was diagnosed by a large scale computerized Machinery Condition Monitoring System (MACMOS). This computerized system verified that the predominant subsynchronous whirl frequency locked in on the first resonant frequency of the compressor rotor and did not vary with compressor speed. Compressor stability calculations showed the rotor system had excessive hearing stiffness and inadequate effective damping. An optimum bearing design which was developed to minimize the unbalance response and to maximize the stability threshold is presented.

  19. National Earthquake Information Center Seismic Event Detections on Multiple Scales

    NASA Astrophysics Data System (ADS)

    Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.

    2017-12-01

The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local-scale), a large aftershock sequence (regional-scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P-and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early-warning, Seism. Res. Lett., 83, 531-540, doi: 10.1785/gssrl.83.3.531.
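The intuition behind a kurtosis picker like Baillard et al.'s can be shown with a toy version: an impulsive P arrival entering a sliding window makes its sample distribution strongly non-Gaussian, so windowed kurtosis jumps at the onset. The window length, the jump-based pick rule, and the synthetic trace below are simplifications, not the published algorithm:

```python
def kurtosis(window):
    """Sample kurtosis (m4 / m2^2) of a window of samples."""
    n = len(window)
    mean = sum(window) / n
    m2 = sum((x - mean) ** 2 for x in window) / n
    m4 = sum((x - mean) ** 4 for x in window) / n
    return m4 / (m2 * m2) if m2 > 0 else 0.0

def kurtosis_onset(trace, win=20):
    """Toy onset picker: slide a short window along the trace and flag
    the sample at which windowed kurtosis increases the most."""
    k = [kurtosis(trace[i:i + win]) for i in range(len(trace) - win)]
    jumps = [k[i + 1] - k[i] for i in range(len(k) - 1)]
    # In this construction the largest jump occurs when the impulsive
    # sample first enters the window, i.e. at the onset sample itself.
    return jumps.index(max(jumps)) + win

# Synthetic trace: alternating low-level "noise" with a spike at sample 60.
trace = [0.1 if i % 2 == 0 else -0.1 for i in range(100)]
trace[60] += 5.0
pick = kurtosis_onset(trace)
```

Production pickers add band-pass filtering, characteristic-function smoothing, and declustering of candidate picks, but the kurtosis jump is the core detection statistic.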

  20. Modeling the Dynamics of High-Grade Serous Ovarian Cancer Progression for Transvaginal Ultrasound-Based Screening and Early Detection

    PubMed Central

    Lee, Jung-Min; Levy, Doron

    2016-01-01

High-grade serous ovarian cancer (HGSOC) represents the majority of ovarian cancers and accounts for the largest proportion of deaths from the disease. A timely detection of low-volume HGSOC should be the goal of any screening study. However, numerous transvaginal ultrasound (TVU) detection-based population studies aimed at detecting low-volume disease have not yielded reduced mortality rates. A quantitative invalidation of TVU as an effective HGSOC screening strategy is a necessary next step. Herein, we propose a mathematical model for a quantitative explanation of the reported failure of TVU-based screening to improve HGSOC low-volume detectability and overall survival. We develop a novel in silico mathematical assessment of the efficacy of a unimodal TVU monitoring regimen as a strategy aimed at detecting low-volume HGSOC in cancer-positive cases, defined as cases for which the inception of the first malignant cell has already occurred. Our findings show that the median window of opportunity interval length for TVU monitoring and HGSOC detection is approximately 1.76 years. This does not translate into reduced mortality levels or improved detection accuracy in an in silico cohort across multiple TVU monitoring frequencies or detection sensitivities. We demonstrate that even a semiannual, unimodal TVU monitoring protocol is expected to miss detectable HGSOC. Lastly, we find that circa 50% of the simulated HGSOC growth curves never reach the baseline detectability threshold, and that on average, 5–7 infrequent, rate-limiting stochastic changes in the growth parameters are associated with reaching the HGSOC detectability and mortality thresholds, respectively. Focusing on a malignancy poorly studied in the mathematical oncology community, our model captures the dynamic, temporal evolution of HGSOC progression.
Our mathematical model is consistent with recent case reports and prospective TVU screening population studies, and provides support to the empirical recommendation against frequent HGSOC screening. PMID:27257824
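The role of rare, rate-limiting stochastic changes in whether a growth curve ever reaches the detectability threshold can be illustrated with a toy simulation. Every parameter value below (rates, jump probability, detectability size) is illustrative, not the paper's calibrated model:

```python
import random

def simulate_growth(years=15.0, dt=0.01, base_rate=0.1,
                    jump_prob=0.004, jump_factor=2.0,
                    detect_size=1e9, rng=None):
    """Toy growth curve in the spirit of the model above: exponential
    growth from a single cell whose rate is doubled at rare random
    times (the 'rate-limiting stochastic changes'). Returns
    (reached_detectability, number_of_jumps)."""
    rng = rng or random.Random()
    size, rate, jumps, t = 1.0, base_rate, 0, 0.0
    while t < years:
        if rng.random() < jump_prob:
            rate *= jump_factor
            jumps += 1
        size *= 1.0 + rate * dt  # discrete-time exponential growth step
        if size >= detect_size:
            return True, jumps
        t += dt
    return False, jumps

rng = random.Random(42)
runs = [simulate_growth(rng=rng) for _ in range(200)]
fraction_detectable = sum(reached for reached, _ in runs) / len(runs)
```

Curves that draw few or late jumps never reach the threshold within the simulated horizon, which is the qualitative behavior behind the paper's finding that a large share of simulated tumors stay below detectability.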

  1. Diesel emission reduction using internal exhaust gas recirculation

    DOEpatents

He, Xin [Denver, CO]; Durrett, Russell P [Bloomfield Hills, MI]

    2012-01-24

    A method for controlling combustion in a direct-injection diesel engine includes monitoring a crankshaft rotational position of a cylinder of the engine, monitoring an engine load, determining an intake stroke within the cylinder based upon the crankshaft rotational position, and when the engine load is less than a threshold engine load, opening an exhaust valve for the cylinder during a portion of the intake stroke.
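The claimed control logic reduces to a guard on crank position and load. A minimal sketch; treating 0-180 degrees after intake top dead center as the intake stroke and using a normalized load scale are simplifying assumptions, not details from the patent:

```python
def exhaust_valve_open(crank_angle_deg, engine_load, load_threshold):
    """Sketch of the claim: during the intake stroke, reopen the
    exhaust valve only when engine load is below the threshold, so
    burned gas is drawn back into the cylinder (internal EGR)."""
    in_intake_stroke = 0.0 <= crank_angle_deg < 180.0
    return in_intake_stroke and engine_load < load_threshold

# Low load during the intake stroke: internal EGR is active.
egr_active = exhaust_valve_open(90.0, engine_load=0.3, load_threshold=0.5)
```

At high load, or outside the intake stroke, the guard fails and the exhaust valve follows its normal timing.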

  2. Ultralow threshold graded-index separate-confinement heterostructure single quantum well (Al, Ga) As lasers

    NASA Technical Reports Server (NTRS)

    Derry, P. L.; Chen, H. Z.; Morkoc, H.; Yariv, A.; Lau, K. Y.

    1988-01-01

Broad area graded-index separate-confinement heterostructure single quantum well lasers grown by molecular-beam epitaxy (MBE) with threshold current densities as low as 93 A/sq cm (520 microns long) have been fabricated. Buried lasers formed from similarly structured MBE material with liquid phase epitaxy regrowth had threshold currents at submilliampere levels when high-reflectivity coatings were applied to the end facets. A CW threshold current of 0.55 mA was obtained for a laser with facet reflectivities of about 80 percent, a cavity length of 120 microns, and an active region stripe width of 1 micron. These devices, driven directly with logic-level signals, have switch-on delays of less than 50 ps without any current prebias. Such lasers permit full on-off switching while at the same time obviating the need for bias monitoring and feedback control.

  3. Neurophysiological mechanism of possibly confounding peripheral activation of the facial nerve during corticobulbar tract monitoring.

    PubMed

    Téllez, Maria J; Ulkatan, Sedat; Urriza, Javier; Arranz-Arranz, Beatriz; Deletis, Vedran

    2016-02-01

To improve the recognition of, and possibly prevent, confounding peripheral activation of the facial nerve caused by leaking transcranial electrical stimulation (TES) current during corticobulbar tract monitoring. We applied a single stimulus and a short train of electrical stimuli directly to the extracranial portion of the facial nerve. We compared the peripherally elicited compound muscle action potential (CMAP) of the facial nerve with the responses elicited by TES during intraoperative monitoring of the corticobulbar tract. A single stimulus applied directly to the facial nerve at subthreshold intensities did not evoke a CMAP, whereas short trains of subthreshold stimuli repeatedly evoked CMAPs. This is due to the phenomenon of subthreshold or near-threshold superexcitability of the cranial nerve. Therefore, the facial responses evoked by short-train TES, when the leaked current reaches the facial nerve at subthreshold or near-threshold intensity, could lead to false interpretation. Our results revealed a potential pitfall in the current methodology for facial corticobulbar tract monitoring that is due to the activation of the facial nerve by subthreshold trains of stimuli. This study proposes a new criterion to exclude peripheral activation during corticobulbar tract monitoring. The failure to recognize and avoid facial nerve activation due to leaking current in the peripheral portion of the facial nerve during TES decreases the reliability of corticobulbar tract monitoring by increasing the possibility of false interpretation. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  4. The Eruption Forecasting Information System: Volcanic Eruption Forecasting Using Databases

    NASA Astrophysics Data System (ADS)

    Ogburn, S. E.; Harpel, C. J.; Pesicek, J. D.; Wellik, J.

    2016-12-01

    Forecasting eruptions, including the onset size, duration, location, and impacts, is vital for hazard assessment and risk mitigation. The Eruption Forecasting Information System (EFIS) project is a new initiative of the US Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) and will advance VDAP's ability to forecast the outcome of volcanic unrest. The project supports probability estimation for eruption forecasting by creating databases useful for pattern recognition, identifying monitoring data thresholds beyond which eruptive probabilities increase, and for answering common forecasting questions. A major component of the project is a global relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest. This module allows us to query eruption chronologies, monitoring data, descriptive information, operational data, and eruptive phases alongside other global databases, such as WOVOdat and the Global Volcanism Program. The EFIS database is in the early stages of development and population; thus, this contribution also is a request for feedback from the community. Preliminary data are already benefitting several research areas. For example, VDAP provided a forecast of the likely remaining eruption duration for Sinabung volcano, Indonesia, using global data taken from similar volcanoes in the DomeHaz database module, in combination with local monitoring time-series data. In addition, EFIS seismologists used a beta-statistic test and empirically-derived thresholds to identify distal volcano-tectonic earthquake anomalies preceding Alaska volcanic eruptions during 1990-2015 to retrospectively evaluate Alaska Volcano Observatory eruption precursors. 
This has identified important considerations for selecting analog volcanoes for global data analysis, such as differences between closed and open system volcanoes.

  5. Segmentation and tracking of anticyclonic eddies during a submarine volcanic eruption using ocean colour imagery.

    PubMed

    Marcello, Javier; Eugenio, Francisco; Estrada-Allis, Sheila; Sangrà, Pablo

    2015-04-14

    The eruptive phase of a submarine volcano located 2 km off the southern coast of El Hierro Island started in October 2011. This extraordinary event caused a dramatic perturbation of the water column. In order to understand and quantify the environmental impacts, regular multidisciplinary monitoring was carried out using remote sensing sensors. In this context, we systematically processed all MODIS and MERIS imagery, together with selected high-resolution WorldView-2 imagery, to provide information on the concentration of a number of biological, physical, and chemical parameters. The eruption also provided an exceptional source of tracer that allowed the study of a variety of oceanographic structures. Specifically, the Canary Islands belong to a very active zone of long-lived eddies. Such structures are usually monitored using sea level anomaly fields; however, these products have coarse spatial resolution and are not suitable for submesoscale studies. Thanks to the volcanic tracer, detailed studies were undertaken with ocean colour imagery, using the diffuse attenuation coefficient to monitor the processes of filamentation and axisymmetrization predicted by theoretical studies and numerical modelling. In our work, a novel 2-step segmentation methodology has been developed. The approach incorporates different segmentation algorithms and region growing techniques. In particular, the first step obtains an initial eddy segmentation using thresholding or clustering methods; next, the fine detail is achieved by the iterative identification of the points to grow and the subsequent application of watershed or thresholding strategies. The methodology has demonstrated excellent performance and robustness, and it has proven to properly capture the eddy and its filaments.
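
    The two-step idea (a coarse initial segmentation, then iterative growth) can be sketched compactly. The sketch below replaces the paper's watershed and clustering machinery with a simple 4-connected region grower on a synthetic image, so every name, tolerance, and the test image are illustrative assumptions rather than the authors' algorithm:

```python
import numpy as np
from collections import deque

def grow_region(img, seed_mask, tol):
    """Iteratively add 4-connected neighbours whose value lies within
    tol of the seed region's mean (a stand-in for the refinement step)."""
    mask = seed_mask.copy()
    mean = img[mask].mean()
    q = deque(zip(*np.nonzero(mask)))
    while q:
        i, j = q.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if (0 <= ni < img.shape[0] and 0 <= nj < img.shape[1]
                    and not mask[ni, nj] and abs(img[ni, nj] - mean) <= tol):
                mask[ni, nj] = True
                q.append((ni, nj))
    return mask

# synthetic "eddy": a bright square on a dark background
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
seed = np.zeros_like(img, dtype=bool)
seed[10, 10] = True          # step 1 would supply this initial segmentation
eddy = grow_region(img, seed, tol=0.5)
```

    Starting from a single interior seed, the grower recovers the full bright region while rejecting the background, mirroring how a thresholded core is expanded to capture filament detail.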

  6. Monitoring bedload entrainment and transport in snowmelt-dominated forest streams of the Columbia Mountains, Canada

    NASA Astrophysics Data System (ADS)

    Green, Kim; Brardinoni, Francesco; Alila, Younes

    2014-05-01

    We monitor bedload transport and water discharge at six stations in two forested headwater streams of the Columbia Mountains, Canada. The monitoring network of sediment traps is designed to examine the effects of channel bed texture, and the influence of alluvial (i.e., step-pool and riffle-pool) and semi-alluvial morphologies (i.e., boulder cascades and forced step pools), on bedload entrainment and transport. Results suggest that patterns of bedload entrainment are influenced by flow resistance, while the value of the critical dimensionless shear stress for mobilization of the surface D50 varies with channel gradient, grain sheltering effects and, to a lesser extent, flow resistance. Regardless of channel morphology we observe: (i) equal-threshold entrainment for all mobile grains in channels with high grain and/or form resistance; and (ii) initial equal-threshold entrainment of calibers ≤ 22 mm, and subsequent size-selective entrainment of coarser material, in channels with low form resistance (e.g., riffle-pool). Scaled fractional analysis reveals that in reaches with high flow resistance most bedload transport occurs in partial-mobility fashion relative to the available bed material, and that only material finer than 16 mm attains full mobility during over-bank flows. Equal-mobility transport for a wider range of grain sizes is achieved in reaches with reduced flow resistance. Evaluation of bedload rating curves across sites identifies that grain effects predominate with respect to bedload flux, whereas morphological effects (i.e., form resistance) play a secondary role. Application of selected empirical formulae developed in steep alpine channels presents variable success in predicting transport rates in the study reaches.
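
    The critical dimensionless shear stress discussed above is the Shields parameter. A standard way to estimate it from reach-averaged hydraulics (not necessarily the authors' exact computation) uses the depth-slope product for bed shear stress; the flow depth, gradient, and grain size below are illustrative values:

```python
def shields_parameter(depth_m, slope, d50_m,
                      rho_s=2650.0, rho=1000.0, g=9.81):
    """Dimensionless (Shields) shear stress for a grain of size d50_m,
    with bed shear stress estimated from the depth-slope product."""
    tau = rho * g * depth_m * slope                 # bed shear stress, Pa
    return tau / ((rho_s - rho) * g * d50_m)

# e.g. a 0.3 m deep flow on a 5% gradient over 22 mm surface material
tau_star = shields_parameter(0.3, 0.05, 0.022)
```

    Comparing such a value against an entrainment threshold is the basic test behind the equal-threshold versus size-selective distinction made in the abstract.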

  7. Measures of Groundwater Drought from the Long-term Monitoring Data in Korea

    NASA Astrophysics Data System (ADS)

    Chung, E.; Park, J.; Woo, N. C.

    2017-12-01

    Recently, drought has increased in severity and frequency in Korea along with climate change. There are several criteria for issuing drought alarms, for instance those based on the number of no-rainfall days, the amount of stream discharge, and the water levels of reservoirs. However, farmers who depend on groundwater still struggle to prepare for drought, especially in spring. As no-rainfall days continue, groundwater exploitation increases, the water table declines, stream discharge decreases, and the effects of drought become serious. Thus, a drought index based on groundwater levels is needed for drought-disaster preparedness. Palmer et al. (1965, USGS) proposed a method to set thresholds for the decline of the groundwater level in five stages based on daily water-level data over the last 30 years. In this study, following Peters et al. (2003), the threshold of groundwater level was estimated using daily water-level data at five sites in Korea with significant drought experience. Water-level and precipitation data were obtained from the national groundwater monitoring wells and the automatic weather stations, respectively, for the 10 years from 2005 to 2014. From the water-level changes, the threshold was calculated such that the drought criterion (c), the ratio of the deficit below the threshold to the deficit below the average, equals 0.3. As a result, the monthly drought days were high in 2009 and 2011 in Uiryeong, and from 2005 to 2008 in Boeun. The validity of the approach and the threshold can be evaluated by comparing the calculated monthly drought days with recorded droughts in the past. Through groundwater drought research, it is expected that not only surface-water but also groundwater resource management can be implemented more efficiently to overcome drought disasters.
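
    The deficit-ratio criterion can be made concrete with a small numerical sketch: given a daily water-level series, scan candidate thresholds upward until the deficit accumulated below the candidate reaches c = 0.3 of the deficit accumulated below the mean level. The synthetic series and variable names are illustrative assumptions, not the study's records:

```python
import numpy as np

def drought_threshold(levels, c=0.3):
    """Find the water-level threshold T such that the total deficit below T
    equals a fraction c of the total deficit below the mean level."""
    mean = levels.mean()
    deficit_mean = np.clip(mean - levels, 0, None).sum()
    # scan candidate thresholds from the minimum level up to the mean
    for t in np.linspace(levels.min(), mean, 1000):
        deficit_t = np.clip(t - levels, 0, None).sum()
        if deficit_t >= c * deficit_mean:
            return t
    return mean

rng = np.random.default_rng(0)
levels = 10 + np.cumsum(rng.normal(0, 0.05, 3650))  # synthetic daily levels
t = drought_threshold(levels)
drought_days = int((levels < t).sum())              # days below threshold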

  8. Threshold Capability Development in Intensive Mode Business Units

    ERIC Educational Resources Information Center

    Crispin, Stuart; Hancock, Phil; Male, Sally Amanda; Baillie, Caroline; MacNish, Cara; Leggoe, Jeremy; Ranmuthugala, Dev; Alam, Firoz

    2016-01-01

    Purpose: The purpose of this paper is to explore: student perceptions of threshold concepts and capabilities in postgraduate business education, and the potential impacts of intensive modes of teaching on student understanding of threshold concepts and development of threshold capabilities. Design/Methodology/Approach: The student experience of…

  9. Relationship between loss of echogenicity and cavitation emissions from echogenic liposomes insonified by spectral Doppler ultrasound

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Kirthi

    Cardiovascular disease is the leading cause of death and disability in the United States and worldwide. Echogenic liposomes (ELIP) are theranostic ultrasound contrast agents (UCAs) being developed for the early detection and treatment of cardiovascular disease. Stability of the echogenicity of ELIP in physiologic conditions is crucial to their successful translation to clinical use. The stability of ELIP echogenicity was determined in vitro under physiologic conditions of total dissolved gas concentration, temperature, and hydrodynamic pressure in porcine plasma and whole blood. Ultrasound contrast agents have the potential to nucleate cavitation and promote both beneficial and deleterious bioeffects in vivo. Previous studies have elucidated the pressure amplitude threshold for rapid loss of echogenicity due to UCA fragmentation as a function of pulse duration and pulse repetition frequency (PRF), and have demonstrated that UCA fragmentation was concomitant with inertial cavitation. The purpose of this study was to evaluate the relationship between stable and inertial cavitation thresholds and loss of echogenicity of ELIP as a function of pulse duration and pulse repetition frequency. Determining this relationship would enable monitoring of cavitation based upon the on-screen echogenicity in clinical applications. ELIP were insonified by a clinical ultrasound scanner in duplex spectral Doppler mode at four pulse durations and four PRFs in a static fluid and in a flow system. Cavitation emissions from the UCAs insonified by Doppler pulses were recorded using a single-element passive cavitation detection (PCD) system and a passive cavitation imaging (PCI) system. Stable and inertial cavitation thresholds were ascertained. Loss of echogenicity from ELIP was assessed within regions of interest on B-mode images.
Stable cavitation thresholds were found to be lower than inertial cavitation thresholds. Stable and inertial cavitation thresholds of ELIP were found to have a weak dependence on pulse duration. However, the stable cavitation threshold of ELIP had no dependence on PRF. The inertial cavitation threshold of ELIP had a weak dependence on PRF. Cavitation thresholds ascertained using a PCI agreed with the thresholds ascertained using a single-element PCD. The azimuthal beamwidth of the cavitation emissions detected by the PCI system agreed with the calibrated beamwidth of the insonation Doppler pressure exceeding the cavitation threshold. The power of cavitation emissions was an exponential function of the loss of echogenicity over the investigated range of acoustic pressures. ELIP lost more than 80% echogenicity before the onset of stable or inertial cavitation. Once this level of echogenicity loss occurred, both stable and inertial cavitation emissions were detected in the physiologic flow phantom. These results indicate that 80% loss of echogenicity may be used as a qualitative metric to gauge the onset of stable and inertial cavitation from ELIP.

  10. Whole body vibration training improves vibration perception threshold in healthy young adults: A randomized clinical trial pilot study

    PubMed Central

    Hernandez-Mocholi, M.A.; Dominguez-Muñoz, F.J.; Corzo, H.; Silva, S.C.S.; Adsuar, J.C.; Gusi, N.

    2016-01-01

    Objectives: Loss of foot sensitivity is a relevant parameter to assess and prevent in several diseases. It is crucial to determine the vibro-tactile sensitivity threshold response to acute conditions in order to explore innovative monitoring tools and interventions to prevent and treat this challenge. The aims were: 1) to analyze the acute effects of a single whole body vibration session (4 min, 18 Hz, 4 mm) on the vibro-tactile perception threshold in healthy young adults; 2) to analyze the 48-hour effects of 3 whole body vibration sessions on the vibro-tactile perception threshold in healthy young adults. Methods: A randomized controlled clinical trial over 3 sessions of whole body vibration intervention or 3 sessions of placebo intervention. Twenty-eight healthy young adults were included: 11 in the experimental group and 12 in the placebo group. The experimental group performed 3 sessions of WBV while the placebo group performed 3 sessions of placebo intervention. Results: The vibro-tactile threshold increased right after a single WBV session in comparison with placebo. Nevertheless, 48 hours after 3 whole body vibration sessions, the threshold decreased to values lower than the initial ones. Conclusions: The acute response of the vibro-tactile threshold to one whole body vibration session was an increase, but the 48-hour short-term response of this threshold was a decrease in healthy young adults. PMID:26944818

  11. The Relationship between the Behavioral Hearing Thresholds and Maximum Bilirubin Levels at Birth in Children with a History of Neonatal Hyperbilirubinemia

    PubMed Central

    Panahi, Rasool; Jafari, Zahra; Sheibanizade, Abdoreza; Salehi, Masoud; Esteghamati, Abdoreza; Hasani, Sara

    2013-01-01

    Introduction: Neonatal hyperbilirubinemia is one of the most important factors affecting the auditory system and can cause sensorineural hearing loss. This study investigated the relationship between behavioral hearing thresholds in children with a history of jaundice and the maximum level of bilirubin concentration in the blood. Materials and Methods: This study was performed on 18 children with a mean age of 5.6 years and a history of neonatal hyperbilirubinemia. Behavioral hearing thresholds, transient evoked emissions, and brainstem evoked responses were evaluated in all children. Results: Six children (33.3%) had normal hearing thresholds and the remaining children (66.7%) had some degree of hearing loss. There was no significant relationship (r=-0.28, P=0.09) between the mean total bilirubin levels and behavioral hearing thresholds across all samples. Transient evoked emissions were seen only in children with normal hearing thresholds; moreover, in eight cases brainstem evoked responses were not detected. Conclusion: Increased blood levels of bilirubin in the neonatal period were potentially one of the causes of hearing loss. The lack of a direct relationship between neonatal bilirubin levels and the average hearing thresholds emphasizes the necessity of monitoring across the range of bilirubin levels. PMID:24303432

  12. Himawari-8 Satellite Based Dynamic Monitoring of Grassland Fire in China-Mongolia Border Regions

    PubMed Central

    Na, Li; Bao, Yulong; Bao, Yongbin; Na, Risu; Tong, Siqin; Si, Alu

    2018-01-01

    In this study, we used bands 7, 4, and 3 of the Advanced Himawari Imager (AHI) data, combined with a threshold algorithm and a visual interpretation method, to monitor the entire process of grassland fires that occurred in the China-Mongolia border regions between 05:40 (UTC) on April 19th and 13:50 (UTC) on April 21st, 2016. The results of the AHI data monitoring are evaluated against fire point product data, wind field data, and environmental information data for the area in which the fire took place. The monitoring results show that the grassland fire burned for two days and eight hours with a total burned area of about 2708.29 km2. It mainly spread from the northwest to the southeast, with a maximum burning speed of 20.9 m/s, a minimum speed of 2.52 m/s, and an average speed of about 12.07 m/s. Thus, AHI data can be used not only to quickly and accurately track the dynamic development of a grassland fire, but also to estimate its spread speed and direction. The evaluation of the fire monitoring results reveals that AHI data, with their high precision and timeliness, are highly consistent with the actual situation. PMID:29346289

  13. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings.

    PubMed

    Singh, Narinderpal; Wang, Changlu; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-07-26

    We tested a threshold-based bed bug (Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1-12 bed bug count, II- Chemical control only in apartments with 1-12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach.

  14. Lowering thresholds for speed limit enforcement impairs peripheral object detection and increases driver subjective workload.

    PubMed

    Bowden, Vanessa K; Loft, Shayne; Tatasciore, Monica; Visser, Troy A W

    2017-01-01

    Speed enforcement reduces incidences of speeding, thus reducing traffic accidents. Accordingly, it has been argued that stricter speed enforcement thresholds could further improve road safety. Effective speed monitoring however requires driver attention and effort, and human information-processing capacity is limited. Emphasizing speed monitoring may therefore reduce resource availability for other aspects of safe vehicle operation. We investigated whether lowering enforcement thresholds in a simulator setting would introduce further competition for limited cognitive and visual resources. Eighty-four young adult participants drove under conditions where they could be fined for travelling 1, 6, or 11 km/h over a 50 km/h speed limit. Stricter speed enforcement led to greater subjective workload and significant decrements in peripheral object detection. These data indicate that the benefits of reduced speeding with stricter enforcement may be at least partially offset by greater mental demands on drivers, reducing their responses to safety-critical stimuli on the road. It is likely these results under-estimate the impact of stricter speed enforcement on real-world drivers who experience significantly greater pressures to drive at or above the speed limit. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings

    PubMed Central

    Singh, Narinderpal; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-01-01

    We tested a threshold-based bed bug (Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1–12 bed bug count, II- Chemical control only in apartments with 1–12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than that in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach. PMID:28933720

  16. The upgraded data acquisition system for beam loss monitoring at the Fermilab Tevatron and Main Injector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumbaugh, A.; Briegel, C.; Brown, B.C.

    2011-11-01

    A VME-based data acquisition system for beam-loss monitors has been developed and is in use in the Tevatron and Main Injector accelerators at the Fermilab complex. The need for enhanced beam-loss protection when the Tevatron is operating in collider mode was the main driving force for the new design. Prior to the implementation of the present system, the beam-loss monitor system was disabled during collider operation and protection of the Tevatron magnets relied on the quench protection system. The new Beam-Loss Monitor system allows appropriate abort logic and thresholds to be set over the full set of collider operating conditions. The system also records a history of beam-loss data prior to a beam-abort event for post-abort analysis. Installation of the Main Injector system occurred in the fall of 2006 and the Tevatron system in the summer of 2007. Both systems were fully operational by the summer of 2008. In this paper we report on the overall system design, provide a description of its normal operation, and show a number of examples of its use in both the Main Injector and Tevatron.

  17. The upgraded data acquisition system for beam loss monitoring at the Fermilab Tevatron and Main Injector

    NASA Astrophysics Data System (ADS)

    Baumbaugh, A.; Briegel, C.; Brown, B. C.; Capista, D.; Drennan, C.; Fellenz, B.; Knickerbocker, K.; Lewis, J. D.; Marchionni, A.; Needles, C.; Olson, M.; Pordes, S.; Shi, Z.; Still, D.; Thurman-Keup, R.; Utes, M.; Wu, J.

    2011-11-01

    A VME-based data acquisition system for beam-loss monitors has been developed and is in use in the Tevatron and Main Injector accelerators at the Fermilab complex. The need for enhanced beam-loss protection when the Tevatron is operating in collider mode was the main driving force for the new design. Prior to the implementation of the present system, the beam-loss monitor system was disabled during collider operation and protection of the Tevatron magnets relied on the quench protection system. The new Beam-Loss Monitor system allows appropriate abort logic and thresholds to be set over the full set of collider operating conditions. The system also records a history of beam-loss data prior to a beam-abort event for post-abort analysis. Installation of the Main Injector system occurred in the fall of 2006 and the Tevatron system in the summer of 2007. Both systems were fully operational by the summer of 2008. In this paper we report on the overall system design, provide a description of its normal operation, and show a number of examples of its use in both the Main Injector and Tevatron.
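
    The per-channel behaviour described above (compare each loss reading to an abort threshold while keeping a rolling pre-abort history) can be sketched as follows. Class and parameter names are illustrative assumptions, not the actual VME firmware interface:

```python
from collections import deque

class BeamLossChannel:
    """Sketch of one monitor channel: trip on threshold crossing and
    freeze a rolling history of readings for post-abort analysis."""
    def __init__(self, abort_threshold, history_len=1024):
        self.abort_threshold = abort_threshold
        self.history = deque(maxlen=history_len)  # pre-abort ring buffer
        self.abort = False

    def sample(self, loss):
        if not self.abort:
            self.history.append(loss)
            if loss > self.abort_threshold:
                self.abort = True  # freeze history once the abort fires
        return self.abort
```

    A bounded deque gives the "history prior to a beam-abort event" for free: once the abort latches, no further samples overwrite the buffer.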

  18. Novel wavelet threshold denoising method in axle press-fit zone ultrasonic detection

    NASA Astrophysics Data System (ADS)

    Peng, Chaoyong; Gao, Xiaorong; Peng, Jianping; Wang, Ai

    2017-02-01

    Axles are important parts of railway locomotives and vehicles. Periodic ultrasonic inspection of axles can effectively detect and monitor axle fatigue cracks. However, in the axle press-fit zone, the complex interface contact condition reduces the signal-to-noise ratio (SNR), so the probability of false positives and false negatives increases. In this work, a novel wavelet threshold function is created to remove noise and suppress press-fit interface echoes in axle ultrasonic defect detection. The novel wavelet threshold function with two variables is designed to ensure the precision of the optimum searching process. Based on the positive correlation between the correlation coefficient and SNR, and on the experimental observation that the defect echo and the press-fit interface echo have different axle-circumferential correlation characteristics, a discrete optimum search for the two undetermined variables in the novel wavelet threshold function is conducted. The performance of the proposed method is assessed by comparing it with traditional threshold methods on real data. The statistics of the amplitude and the peak SNR of defect echoes show that the proposed wavelet threshold denoising method not only maintains the amplitude of defect echoes but also achieves a higher peak SNR.
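
    The paper's two-variable threshold function is not reproduced in the abstract. As a generic stand-in, the sketch below applies classical soft thresholding to the detail coefficients of a one-level Haar transform; it illustrates the shape shared by any wavelet-threshold denoiser (transform, shrink details, invert), not the authors' specific function:

```python
import numpy as np

def soft_threshold(coeffs, thr):
    # classical soft thresholding: shrink coefficients toward zero by thr
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

def haar_denoise(signal, thr):
    """One-level Haar wavelet denoising: transform, threshold the detail
    coefficients, inverse transform. Signal length must be even."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    detail = soft_threshold(detail, thr)
    out = np.empty_like(s)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```

    Replacing `soft_threshold` with a parameterized function of two variables, and searching those variables against a correlation-based objective, is the optimization the abstract describes.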

  19. EVALUATING MACROINVERTEBRATE COMMUNITY ...

    EPA Pesticide Factsheets

    Since 2010, new construction in California is required to include stormwater detention and infiltration that is designed to capture rainfall from the 85th percentile of storm events in the region, preferably through green infrastructure. This study used recent macroinvertebrate community monitoring data to determine the ecological threshold for percent impervious cover prior to large scale adoption of green infrastructure using Threshold Indicator Taxa Analysis (TITAN). TITAN uses an environmental gradient and biological community data to determine individual taxa change points with respect to changes in taxa abundance and frequency across that gradient. Individual taxa change points are then aggregated to calculate the ecological threshold. This study used impervious cover data from National Land Cover Datasets and macroinvertebrate community data from California Environmental Data Exchange Network and Southern California Coastal Water Research Project. Preliminary TITAN runs for California’s Chaparral region indicated that both increasing and decreasing taxa had ecological thresholds of <1% watershed impervious cover. Next, TITAN will be used to determine shifts in the ecological threshold after the implementation of green infrastructure on a large scale. This presentation for the Society for Freshwater Scientists will discuss initial evaluation of community and taxa-specific thresholds of impairment for macroinvertebrates in California streams along
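
    The aggregation idea behind TITAN (per-taxon change points along a gradient, combined into a community-level threshold) can be illustrated with a deliberately crude sketch: each taxon's change point here is the split maximizing the difference in mean abundance, and the community threshold is their median, replacing TITAN's IndVal scores and bootstrapping. All data and choices below are illustrative assumptions:

```python
import numpy as np

def taxon_change_point(gradient, abundance):
    """Split along the gradient that maximizes the difference in mean
    abundance between the two sides (crude stand-in for TITAN's IndVal)."""
    order = np.argsort(gradient)
    g, a = gradient[order], abundance[order]
    best_g, best_d = g[0], -1.0
    for i in range(1, len(g)):
        d = abs(a[:i].mean() - a[i:].mean())
        if d > best_d:
            best_d, best_g = d, g[i]
    return best_g

def community_threshold(gradient, abundances):
    # aggregate taxon-level change points; TITAN sums standardized
    # scores, here the median serves as a simple illustration
    return float(np.median([taxon_change_point(gradient, a)
                            for a in abundances]))
```

    Two synthetic taxa that drop off near 3% impervious cover yield a community threshold near 3, mirroring how aggregated change points locate the ecological threshold.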

  20. Design of the intelligent smoke alarm system based on photoelectric smoke

    NASA Astrophysics Data System (ADS)

    Ma, Jiangfei; Yang, Xiufang; Wang, Peipei

    2017-02-01

    This paper presents an intelligent smoke alarm system based on a photoelectric smoke detector and a temperature sensor. The system takes an AT89C51 MCU as the core of the hardware control and LabVIEW as the host-computer monitoring center. The sensor system acquires temperature and smoke signals; the MCU samples and converts the output analog signals through the A/D converter, and the two signals are then uploaded to the host computer through serial communication. To achieve real-time monitoring of smoke and temperature in the environment, the LabVIEW monitoring platform holds, processes, analyzes, and displays these sampled signals. The intelligent smoke alarm system is suitable for large-scale shopping malls and other public places, and can greatly reduce the false alarm rate of fires. The experimental results show that the system runs well and alarms when the set threshold is reached, and the threshold parameters can be adjusted according to the actual conditions in the field. The system is easy to operate, simple in structure, intelligent, low cost, and of strong practical value.

  1. Scanning seismic intrusion detection method and apparatus. [monitoring unwanted subterranean entry and departure

    NASA Technical Reports Server (NTRS)

    Lee, R. D. (Inventor)

    1983-01-01

    An intrusion monitoring system includes an array of seismic sensors, such as geophones, arranged along a perimeter to be monitored for unauthorized intrusion, such as by surface movement or tunneling. Two wires lead from each sensor to a central monitoring station. The central monitoring station has three modes of operation. In the first mode of operation, the outputs of all of the seismic sensors are summed into a receiver for amplification and detection. When the amplitude of the summed signals exceeds a predetermined threshold value, an alarm is sounded. In the second mode of operation, the individual output signals from the sensors are multiplexed into the receiver, sequentially interrogating each of the sensors.
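
    The two described modes reduce to a few lines of threshold logic; this is a minimal sketch of the decision rules only (function names are assumptions), not the analog receiver hardware:

```python
def summed_mode(sensor_outputs, threshold):
    """Mode 1: sum every sensor's output and trip the alarm when the
    summed amplitude exceeds the predetermined threshold."""
    return abs(sum(sensor_outputs)) > threshold

def multiplexed_mode(sensor_outputs, threshold):
    """Mode 2: interrogate each sensor in sequence, reporting which
    individual sensors exceed the threshold."""
    return [i for i, v in enumerate(sensor_outputs) if abs(v) > threshold]
```

    Mode 1 gives cheap whole-perimeter detection; mode 2 localizes the disturbance to a specific sensor, which is why the station supports both.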

  2. Absolute Cerebral Blood Flow Infarction Threshold for 3-Hour Ischemia Time Determined with CT Perfusion and 18F-FFMZ-PET Imaging in a Porcine Model of Cerebral Ischemia

    PubMed Central

    Cockburn, Neil; Kovacs, Michael

    2016-01-01

    CT Perfusion (CTP) derived cerebral blood flow (CBF) thresholds have been proposed as the optimal parameter for distinguishing the infarct core prior to reperfusion. Previous threshold-derivation studies have been limited by uncertainties introduced by infarct expansion between the acute phase of stroke and follow-up imaging, or by DWI lesion reversibility. In this study a model is proposed for determining infarction CBF thresholds at a 3 hr ischemia time by comparing contemporaneously acquired CTP-derived CBF maps to 18F-FFMZ-PET imaging, with the objective of deriving a CBF threshold for infarction after 3 hours of ischemia. Endothelin-1 (ET-1) was injected into the brain of Duroc-cross pigs (n = 11) through a burr hole in the skull. CTP images were acquired 10 and 30 minutes post ET-1 injection and then every 30 minutes for 150 minutes. 370 MBq of 18F-FFMZ was injected ~120 minutes post ET-1 injection and PET images were acquired for 25 minutes starting ~155–180 minutes post ET-1 injection. CBF maps from each CTP acquisition were co-registered and converted into a median CBF map. The median CBF map was co-registered to blood volume maps for vessel exclusion, an average CT image for grey/white matter segmentation, and 18F-FFMZ-PET images for infarct delineation. Logistic regression and ROC analysis were performed on infarcted and non-infarcted pixel CBF values for each animal that developed infarct. Six of the eleven animals developed infarction. The mean CBF value corresponding to the optimal operating point of the ROC curves for the 6 animals was 12.6 ± 2.8 mL·min-1·100g-1 for infarction after 3 hours of ischemia. The porcine ET-1 model of cerebral ischemia is easier to implement than other large animal models of stroke, and performs similarly as long as CBF is monitored using CTP to prevent reperfusion. PMID:27347877
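
    Deriving a cutoff from an "optimal operating point" of an ROC curve is commonly done by maximizing Youden's J; the abstract does not state which criterion the authors used, so the sketch below, on synthetic CBF values, is one conventional choice rather than the paper's exact analysis:

```python
import numpy as np

def roc_optimal_threshold(infarct_cbf, healthy_cbf):
    """Sweep candidate CBF cutoffs and return the one maximizing
    Youden's J = sensitivity + specificity - 1."""
    cands = np.unique(np.concatenate([infarct_cbf, healthy_cbf]))
    best_t, best_j = cands[0], -1.0
    for t in cands:
        sens = np.mean(infarct_cbf <= t)   # infarcted tissue has low CBF
        spec = np.mean(healthy_cbf > t)
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t
```

    Averaging such per-animal cutoffs is how a single population threshold (like the reported 12.6 mL·min-1·100g-1) would be summarized.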

  3. A practical approach to tramway track condition monitoring: vertical track defects detection and identification using time-frequency processing technique

    NASA Astrophysics Data System (ADS)

    Bocz, Péter; Vinkó, Ákos; Posgay, Zoltán

    2018-03-01

    This paper presents an automatic method for detecting vertical track irregularities in tramway operation using acceleration measurements on trams. For the monitoring of tramway tracks, an unconventional measurement setup was developed, which records the data of three-axis wireless accelerometers mounted on the wheel discs. Accelerations are processed to obtain the vertical track irregularities and determine whether the track needs to be repaired. The automatic detection algorithm is based on time-frequency distribution analysis and determines the defect locations. Admissible limits (thresholds) are given for detecting moderate and severe defects using statistical analysis. The method was validated on busy tram lines in Budapest and accurately detected severe defects with a hit rate of 100% and no false alarms. The methodology is also sensitive to moderate and small rail surface defects at low operational speeds.
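
    A much-simplified stand-in for the paper's time-frequency pipeline is to flag samples whose short-window RMS energy exceeds a robust statistical limit; window length, the MAD-based limit, and the synthetic burst below are all illustrative assumptions:

```python
import numpy as np

def detect_defects(accel, fs, win=0.05, k=5.0):
    """Flag samples whose short-window RMS exceeds a robust limit:
    median RMS plus k robust sigmas (1.4826 * MAD)."""
    n = max(1, int(win * fs))
    sq = np.convolve(accel**2, np.ones(n) / n, mode="same")
    rms = np.sqrt(sq)
    med = np.median(rms)
    mad = np.median(np.abs(rms - med)) + 1e-12
    return rms > med + k * 1.4826 * mad

rng = np.random.default_rng(3)
accel = rng.normal(0, 0.01, 2000)   # synthetic wheel-disc acceleration
accel[500:520] += 1.0               # injected "defect" burst
flags = detect_defects(accel, fs=1000)
```

    Mapping flagged sample indices to track position via the vehicle speed is what turns such detections into defect locations.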

  4. Multidimensional evaluation of a radio frequency identification wi-fi location tracking system in an acute-care hospital setting

    PubMed Central

    Okoniewska, Barbara; Graham, Alecia; Gavrilova, Marina; Wah, Dannel; Gilgen, Jonathan; Coke, Jason; Burden, Jack; Nayyar, Shikha; Kaunda, Joseph; Yergens, Dean; Baylis, Barry

    2012-01-01

    Real-time locating systems (RTLS) have the potential to enhance healthcare systems through the live tracking of assets, patients and staff. This study evaluated a commercially available RTLS system deployed in a clinical setting, with three objectives: (1) assessment of the location accuracy of the technology in a clinical setting; (2) assessment of the value of asset tracking to staff; and (3) assessment of threshold monitoring applications developed for patient tracking and inventory control. Simulated daily activities were monitored by RTLS and compared with direct research team observations. Staff surveys and interviews concerning the system's effectiveness and accuracy were also conducted and analyzed. The study showed only modest location accuracy, and mixed reactions in staff interviews. These findings reveal that the technology needs to be refined further for better specific location accuracy before full-scale implementation can be recommended. PMID:22298566

  5. Detection of sub micro Gray dose levels using OSL phosphor LiMgPO4:Tb,B

    NASA Astrophysics Data System (ADS)

    Rawat, N. S.; Dhabekar, Bhushan; Muthe, K. P.; Koul, D. K.; Datta, D.

    2017-04-01

    Detection of sub-micro-Gray doses finds application in personnel and environmental monitoring and in nuclear forensics. The recently developed LiMgPO4:Tb,B (LMP) is a highly sensitive Optically Stimulated Luminescence (OSL) phosphor with excellent dosimetric properties. The OSL emission spectrum of LMP consists of several peaks attributed to characteristic Tb3+ emission. The OSL emission peak at 380 nm is well matched to the bi-alkali PMT used in the RISO reader system. It is demonstrated that a significant improvement in dose detection threshold can be realized for LMP by optimizing continuous-wave OSL (CW-OSL) parameters such as stimulation intensity and readout time. A minimum measurable dose (MMD) as low as 0.49 μGy in a readout time of less than 1 s at a stimulation intensity of 32 mW/cm2 has been achieved using this phosphor. Recommendations on the choice of parameters for personnel and environmental monitoring are also discussed.
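The quoted dose-detection threshold can be made concrete with a common convention: the minimum measurable dose (MMD) is the dose whose signal equals k (often 3) standard deviations of repeated zero-dose background readouts, divided by the reader's sensitivity. A sketch with hypothetical count values (the paper's exact MMD definition and numbers may differ):

```python
import statistics

def minimum_measurable_dose(bg_counts, counts_per_gray, k=3.0):
    """MMD via the k-sigma criterion: the dose whose OSL signal equals
    k standard deviations of the background readout distribution."""
    return k * statistics.stdev(bg_counts) / counts_per_gray

# Hypothetical background readouts (counts) and sensitivity (counts/Gy)
mmd = minimum_measurable_dose([100, 104, 96, 102, 98], counts_per_gray=1e9)
print(f"MMD = {mmd * 1e6:.4f} uGy")  # sub-microgray regime
```

Raising the stimulation intensity increases counts_per_gray (and hence lowers the MMD) as long as the background signal does not grow proportionally, which is the optimization the abstract describes.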

  6. A framework for assessing cumulative effects in watersheds: an introduction to Canadian case studies.

    PubMed

    Dubé, Monique G; Duinker, Peter; Greig, Lorne; Carver, Martin; Servos, Mark; McMaster, Mark; Noble, Bram; Schreier, Hans; Jackson, Lee; Munkittrick, Kelly R

    2013-07-01

    From 2008 to 2013, a series of studies supported by the Canadian Water Network were conducted in Canadian watersheds in an effort to improve methods to assess cumulative effects. These studies fit under a common framework for watershed cumulative effects assessment (CEA). This article presents an introduction to the Special Series on Watershed CEA in IEAM, including the framework and its impetus, a brief introduction to each of the articles in the series, challenges, and a path forward. The framework includes a regional water monitoring program that produces three core outputs: an accumulated state assessment, stressor-response relationships, and development of predictive cumulative effects scenario models. The framework considers core values, indicators, thresholds, and use of consistent terminology. It emphasizes that CEA requires two components: accumulated state quantification and predictive scenario forecasting. It recognizes that both of these components must be supported by a regional, multiscale monitoring program. Copyright © 2013 SETAC.

  7. Multidimensional evaluation of a radio frequency identification wi-fi location tracking system in an acute-care hospital setting.

    PubMed

    Okoniewska, Barbara; Graham, Alecia; Gavrilova, Marina; Wah, Dannel; Gilgen, Jonathan; Coke, Jason; Burden, Jack; Nayyar, Shikha; Kaunda, Joseph; Yergens, Dean; Baylis, Barry; Ghali, William A

    2012-01-01

    Real-time locating systems (RTLS) have the potential to enhance healthcare systems through the live tracking of assets, patients and staff. This study evaluated a commercially available RTLS system deployed in a clinical setting, with three objectives: (1) assessment of the location accuracy of the technology in a clinical setting; (2) assessment of the value of asset tracking to staff; and (3) assessment of threshold monitoring applications developed for patient tracking and inventory control. Simulated daily activities were monitored by RTLS and compared with direct research team observations. Staff surveys and interviews concerning the system's effectiveness and accuracy were also conducted and analyzed. The study showed only modest location accuracy, and mixed reactions in staff interviews. These findings reveal that the technology needs to be refined further for better specific location accuracy before full-scale implementation can be recommended.

  8. Anthropogenic land uses elevate metal levels in stream water in an urbanizing watershed.

    PubMed

    Yu, Shen; Wu, Qian; Li, Qingliang; Gao, Jinbo; Lin, Qiaoying; Ma, Jun; Xu, Qiufang; Wu, Shengchun

    2014-08-01

    Land use/cover change is a dominant factor affecting surface water quality in rapidly developing areas of Asia. In this study we examined relationships between land use and instream metal loadings in a rapidly developing mixed land use watershed in southeastern China. Five developing subwatersheds and one forested reference site (head water) were instrumented with timing- and rainfall-triggered autosamplers, and instream loadings of anthropogenic metals (Cu, Zn, Pb, Cr, Cd, and Mn) were monitored from March 2012 to December 2013. Farm land and urban land were positively, and forest and green land negatively, associated with metal loadings (except Cr) in stream water. All developing sites had higher loadings than the reference head water site. Assessed against the Chinese surface water quality standard (GB 3838-2002), instream loadings of Cu and Zn occasionally exceeded the Class I thresholds at monitoring points within farmland-dominated subwatersheds, while Mn loadings were greater than the limit for drinking water sources at all monitoring points. Farm land use contributed strongly and positively to statistical models of instream loadings of Cu, Zn, Cd, and Mn, while urban land use was the dominant contributor to models of Pb and Cd loadings. Rainfall played a crucial role in metal loadings in stream water, both as a direct source (there were significant levels of Cu and Zn in rain water) and as a driver of watershed processes (loadings were higher in wet years and seasons). Urbanization effects on metal loadings in this watershed are likely to change rapidly with development in future years. Further monitoring to characterize these changes is clearly warranted and should help to develop plans to avoid conflicts between economic development and water quality degradation in this watershed and in watersheds throughout rapidly developing areas of Asia. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Normal standards for computer-ECG programs for prognostically and diagnostically important ECG variables derived from a large ethnically diverse female cohort: the Women's Health Initiative (WHI).

    PubMed

    Rautaharju, Pentti M; Zhang, Zhu-ming; Gregg, Richard E; Haisty, Wesley K; Z Vitolins, Mara; Curtis, Anne B; Warren, James; Horaĉek, Milan B; Zhou, Sophia H; Soliman, Elsayed Z

    2013-01-01

    Substantial new information has emerged recently about the prognostic value of a variety of new ECG variables. The objective of the present study was to establish reference standards for these novel risk predictors in a large, ethnically diverse cohort of healthy women from the Women's Health Initiative (WHI) study. The study population consisted of 36,299 healthy women. Racial differences in rate-adjusted QT end (QT(ea)) and QT peak (QT(pa)) intervals as linear functions of RR were small, leading to the conclusion that 450 and 390 ms are applicable as thresholds for prolonged and shortened QT(ea), and similarly, 365 and 295 ms for prolonged and shortened QT(pa), respectively. As a threshold for increased dispersion of global repolarization (T(peak)T(end) interval), 110 ms was established for white and Hispanic women and 120 ms for African-American and Asian women. ST elevation and depression values for the monitoring leads of each person, with limb electrodes at Mason-Likar positions and chest leads at the level of V1 and V2, were first computed from standard leads using lead transformation coefficients derived from 892 body surface maps, and subsequently normal standards were determined for the monitoring leads, including vessel-specific bipolar left anterior descending, left circumflex artery and right coronary artery leads. The results support the choice of 150 μV as a tentative threshold for abnormal ST-onset elevation for all monitoring leads. Body mass index (BMI) had a profound effect on Cornell voltage and Sokolow-Lyon voltage in all racial groups, and their utility for left ventricular hypertrophy classification remains open. Common thresholds for all racial groups are applicable for QT(ea) and QT(pa) intervals and ST elevation. Race-specific normal standards are required for many other ECG parameters. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Long-term monitoring of cardiorespiratory patterns in drug-resistant epilepsy.

    PubMed

    Goldenholz, Daniel M; Kuhn, Amanda; Austermuehle, Alison; Bachler, Martin; Mayer, Christopher; Wassertheurer, Siegfried; Inati, Sara K; Theodore, William H

    2017-01-01

    Sudden unexplained death in epilepsy (SUDEP) during inpatient electroencephalography (EEG) monitoring has been a rare but potentially preventable event, with associated cardiopulmonary markers. To date, no systematic evaluation of alarm settings for a continuous pulse oximeter (SpO2) has been performed. In addition, evaluation of the interrelationship between the ictal and interictal states for cardiopulmonary measures has not been reported. Patients with epilepsy were monitored using video-EEG, SpO2, and electrocardiography (ECG). Alarm thresholds were tested systematically, balancing the number of false alarms against true seizure detections. Additional cardiopulmonary patterns were explored using automated ECG analysis software. One hundred ninety-three seizures (32 generalized) were evaluated from 45 patients (7,104 h recorded). Alarm thresholds of 80-86% SpO2 detected 63-73% of all generalized convulsions and 20-28% of all focal seizures (81-94% of generalized and 25-36% of focal seizures when considering only evaluable data). These same thresholds resulted in 25-146 min between false alarms. The sequential probability of ictal SpO2 revealed a potential common seizure termination pathway of desaturation. A statistical model of corrected QT intervals (QTc), heart rate (HR), and SpO2 revealed close ictal cardiopulmonary coupling. Joint probability maps of QTc and SpO2 demonstrated that many patients had baseline dysfunction in the cardiac domain, the pulmonary domain, or both, and that there was dissociation ictally: some patients exhibited further dysfunction in one or both domains. Optimal selection of continuous pulse oximetry thresholds involves a tradeoff between seizure detection accuracy and false alarm frequency. Alarming at 86% for patients who tend to have fewer false alarms, and at 80% for those who have more, would likely result in a reasonable tradeoff. The cardiopulmonary findings may lead to SUDEP biomarkers and early seizure termination therapies. Wiley Periodicals, Inc. © 2016 International League Against Epilepsy.
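The threshold-versus-false-alarm tradeoff the authors describe can be expressed as a simple sweep over candidate alarm levels, counting how many seizures would have been detected and how often seizure-free epochs would have alarmed. A sketch with invented per-event minimum SpO2 values (not the study's data):

```python
def alarm_tradeoff(seizure_min_spo2, baseline_min_spo2, thresholds):
    """For each candidate SpO2 alarm threshold, return the fraction of
    seizures detected and the fraction of seizure-free epochs alarming."""
    result = {}
    for t in thresholds:
        detected = sum(s <= t for s in seizure_min_spo2) / len(seizure_min_spo2)
        false_rate = sum(b <= t for b in baseline_min_spo2) / len(baseline_min_spo2)
        result[t] = (detected, false_rate)
    return result

# Hypothetical minimum SpO2 (%) per seizure and per seizure-free epoch
seizures = [78, 82, 85, 90, 75]
baseline = [95, 93, 85, 96, 94, 92, 97, 91]
print(alarm_tradeoff(seizures, baseline, [80, 86]))
# -> {80: (0.4, 0.0), 86: (0.8, 0.125)}
```

As in the abstract, the lower threshold trades missed seizures for fewer false alarms; the per-patient choice between 80% and 86% follows from each patient's own false-alarm burden.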

  11. The role of nanotechnology and nano and micro-electronics in monitoring and control of cardiovascular diseases and neurological disorders

    NASA Astrophysics Data System (ADS)

    Varadan, Vijay K.

    2007-04-01

    Nanotechnology has been broadly defined as not only the creation of functional materials, devices and systems through the control of matter at the scale of 1-100 nm, but also the exploitation of novel properties and phenomena at the same scale. Growing needs in point-of-care (POC) applications, an expanding market for improving patients' quality of life, are driving the development of nanotechnologies for the diagnosis and treatment of various life-threatening diseases. This paper addresses the recent development of nanodiagnostic sensors and nanotherapeutic devices with functionalized carbon nanotubes and/or nanowires on flexible organic thin-film electronics to monitor and control three leading disease groups, namely 1) neurodegenerative diseases, 2) cardiovascular diseases, and 3) diabetes and metabolic diseases. The sensors developed include implantable and biocompatible devices, and lightweight wearable devices in wrist-watches, hats, shoes and clothes. The nanotherapeutic devices include a nano-based drug delivery system. Many of these sensors are integrated with wireless systems for remote physiological monitoring. The author's research team has also developed a wireless neural probe using nanowires and nanotubes for the monitoring and control of Parkinson's disease. A lightweight and compact EEG, EOG and EMG monitoring system built into a hat is capable of monitoring, in real time, epileptic patients and patients with neurological and movement disorders using the Internet and cellular networks. Physicians can monitor these signals in real time using portable computers or cell phones and receive an early warning if the signals cross a pre-determined threshold level. In addition, the potential impact of nanotechnology for applications in medicine is that devices can be designed to interact with cells and tissues at the molecular level, which allows a high degree of functionality.
Devices engineered at the nanometer scale imply a controlled manipulation of individual molecules and atoms that can interact with the human body at the sub-cellular level. The recent progress in microelectronics and nanosensors creates very powerful tools for early detection and diagnosis. The nanowire-integrated potassium and dopamine sensors are ideal for the monitoring and control of many cardiovascular diseases and neurological disorders. Selected movies illustrating the applications of nanodevices to patients will be shown in the talk.

  12. Environmental Suitability of Vibrio Infections in a Warming Climate: An Early Warning System.

    PubMed

    Semenza, Jan C; Trinanes, Joaquin; Lohr, Wolfgang; Sudre, Bertrand; Löfdahl, Margareta; Martinez-Urtaza, Jaime; Nichols, Gordon L; Rocklöv, Joacim

    2017-10-10

    Some Vibrio spp. are pathogenic and ubiquitous in marine waters with low to moderate salinity and thrive with elevated sea surface temperature (SST). Our objective was to monitor and project the suitability of marine conditions for Vibrio infections under climate change scenarios. The European Centre for Disease Prevention and Control (ECDC) developed a platform (the ECDC Vibrio Map Viewer) to monitor the environmental suitability of coastal waters for Vibrio spp. using remotely sensed SST and salinity. A case-crossover study of Swedish cases was conducted to ascertain the relationship between SST and Vibrio infection through a conditional logistic regression. Climate change projections for Vibrio infections were developed for Representative Concentration Pathway (RCP) 4.5 and RCP 8.5. The ECDC Vibrio Map Viewer detected environmentally suitable areas for Vibrio spp. in the Baltic Sea in July 2014 that were accompanied by a spike in cases and one death in Sweden. The estimated exposure-response relationship for Vibrio infections at a threshold of 16°C revealed a relative risk (RR)=1.14 (95% CI: 1.02, 1.27; p=0.024) for a lag of 2 wk; the estimated risk increased successively beyond this SST threshold. Climate change projections for SST under the RCP 4.5 and RCP 8.5 scenarios indicate a marked upward trend during the summer months and an increase in the relative risk of these infections in the coming decades. This platform can serve as an early warning system as the risk of further Vibrio infections increases in the 21st century due to climate change. https://doi.org/10.1289/EHP2198.
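The reported effect size comes from a conditional logistic regression, where the relative risk per unit exposure is exp(β) and the Wald 95% CI is exp(β ± 1.96·SE). A sketch of that back-transformation (the coefficient and standard error below are illustrative values chosen to roughly reproduce the abstract's RR = 1.14, not figures from the paper):

```python
import math

def rr_with_ci(beta, se, z=1.96):
    """Relative risk and Wald confidence interval from a regression
    coefficient estimated on the log scale."""
    return tuple(math.exp(beta + d * z * se) for d in (0, -1, 1))

rr, lo, hi = rr_with_ci(beta=0.131, se=0.056)
print(f"RR={rr:.2f} (95% CI: {lo:.2f}, {hi:.2f})")
# -> RR=1.14 (95% CI: 1.02, 1.27)
```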

  13. Environmental Suitability of Vibrio Infections in a Warming Climate: An Early Warning System

    PubMed Central

    Trinanes, Joaquin; Lohr, Wolfgang; Sudre, Bertrand; Löfdahl, Margareta; Martinez-Urtaza, Jaime; Nichols, Gordon L.; Rocklöv, Joacim

    2017-01-01

    Background: Some Vibrio spp. are pathogenic and ubiquitous in marine waters with low to moderate salinity and thrive with elevated sea surface temperature (SST). Objectives: Our objective was to monitor and project the suitability of marine conditions for Vibrio infections under climate change scenarios. Methods: The European Centre for Disease Prevention and Control (ECDC) developed a platform (the ECDC Vibrio Map Viewer) to monitor the environmental suitability of coastal waters for Vibrio spp. using remotely sensed SST and salinity. A case-crossover study of Swedish cases was conducted to ascertain the relationship between SST and Vibrio infection through a conditional logistic regression. Climate change projections for Vibrio infections were developed for Representative Concentration Pathway (RCP) 4.5 and RCP 8.5. Results: The ECDC Vibrio Map Viewer detected environmentally suitable areas for Vibrio spp. in the Baltic Sea in July 2014 that were accompanied by a spike in cases and one death in Sweden. The estimated exposure–response relationship for Vibrio infections at a threshold of 16°C revealed a relative risk (RR)=1.14 (95% CI: 1.02, 1.27; p=0.024) for a lag of 2 wk; the estimated risk increased successively beyond this SST threshold. Climate change projections for SST under the RCP 4.5 and RCP 8.5 scenarios indicate a marked upward trend during the summer months and an increase in the relative risk of these infections in the coming decades. Conclusions: This platform can serve as an early warning system as the risk of further Vibrio infections increases in the 21st century due to climate change. https://doi.org/10.1289/EHP2198 PMID:29017986

  14. Long-Term Solar and Cosmic Radiation Data Bases

    DTIC Science & Technology

    1991-01-01

    determine the magnitude of the variations in the cosmic ray intensity caused by solar activity. Neutron monitors, with their much lower energy threshold...expression that neutron monitors are sensors on spacecraft EARTH. Here we will consider cosmic ray detectors to measure two components of cosmic ...A comparison with the solar cycle as illustrated by the sunspot number in Fig. 1. shows that the maximum cosmic ray intensity occurs near sunspot

  15. Seismic intrusion detector system

    DOEpatents

    Hawk, Hervey L.; Hawley, James G.; Portlock, John M.; Scheibner, James E.

    1976-01-01

    A system for monitoring man-associated seismic movements within a control area, including a geophone for generating an electrical signal in response to seismic movement, a bandpass amplifier and threshold detector for eliminating unwanted signals, a pulse counting system for counting and storing the number of seismic movements within the area, and a monitoring system, operable on command, having a variable-frequency oscillator generating an audio-frequency signal proportional to the number of said seismic movements.

  16. Long-time cavitation threshold of silica water mixture under acoustic drive

    NASA Astrophysics Data System (ADS)

    Bussonniére, Adrien; Liu, Qingxia; Tsai, Peichun Amy

    2017-11-01

    The low cavitation threshold of water observed experimentally has been attributed to the presence of pre-existing tiny bubbles stabilized by impurities. However, the origin and stability of these cavitation nuclei remain unresolved. We therefore investigate the long-time evolution of the cavitation threshold of water seeded with micron-sized silica particles under the influence of several parameters. Experimentally, cavitation is induced by high-intensity focused ultrasound and subsequently detected by monitoring the backscattered sound. Degassed or aerated solutions of different concentrations are subjected to acoustic pulses (with amplitudes ranging from 0.1 to 1.7 MPa and a fixed repetition frequency between 0.1 and 6.5 Hz). The cavitation threshold was measured by fitting the cavitation probability curve, averaged over 1000 pulses. Surprisingly, our results show that the cavitation threshold stabilizes at a reproducible value after a few thousand pulses. Moreover, this long-time threshold was found to decrease with increasing particle concentration, pulse period, and initial oxygen level. In contrast to the depletion of nuclei expected under long acoustic cavitation, the results suggest a stabilized nuclei population that depends on concentration, oxygen level, and driving period.
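Extracting a threshold by "fitting the cavitation probability curve" is typically done with a sigmoid whose 50% point defines the threshold pressure. A minimal sketch under that assumption, on synthetic data (the authors' exact fitting function is not specified in the abstract):

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(p, p_th, w):
    """Cavitation probability vs acoustic pressure amplitude (MPa)."""
    return 1.0 / (1.0 + np.exp(-(p - p_th) / w))

def fit_threshold(pressures, probabilities):
    """Cavitation threshold as the 50% point of the fitted sigmoid."""
    popt, _ = curve_fit(sigmoid, pressures, probabilities,
                        p0=[np.mean(pressures), 0.1])
    return popt[0]

# Synthetic probability curve with a known 0.9 MPa threshold,
# spanning the paper's 0.1-1.7 MPa amplitude range
p = np.linspace(0.1, 1.7, 17)
prob = sigmoid(p, 0.9, 0.12)
print(round(fit_threshold(p, prob), 3))  # -> 0.9
```

With real data each probability point would be the fraction of (e.g.) 1000 pulses that produced a cavitation event at that amplitude.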

  17. Use of statistical analysis of long term displacement rate time series for the definition of Early Warning thresholds. The case studies of the Ruinon and Mont de La Saxe landslides (N Italy)

    NASA Astrophysics Data System (ADS)

    Alberti, Stefano; Battista Crosta, Giovanni; Rivolta, Carlo

    2016-04-01

    Rockslides are characterized by complex spatial and temporal evolution. Forecasting their behaviour is a hard task, due to non-linear displacement trends and the significant effects of seasonal or occasional events. The displacement rate and the landslide evolution are influenced by various factors such as lithology and structural and hydrological settings, as well as meteo-climatic factors (e.g. snowmelt and rainfall). The nature of the relationships among these factors is clearly non-linear, site specific and even specific to each sector that can be identified within the main landslide mass. In this contribution, total displacement and displacement rate time series are extracted from ground-based interferometric synthetic aperture radar (GB-InSAR) surveys, monitoring of optical targets by total stations, a GPS network and multi-parametric borehole probes. Different Early Warning domains, characterized by different velocity regimes (slow to fast) and different sensitivities to external perturbations (e.g. snowmelt and rainfall), have been identified in previous studies at the two sites. The Mont de La Saxe rockslide (ca. 8 × 10⁶ m³) is located in the Upper Aosta Valley and has been intensively monitored since 2009 by the Valle D'Aosta Geological Survey. The Ruinon landslide (ca. 15 × 10⁶ to 20 × 10⁶ m³) is located in the Upper Valtellina (Lombardy region), and monitoring data are available since 2006, provided by ARPA Lombardia. Both phenomena are alpine deep-seated rockslides characterized by different displacement velocities, from a few centimetres to over 1 metre per year, and both have undergone exceptional accelerations during some specific events. We experiment with the use of normal probability plots for the analysis of displacement rates of specific points belonging to different landslide sectors, recorded during almost ten years of monitoring.
These analyses allow us to define: (i) values with a specific probability expressed in terms of percentiles; (ii) values at which a specific change in behaviour is observed, which could be associated with a specific type of triggering event (e.g. rainfall intensity, duration or amount; snowmelt amount). These values could be used to support the choice of threshold values for the management of Early Warning Systems, also taking into account the minimization of false alarms. The analyses have been performed using data averaged over different time intervals, so as to study the effects of noise on the threshold values. Analyses of the false alarms triggered by the choice of different threshold values (i.e. different percentiles) have also been performed. This could be an innovative approach to defining velocity thresholds for Early Warning Systems and to analysing the quantitative data derived from remote sensing monitoring and field surveys, by linking them to both spatial and temporal changes.
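A percentile-based threshold of the kind described can be computed directly from a long displacement-rate record, together with the fraction of the historical record each candidate threshold would have flagged, a proxy for the false-alarm burden. A sketch on synthetic data (the distribution and percentile choices are illustrative, not the sites' values):

```python
import numpy as np

def velocity_thresholds(rates, percentiles=(90, 99, 99.9)):
    """Candidate Early Warning thresholds as upper percentiles of a long
    displacement-rate record (mm/day): higher percentiles trade missed
    accelerations against fewer false alarms."""
    return {q: float(np.percentile(rates, q)) for q in percentiles}

def false_alarm_fraction(rates, threshold):
    """Fraction of the historical record exceeding a threshold: a proxy
    for the alarm burden that threshold would have produced."""
    return float(np.mean(np.asarray(rates) > threshold))

# Synthetic ten-year daily record, heavy-tailed like real rockslide rates
rng = np.random.default_rng(1)
rates = rng.lognormal(mean=0.0, sigma=0.6, size=3650)
thr = velocity_thresholds(rates)
print({q: round(v, 2) for q, v in thr.items()})
```

By construction the 99th-percentile threshold would have flagged about 1% of the historical days and the 99.9th about 0.1%, which is exactly the kind of false-alarm accounting the abstract proposes for choosing among percentiles.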

  18. Standardization of barostat procedures for testing smooth muscle tone and sensory thresholds in the gastrointestinal tract. The Working Team of Glaxo-Wellcome Research, UK.

    PubMed

    Whitehead, W E; Delvaux, M

    1997-02-01

    An international working team of 13 investigators met on two occasions to develop guidelines for standardizing the procedures used to test gastrointestinal muscle tone and sensory thresholds using a barostat. General recommendations were: (1) Use a thin-walled plastic bag that is infinitely compliant until its capacity is reached. Maximum diameter of the bag should be much greater than the maximum diameter of the viscus. (2) The pump should be able to inflate the bag at up to 40 ml/sec. (3) Pressure should be monitored inside the bag, not in the pump or inflation line. (4) Subjects should be positioned so that the bag is close to the uppermost surface of the body. (5) For rectal tests, bowel cleansing should be limited to a tap water enema to minimize rectal irritation. Oral colonic lavage is recommended for studies of the proximal colon, and magnesium citrate enemas for the descending colon and sigmoid. (6) If sedation is required for colonic probe placement, allow at least one hour for drug washout and clearance of insufflated air. Ten to 20 min of adaptation before testing is adequate if no air or drugs were used. (7) The volumes reported must be corrected for the compressibility of gas and the compliance of the pump, which is greater for bellows pumps than for piston pumps. (8) Subjects should be tested in the fasted state. For evaluation of muscle tone: (9) The volume of the bag should be monitored for at least 15 min. For evaluation of sensory thresholds; (10) It is recommended that phasic distensions be > or = 60 sec long and that they be separated by > or = 60 sec. (11) Sensory thresholds should be reported as bag pressure rather than (or in addition to) bag volume because pressure is less vulnerable to measurement error. (12) Tests for sensory threshold should minimize psychological influences on perception by making the amount of each distension unpredictable to the subject. 
(13) Pain or other sensations should be reported on a graduated scale; not "yes-no." The working team recommends verbal descriptor scales, containing approximately seven steps, or visual analog scales in which subjects place a mark on a straight line marked "none" on one end and "maximum" on the other end. (14) It is recommended that subjects should be asked to rate the unpleasantness of distensions separately from their intensity.

  19. Towards a monitoring system of temperature extremes in Europe

    NASA Astrophysics Data System (ADS)

    Lavaysse, Christophe; Cammalleri, Carmelo; Dosio, Alessandro; van der Schrier, Gerard; Toreti, Andrea; Vogt, Jürgen

    2018-01-01

    Extreme-temperature anomalies such as heat and cold waves may have strong impacts on human activities and health. The heat waves in western Europe in 2003 and in Russia in 2010, or the cold wave in southeastern Europe in 2012, generated a considerable amount of economic loss and resulted in the death of several thousands of people. Providing an operational system to monitor extreme-temperature anomalies in Europe is thus of prime importance to help decision makers and emergency services to be responsive to an unfolding extreme event. In this study, the development and the validation of a monitoring system of extreme-temperature anomalies are presented. The first part of the study describes the methodology based on the persistence of events exceeding a percentile threshold. The method is applied to three different observational datasets, in order to assess the robustness and highlight uncertainties in the observations. The climatology of extreme events from the last 21 years is then analysed to highlight the spatial and temporal variability of the hazard, and discrepancies amongst the observational datasets are discussed. In the last part of the study, the products derived from this study are presented and discussed with respect to previous studies. The results highlight the accuracy of the developed index and the statistical robustness of the distribution used to calculate the return periods.
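The core of the described method, persistence of events exceeding a percentile threshold, can be sketched as a run-length scan over a daily temperature series. The percentile (90th) and minimum duration (3 days) below are illustrative choices, not necessarily the operational settings:

```python
import numpy as np

def extreme_events(temps, pct=90, min_days=3):
    """Return (start, end) day-index pairs for runs of at least
    `min_days` consecutive days above the record's `pct` percentile."""
    thr = np.percentile(temps, pct)
    events, start = [], None
    for i, t in enumerate(list(temps) + [-np.inf]):  # sentinel closes last run
        if t > thr and start is None:
            start = i
        elif t <= thr and start is not None:
            if i - start >= min_days:
                events.append((start, i - 1))
            start = None
    return events

# 30-day synthetic record with one hot spell
temps = [20.0] * 30
temps[10:14] = [35.0, 36.0, 35.0, 34.0]
print(extreme_events(temps))  # -> [(10, 12)]
```

In an operational system the percentile would normally be computed per calendar day from a climatological baseline rather than from the whole record, and return periods would then be derived from the event statistics.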

  20. LANDMON a new integrated system for the management of landslide monitoring networks

    NASA Astrophysics Data System (ADS)

    Wrzesniak, Aleksandra; Giordan, Daniele; Allasia, Paolo

    2017-04-01

    Over the last decades, technological development has strongly increased the number of instruments that can be used to monitor landslide phenomena. Robotized total stations, GB-InSAR and GPS are only a few examples of the devices that can be adapted to monitor the topographic changes due to mass movements. They are often organized in a complex network aimed at controlling physical parameters related to the evolution of landslide activity. The level of complexity of these monitoring networks increases with the number of newly available monitoring devices, and this can generate a paradox: the sources of data are so numerous and difficult to interpret that a full understanding of the phenomenon can be hampered. The Geohazard Monitoring Group (GMG) of the Italian National Research Council (CNR) has long experience in landslide monitoring. Over the years, the GMG has developed a multidisciplinary approach to landslide management called LANDMON (LANDslide MOnitoring Network). It is an automatic hybrid system focused not only on capturing and processing data from monitored sites but also on web applications and on publishing bulletins aimed at disseminating monitoring results and supporting decision makers. LANDMON is currently active at many landslide sites distributed across several areas in Italy and in Europe. LANDMON derives from previously developed systems such as ADVICE (ADVanced dIsplaCement monitoring system for Early warning) and 3DA (three-dimensional Displacement Analysis). These systems are designed to collect and process monitoring datasets, to manage early warning applications based on pre-defined thresholds, and to publish three-dimensional displacement maps in near real time. In addition, LANDMON integrates several new features, such as a WebGIS application, modelling using the inverse velocity method, and management of webcam monitoring systems, meteorological parameters and borehole inclinometric data.
Moreover, LANDMON includes a communication strategy that focuses on the dissemination of landslide monitoring results in a user-friendly form. Such results are usually very complex and dedicated exclusively to professionals with the proper background; this can be inefficient during the management of emergencies, especially when groups of non-expert people (decision and policy makers, the population) are involved. For this purpose, an automatic procedure to produce a single-page bulletin has been developed. The algorithm performs the analysis of complex data regarding the activity of the monitored landslide, but the results shared with end users of LANDMON are summarized in a clear, illustrative and quickly interpretable manner. LANDMON is therefore a complex, complete and self-contained system, designed for efficient acquisition, analysis, representation and appropriate dissemination of monitoring data.

  1. Rainfall estimation for real time flood monitoring using geostationary meteorological satellite data

    NASA Astrophysics Data System (ADS)

    Veerakachen, Watcharee; Raksapatcharawong, Mongkol

    2015-09-01

    Rainfall estimation from geostationary meteorological satellite data provides good spatial and temporal resolution. This is advantageous for real-time flood monitoring and warning systems. However, a rainfall estimation algorithm developed for one region needs to be adjusted for another climatic region. This work proposes computationally efficient rainfall estimation algorithms based on an Infrared Threshold Rainfall (ITR) method calibrated with regional ground truth. Hourly rain gauge data collected from 70 stations around the Chao-Phraya river basin were used for calibration and validation of the algorithms. The algorithm inputs were derived from FY-2E satellite observations consisting of infrared and water vapor imagery. The results were compared with the Global Satellite Mapping of Precipitation (GSMaP) near-real-time product (GSMaP_NRT) using the probability of detection (POD), root mean square error (RMSE) and linear correlation coefficient (CC) as performance indices. Comparison with the GSMaP_NRT product for real-time monitoring purposes shows that hourly rain estimates from the proposed algorithm with the error-adjustment technique (ITR_EA) offer higher POD and approximately the same RMSE and CC, with lower data latency.
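The three verification scores used above are standard and easy to compute from paired gauge/satellite series. A sketch, assuming a hypothetical 0.1 mm/h rain/no-rain threshold for the POD contingency counts (the study's actual threshold is not given in the abstract):

```python
import numpy as np

def verify(est, obs, rain_thresh=0.1):
    """Probability of detection (POD), RMSE and Pearson correlation (CC)
    for satellite rain estimates against gauge observations."""
    est, obs = np.asarray(est, float), np.asarray(obs, float)
    hits = np.sum((obs >= rain_thresh) & (est >= rain_thresh))
    misses = np.sum((obs >= rain_thresh) & (est < rain_thresh))
    pod = hits / (hits + misses)
    rmse = float(np.sqrt(np.mean((est - obs) ** 2)))
    cc = float(np.corrcoef(est, obs)[0, 1])
    return pod, rmse, cc

# Toy hourly series (mm/h): gauge observations vs satellite estimates
obs = [0.0, 1.0, 5.0, 0.0, 2.0]
est = [0.0, 1.5, 4.0, 0.5, 0.0]
pod, rmse, cc = verify(est, obs)
print(f"POD={pod:.2f} RMSE={rmse:.2f} CC={cc:.2f}")
```

POD rewards catching rain events regardless of amount, while RMSE and CC score the estimated amounts, which is why the three indices are reported together.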

  2. The Canarian Seismic Monitoring Network: design, development and first result

    NASA Astrophysics Data System (ADS)

    D'Auria, Luca; Barrancos, José; Padilla, Germán D.; García-Hernández, Rubén; Pérez, Aaron; Pérez, Nemesio M.

    2017-04-01

    Tenerife is an active volcanic island which experienced several eruptions of moderate intensity in historical times and a few explosive eruptions in the Holocene. The increasing population density and the large number of tourists are constantly raising the volcanic risk. In June 2016 Instituto Volcanologico de Canarias started the deployment of a seismological volcano monitoring network consisting of 15 broadband seismic stations. The network became fully operational in November 2016. The aims of the network are both volcano monitoring and scientific research. Currently, data are continuously recorded and processed in real time. Seismograms, hypocentral parameters, statistical information about the seismicity and other data are published on a web page. We show the technical characteristics of the network and an estimate of its detection threshold and earthquake location performance. Furthermore, we present other near-real-time procedures applied to the data: analysis of ambient noise to determine the shallow velocity model and temporal velocity variations, detection of earthquake multiplets through massive data mining of the seismograms, and automatic relocation of events through double-difference location.

  3. Extended high-frequency thresholds in college students: effects of music player use and other recreational noise.

    PubMed

    Le Prell, Colleen G; Spankovich, Christopher; Lobariñas, Edward; Griffiths, Scott K

    2013-09-01

    Human hearing is sensitive to sounds from as low as 20 Hz to as high as 20,000 Hz in normal ears. However, clinical tests of human hearing rarely include extended high-frequency (EHF) threshold assessments, at frequencies extending beyond 8000 Hz. EHF thresholds have been suggested for use in monitoring the earliest effects of noise on the inner ear, although the clinical usefulness of EHF threshold testing is not well established for this purpose. The primary objective of this study was to determine if EHF thresholds in healthy, young adult college students vary as a function of recreational noise exposure. A retrospective analysis of a laboratory database was conducted; all participants with both EHF threshold testing and noise history data were included. The potential for "preclinical" EHF deficits was assessed based on the measured thresholds, with the noise surveys used to estimate recreational noise exposure. EHF thresholds measured during participation in other ongoing studies were available from 87 participants (34 male and 53 female); all participants had hearing within normal clinical limits (≤25 dB HL) at conventional frequencies (0.25-8 kHz). EHF thresholds closely matched standard reference thresholds [ANSI S3.6 (1996) Annex C]. There were statistically reliable threshold differences in participants who used music players, with 3-6 dB worse thresholds at the highest test frequencies (10-16 kHz) in participants who reported long-term use of music player devices (>5 yr) or higher listening levels during music player use. It should be possible to detect small changes in high-frequency hearing for patients or participants who undergo repeated testing at periodic intervals. However, the increased population-level variability in thresholds at the highest frequencies will make it difficult to identify the presence of small but potentially important deficits in otherwise normal-hearing individuals who do not have previously established baseline data. 
American Academy of Audiology.

  4. Error, Power, and Blind Sentinels: The Statistics of Seagrass Monitoring

    PubMed Central

    Schultz, Stewart T.; Kruschel, Claudia; Bakran-Petricioli, Tatjana; Petricioli, Donat

    2015-01-01

    We derive statistical properties of standard methods for monitoring habitat cover worldwide, and criticize them in the context of mandated seagrass monitoring programs, as exemplified by Posidonia oceanica in the Mediterranean Sea. We report the novel result that cartographic methods with non-trivial classification errors are generally incapable of reliably detecting habitat cover losses of less than about 30 to 50%, and the field labor required to increase their precision can be orders of magnitude higher than that required to estimate habitat loss directly in a field campaign. We derive a universal utility threshold of classification error in habitat maps that represents the minimum habitat map accuracy above which direct methods are superior. Widespread government reliance on blind-sentinel methods for monitoring the seafloor can obscure the gradual and currently ongoing losses of benthic resources until the time has long passed for meaningful management intervention. We find two classes of methods with very high statistical power for detecting small habitat cover losses: 1) fixed-plot direct methods, which are over 100 times as efficient as direct random-plot methods in a variable habitat mosaic; and 2) remote methods with very low classification error, such as geospatial underwater videography, an emerging, low-cost, non-destructive method for documenting small changes at millimeter visual resolution. General adoption of these methods and their further development will require a fundamental cultural change in conservation and management bodies towards recognizing and promoting minimum requirements of statistical power and precision in the development of international goals for monitoring these valuable resources and the ecological services they provide. PMID:26367863

  5. Extended range radiation dose-rate monitor

    DOEpatents

    Valentine, Kenneth H.

    1988-01-01

    An extended range dose-rate monitor is provided which utilizes the pulse pileup phenomenon that occurs in conventional counting systems to alter the dynamic response of the system to extend the dose-rate counting range. The current pulses from a solid-state detector generated by radiation events are amplified and shaped prior to applying the pulses to the input of a comparator. The comparator generates one logic pulse for each input pulse which exceeds the comparator reference threshold. These pulses are integrated and applied to a meter calibrated to indicate the measured dose-rate in response to the integrator output. A portion of the output signal from the integrator is fed back to vary the comparator reference threshold in proportion to the output count rate to extend the sensitive dynamic detection range by delaying the asymptotic approach of the integrator output toward full scale as measured by the meter.
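The feedback described in the patent abstract (the integrator output raising the comparator reference in proportion to the count rate) can be illustrated with a toy discrete-time loop. All gains and time constants below are invented for illustration, not taken from the patent:

```python
def count_rate_response(input_rate, gain=0.5, steps=2000, dt=1e-3, tau=0.05):
    """Toy model of the extended-range loop: the integrator output y tracks
    the comparator count rate, while feedback raises the comparator
    reference in proportion to y, compressing the high-rate response.
    All parameter values are illustrative."""
    y = 0.0
    for _ in range(steps):
        threshold = 1.0 + gain * y          # feedback raises the reference
        detected = input_rate / threshold   # fewer pulses clear a higher bar
        y += (detected - y) * dt / tau      # RC-style integrator update
    return y
```

With `gain=0` the steady-state output equals the input rate (a linear meter); with feedback it grows roughly as the square root of the input rate, which is what delays the asymptotic approach of the meter toward full scale.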

  6. Differential pathologies resulting from sound exposure: Tinnitus vs hearing loss

    NASA Astrophysics Data System (ADS)

    Longenecker, Ryan James

    The first step in identifying the mechanism(s) responsible for tinnitus development would be to discover a neural correlate that is differentially expressed in tinnitus-positive compared to tinnitus-negative animals. Previous research has identified several neural correlates of tinnitus in animals that have tested positive for tinnitus. However, it is unknown whether all or some of these correlates are linked to tinnitus or whether they are a byproduct of hearing loss, a common outcome of tinnitus induction. Abnormally high spontaneous activity has frequently been linked to tinnitus. However, while some studies demonstrate that hyperactivity positively correlates with behavioral evidence of tinnitus, others show that although all animals develop hyperactivity after sound exposure, not all exposed animals show evidence of tinnitus. My working hypothesis is that certain aspects of hyperactivity are linked to tinnitus while other aspects are linked to hearing loss. The first specific aim utilized gap-induced prepulse inhibition of the acoustic startle reflex (GIPAS) to monitor the development of tinnitus in CBA/CaJ mice during one year following sound exposure. Immediately after sound exposure, GIPAS testing revealed widespread gap detection deficits across all frequencies, likely due to temporary threshold shifts. However, three months after sound exposure these deficits were limited to a narrow frequency band and were consistently detected up to one year after exposure. This suggests the development of chronic tinnitus is a long-lasting and highly dynamic process. The second specific aim assessed hearing loss in sound-exposed mice using several techniques. Auditory brainstem responses (ABRs) recorded immediately after sound exposure revealed large threshold deficits in all exposed mice. However, at the three-month time point thresholds returned to control levels in all mice, suggesting that ABRs are not a reliable tool for assessing permanent hearing loss. Input/output functions of the acoustic startle reflex showed that after sound exposure the magnitude of startle responses decreased in most mice, to varying degrees. Lastly, PPI audiometry was able to detect specific behavioral threshold deficits for each mouse after sound exposure. These deficits persisted past the initial threshold shifts, allowing detection of frequency-specific permanent threshold shifts. The third specific aim examined hyperactivity and increased bursting activity in the inferior colliculus after sound exposure in relation to tinnitus and hearing loss. Spontaneous firing rates were increased in all mice after sound exposure regardless of behavioral evidence of tinnitus. However, abnormally increased bursting activity was not found in the animals identified with tinnitus but was exhibited in a mouse with broadband severe threshold deficits. CBA/CaJ mice are a good model for both tinnitus development and noise-induced hearing loss studies. Hyperactivity, which was evident in all exposed animals, does not seem to be well correlated with behavioral evidence of tinnitus and is more likely a general result of acoustic overexposure. Data from one animal strongly suggest that widespread severe threshold deficits are linked to an elevation of bursting activity predominantly ipsilateral to the side of sound exposure. This result is intriguing and should be followed up in further studies. Data obtained in this study provide new insights into underlying neural pathologies following sound exposure and have possible clinical applications for the development of effective treatments and diagnostic tools for tinnitus and hearing loss.

  7. Assessing and Adapting LiDAR-Derived Pit-Free Canopy Height Model Algorithm for Sites with Varying Vegetation Structure

    NASA Astrophysics Data System (ADS)

    Scholl, V.; Hulslander, D.; Goulden, T.; Wasser, L. A.

    2015-12-01

    Spatial and temporal monitoring of vegetation structure is important to the ecological community. Airborne Light Detection and Ranging (LiDAR) systems are used to efficiently survey large forested areas. From LiDAR data, three-dimensional models of forests called canopy height models (CHMs) are generated and used to estimate tree height. A common problem associated with CHMs is data pits, where LiDAR pulses penetrate the top of the canopy, leading to an underestimation of vegetation height. The National Ecological Observatory Network (NEON) currently implements an algorithm to reduce data pit frequency, which requires two height threshold parameters: increment size and range ceiling. CHMs are produced at a series of height increments up to a height range ceiling and combined to produce a CHM with reduced pits (referred to as a "pit-free" CHM). The current implementation uses static values for the height increment and ceiling (5 and 15 meters, respectively). To facilitate the generation of accurate pit-free CHMs across diverse NEON sites with varying vegetation structure, the impacts of adjusting the height threshold parameters were investigated through development of an algorithm that dynamically selects the height increment and ceiling. A series of pit-free CHMs were generated using three height range ceilings and four height increment values for three ecologically different sites. Height threshold parameters were found to change CHM-derived tree heights by up to 36% compared to the original CHMs. The extent of the parameters' influence on modelled tree heights was greater than expected, which will be considered during future CHM data product development at NEON. [Figure: (A) aerial image of Harvard National Forest; (B) standard CHM containing pits, appearing as black speckles; (C) pit-free CHM created with the static algorithm implementation; (D) pit-free CHM created by varying the height threshold ceiling up to 82 m and the increment to 1 m.]
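The partial-CHM idea can be sketched minimally: rasters are rebuilt with returns below successive height cutoffs removed, cells emptied by the filter are filled from their neighbours, and the stack is combined cell-wise by maximum. This is a grid-fill simplification of the general pit-free approach described above (production pipelines typically use TIN interpolation); all names and the toy point cloud are illustrative:

```python
def rasterize(points, cutoff, shape):
    """Max return height per grid cell, using only returns >= cutoff (m).
    Cells emptied by the filter are filled from their 4-neighbours."""
    g = [[None] * shape[1] for _ in range(shape[0])]
    for x, y, z in points:
        if z >= cutoff and (g[x][y] is None or z > g[x][y]):
            g[x][y] = z
    for i in range(shape[0]):          # one simple interpolation pass
        for j in range(shape[1]):
            if g[i][j] is None:
                nb = [g[a][b]
                      for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                      if 0 <= a < shape[0] and 0 <= b < shape[1]
                      and g[a][b] is not None]
                if nb:
                    g[i][j] = sum(nb) / len(nb)
    return g

def pit_free_chm(points, shape, increment=5.0, ceiling=15.0):
    """Cell-wise maximum over the stack of threshold-filtered rasters,
    using the static NEON parameters (5 m increment, 15 m ceiling)."""
    cutoffs, c = [0.0], increment
    while c <= ceiling:
        cutoffs.append(c)
        c += increment
    chm = [[0.0] * shape[1] for _ in range(shape[0])]
    for cutoff in cutoffs:
        partial = rasterize(points, cutoff, shape)
        for i in range(shape[0]):
            for j in range(shape[1]):
                v = partial[i][j]
                if v is not None and v > chm[i][j]:
                    chm[i][j] = v
    return chm
```

A cell whose only return is a low ground hit (a pit) drops out of the higher-cutoff rasters and is filled from its canopy neighbours, so the combined CHM no longer underestimates its height.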

  8. High-frequency (8 to 16 kHz) reference thresholds and intrasubject threshold variability relative to ototoxicity criteria using a Sennheiser HDA 200 earphone.

    PubMed

    Frank, T

    2001-04-01

    The first purpose of this study was to determine high-frequency (8 to 16 kHz) thresholds for standardizing reference equivalent threshold sound pressure levels (RETSPLs) for a Sennheiser HDA 200 earphone. The second and perhaps more important purpose of this study was to determine whether repeated high-frequency thresholds using a Sennheiser HDA 200 earphone had a lower intrasubject threshold variability than the ASHA 1994 significant threshold shift criteria for ototoxicity. High-frequency thresholds (8 to 16 kHz) were obtained for 100 (50 male, 50 female) normally hearing (0.25 to 8 kHz) young adults (mean age of 21.2 yr) in four separate test sessions using a Sennheiser HDA 200 earphone. The mean and median high-frequency thresholds were similar for each test session and increased as frequency increased. At each frequency, the high-frequency thresholds were not significantly (p > 0.05) different for gender, test ear, or test session. The median thresholds at each frequency were similar to the 1998 interim ISO RETSPLs; however, large standard deviations and wide threshold distributions indicated very high intersubject threshold variability, especially at 14 and 16 kHz. Threshold repeatability was determined by finding the threshold differences between each possible test session comparison (N = 6). About 98% of all of the threshold differences were within a clinically acceptable range of +/-10 dB from 8 to 14 kHz. The threshold differences between each subject's second, third, and fourth minus their first test session were also found to determine whether intrasubject threshold variability was less than the ASHA 1994 criteria for determining a significant threshold shift due to ototoxicity. The results indicated a false-positive rate of 0% for a threshold shift > or = 20 dB at any frequency and a false-positive rate of 2% for a threshold shift >10 dB at two consecutive frequencies. 
This study verified that the output of high-frequency audiometers at 0 dB HL using Sennheiser HDA 200 earphones should equal the 1998 interim ISO RETSPLs from 8 to 16 kHz. Further, because the differences between repeated thresholds were well within +/-10 dB and had an extremely low false-positive rate in reference to the ASHA 1994 criteria for a significant threshold shift due to ototoxicity, a Sennheiser HDA 200 earphone can be used for serial monitoring to determine whether significant high-frequency threshold shifts have occurred for patients receiving potentially ototoxic drug therapy.
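A serial-monitoring check against the two ASHA (1994) shift criteria cited above can be sketched as follows (thresholds in dB HL at the same ordered test frequencies; the third ASHA criterion, loss of response at three consecutive frequencies, is omitted here):

```python
def asha_significant_shift(baseline, current):
    """True if the change from baseline meets either ASHA (1994) criterion
    used in the study: a >=20 dB worsening at any single frequency, or a
    >=10 dB worsening at two consecutive test frequencies.
    Worsening is current - baseline (higher threshold = worse hearing)."""
    shifts = [c - b for b, c in zip(baseline, current)]
    if any(s >= 20 for s in shifts):
        return True
    return any(a >= 10 and b >= 10 for a, b in zip(shifts, shifts[1:]))
```

The third example below stays within the +/-10 dB test-retest range reported in the study and correctly produces no ototoxicity flag.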

  9. 24 CFR 599.301 - Initial determination of threshold requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 3 2011-04-01 2010-04-01 true Initial determination of threshold... Nominating Renewal Communities § 599.301 Initial determination of threshold requirements. (a) Two threshold... meets both of the following thresholds: (1) Eligibility of the nominated area. This threshold is met if...

  10. A Hydrologically-based Method for Calculating Sustainable Yield under California's Sustainable Groundwater Management Act

    NASA Astrophysics Data System (ADS)

    Miro, M.; Famiglietti, J. S.

    2016-12-01

    In California, traditional water management has focused heavily on surface water, leaving many basins in a state of critical overdraft and lacking established frameworks for groundwater management. However, new groundwater legislation, the 2014 Sustainable Groundwater Management Act (SGMA), presents an important opportunity for water managers and hydrologists to develop novel methods for managing statewide groundwater resources. Integrating scientific advances in groundwater monitoring with hydrologically sound methods can go a long way toward creating a system that can better govern the resource. SGMA mandates that groundwater management agencies employ the concept of sustainable yield as their primary management goal but does not clearly define a method to calculate it. This study will develop a hydrologically based method to quantify sustainable yield that follows the threshold framework under SGMA. Using this method, sustainable yield will be calculated for two critically overdrafted groundwater basins in California's Central Valley. The method will also utilize groundwater monitoring data and downscaled remote sensing estimates of groundwater storage change from NASA's GRACE satellite to illustrate why data matter for successful management. This method can be used as a basis for the development of SGMA's groundwater sustainability plans (GSPs) throughout California.

  11. Consistent and reproducible positioning in longitudinal imaging for phenotyping genetically modified swine

    NASA Astrophysics Data System (ADS)

    Hammond, Emily; Dilger, Samantha K. N.; Stoyles, Nicholas; Judisch, Alexandra; Morgan, John; Sieren, Jessica C.

    2015-03-01

    Recent growth of genetic disease models in swine has presented the opportunity to advance translation of developed imaging protocols while characterizing the genotype-phenotype relationship. Repeated imaging with multiple clinical modalities provides non-invasive detection, diagnosis, and monitoring of disease to accomplish these goals; however, longitudinal scanning requires repeatable and reproducible positioning of the animals. A modular positioning unit was designed to provide a fixed, stable base for the anesthetized animal through transit and imaging. After ventilation and sedation, animals were placed supine in the unit and monitored for consistent vitals. Comprehensive imaging was performed with a computed tomography (CT) chest-abdomen-pelvis scan at each screening time point. Longitudinal images were rigidly registered, accounting for rotation, translation, and anisotropic scaling, and the skeleton was isolated using a basic thresholding algorithm. Alignment was quantified via eleven pairs of corresponding points on the skeleton, with the first time point as the reference. Results were obtained for five animals over five screening time points. The developed unit aided skeletal alignment to within an average of 13.13 +/- 6.7 mm across all five subjects, providing a strong foundation for developing qualitative and quantitative methods of disease tracking.

  12. Spatial-temporal distortion metric for in-service quality monitoring of any digital video system

    NASA Astrophysics Data System (ADS)

    Wolf, Stephen; Pinson, Margaret H.

    1999-11-01

    Many organizations have focused on developing digital video quality metrics which produce results that accurately emulate subjective responses. However, to be widely applicable a metric must also work over a wide range of quality and be useful for in-service quality monitoring. The Institute for Telecommunication Sciences (ITS) has developed spatial-temporal distortion metrics that meet all of these requirements. These objective metrics are described in detail and have a number of interesting properties, including utilization of (1) spatial activity filters which emphasize long edges on the order of 10 arc min while simultaneously performing large amounts of noise suppression, (2) the angular direction of the spatial gradient, (3) spatial-temporal compression factors of at least 384:1 (spatial compression of at least 64:1 and temporal compression of at least 6:1), and (4) simple perceptibility thresholds and spatial-temporal masking functions. Results are presented that compare the objective metric values with mean opinion scores from a wide range of subjective databases spanning many different scenes, systems, bit rates, and applications.

  13. The salt-taste threshold in untreated hypertensive patients.

    PubMed

    Kim, Chang-Yeon; Ye, Mi-Kyung; Lee, Young Soo

    2017-01-01

    The salt-taste threshold can influence salt appetite and is thought to be another marker of sodium intake. Many studies have described the relationship between sodium intake and blood pressure (BP). The aim of this study was to evaluate the relationship between the salt-taste threshold and urinary sodium excretion in normotensive and hypertensive groups. We analyzed 199 patients (mean age 52 years, 47.3% male) who underwent 24-h ambulatory BP monitoring (ABPM). Hypertension was diagnosed as an average daytime systolic BP of ≥135 mmHg or diastolic BP of ≥85 mmHg by the ABPM. We assessed the salt-taste threshold using graded saline solutions. The salt-taste threshold, 24-h urinary sodium and potassium excretion, and echocardiographic data were compared between the control and hypertensive groups. The detection and recognition thresholds of the salt taste did not significantly differ between the control and hypertensive groups. The 24-h urinary sodium excretion of hypertensive patients was significantly higher than that of the control group (140.9 ± 59.8 vs. 117.9 ± 57.2 mEq/day, respectively, p = 0.011). The urinary sodium-potassium ratio was also significantly higher in the hypertensive patients. There was no correlation between the salt-taste threshold and 24-h urinary sodium excretion. Thus, the salt-taste threshold appears to be related neither to BP status nor to 24-h urinary sodium excretion.

  14. Evaluation of surveillance methods for monitoring house fly abundance and activity on large commercial dairy operations.

    PubMed

    Gerry, Alec C; Higginbotham, G E; Periera, L N; Lam, A; Shelton, C R

    2011-06-01

    Relative house fly, Musca domestica L., activity at three large dairies in central California was monitored during the peak fly activity period from June to August 2005 by using spot cards, fly tapes, bait traps, and Alsynite traps. Counts for all monitoring methods were significantly related at two of three dairies, with spot card counts significantly related to fly tape counts recorded the same week, and both spot card counts and fly tape counts significantly related to bait trap counts 1-2 wk later. Mean fly counts differed significantly between dairies, but a significant interaction between dairies sampled and monitoring methods used demonstrates that between-dairy comparisons are unwise. Estimate precision was determined by the coefficient of variability (CV = SE/mean). Using CV = 0.15 as a desired level of estimate precision and assuming an integrated pest management (IPM) action threshold near the peak house fly activity measured by each monitoring method, house fly monitoring at a large dairy would require 12 spot cards placed in midafternoon shaded fly resting sites near cattle or seven bait traps placed in open areas near cattle. Software (FlySpotter; http://ucanr.org/sites/FlySpotter/download/) using computer vision technology was developed to count fly spots on a scanned image of a spot card to dramatically reduce the time invested in monitoring house flies. Counts provided by the FlySpotter software were highly correlated with visual counts. The use of spot cards for monitoring house flies is recommended for dairy IPM programs.
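The sample-size logic behind the CV = 0.15 target can be reproduced from pilot counts: with CV = SE/mean = s / (mean * sqrt(n)), the required number of stations is n >= (s / (mean * CV_target))^2. A stdlib sketch (the pilot counts in the usage example are invented):

```python
import math

def stations_needed(counts, cv_target=0.15):
    """Number of replicate monitoring stations required so the standard
    error of the mean count stays within cv_target of the mean:
    CV = SE/mean = s / (mean * sqrt(n))."""
    n = len(counts)
    mean = sum(counts) / n
    s = math.sqrt(sum((c - mean) ** 2 for c in counts) / (n - 1))
    return math.ceil((s / (mean * cv_target)) ** 2)
```

Applied to a pilot set of spot-card or trap counts, this gives the per-method station counts the study reports (12 spot cards, 7 bait traps for these dairies).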

  15. Non-Invasive Ultrasonic Diagnosing and Monitoring of Intracranial Pressure/Volume

    DTIC Science & Technology

    2001-10-01

    these risks, intracranial pressure is usually monitored only in severely head-injured patients, and only if there is clinical or CT or MRI ...proportion of patients for whom the computed tomography (CT) or magnetic resonance imaging (MRI) scan does not show evidence of raised pressure...thresholds and the Cushing phenomenon induced by upper brainstem ischemia. During ICH it was demonstrated that there is an increase of sympathetic

  16. Ink dating part II: Interpretation of results in a legal perspective.

    PubMed

    Koenig, Agnès; Weyermann, Céline

    2018-01-01

    The development of an ink dating method requires an important investment of resources in order to step from the monitoring of ink ageing on paper to the determination of the actual age of a questioned ink entry. This article aimed at developing and evaluating the potential of three interpretation models to date ink entries in a legal perspective: (1) the threshold model, comparing analytical results to tabulated values in order to determine the maximal possible age of an ink entry; (2) the trend tests, focusing on the "ageing status" of an ink entry; and (3) the likelihood ratio calculation, comparing the probabilities of observing the results under at least two alternative hypotheses. This is the first report showing ink dating interpretation results on a ballpoint pen ink reference population. In the first part of this paper, three ageing parameters were selected as promising from the population of 25 ink entries aged from 4 to 304 days: the quantity of phenoxyethanol (PE), the difference between the PE quantities contained in a naturally aged sample and an artificially aged sample (RNORM), and the solvent loss ratio (R%). In the current part, each model was tested using the three selected ageing parameters. Results showed that threshold definition remains a simple model easily applicable in practice, but that the risk of false positives cannot be completely avoided without significantly reducing the feasibility of the ink dating approaches. The trend tests from the literature showed unreliable results, and an alternative had to be developed, yielding encouraging results. The likelihood ratio calculation introduced a degree of certainty into the ink dating conclusion in comparison to the threshold approach. The proposed model remains quite simple to apply in practice, but should be further developed in order to yield reliable results. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  17. Risk assessment for pesticide contamination of groundwater with sparse available data

    NASA Astrophysics Data System (ADS)

    Bardowicks, K.; Heredia, O.; Billib, M.; Fernández Cirelli, A.; Boochs, P.

    2009-04-01

    The contamination of water resources by agrochemicals is recognized in industrial countries as a very important environmental problem; nevertheless, in most developing and threshold countries the associated health and environmental risks are not considered. In these countries agrochemicals that have been banned in Europe for several years (e.g. atrazine) are still in use. In some threshold countries monitoring systems are already installed for nutrients (N, P) and a few for heavy metals, but so far contamination by pesticides is hardly ever controlled, so no data are available on pesticide concentrations in soil and water. The aim of this research is to develop a methodology to show farmers and other water users (water agencies, drinking water supply companies) in basins of developing or threshold countries with sparse available data the risk of contamination of groundwater resources by pesticides. A few data, such as pesticide application, precipitation, irrigation, potential evaporation and soil types, are available in some regions. If these data are reliable, they can be used together with some justified estimated parameters to simulate the fate of pesticides to the groundwater. Therefore, in two case studies in Argentina and Chile, pesticide models (e.g. PESTAN, IPTM-CS) were used to evaluate the risk of contamination of the groundwater. The results were compared with contamination indicators, like one developed by O. Heredia, to check their plausibility. Afterwards, the results of the models were used as input data for simulations at the catchment scale, for instance with a groundwater simulation model (VISUAL MODFLOW). The results show a great risk of contamination of the groundwater resources in the selected study areas, especially by atrazine. Accordingly, the findings will be used by local researchers to improve the knowledge and awareness of farmers and other stakeholders about the contamination of water resources by pesticides.

  18. The development and application of an injury prediction model for noncontact, soft-tissue injuries in elite collision sport athletes.

    PubMed

    Gabbett, Tim J

    2010-10-01

    Limited information exists on the training dose-response relationship in elite collision sport athletes. In addition, no study has developed an injury prediction model for collision sport athletes. The purpose of this study was to develop an injury prediction model for noncontact, soft-tissue injuries in elite collision sport athletes. Ninety-one professional rugby league players participated in this 4-year prospective study. This study was conducted in 2 phases. Firstly, training load and injury data were prospectively recorded over 2 competitive seasons in elite collision sport athletes. Training load and injury data were modeled using a logistic regression model with a binomial distribution (injury vs. no injury) and logit link function. Secondly, training load and injury data were prospectively recorded over a further 2 competitive seasons in the same cohort of elite collision sport athletes. An injury prediction model based on planned and actual training loads was developed and implemented to determine if noncontact, soft-tissue injuries could be predicted and therefore prevented in elite collision sport athletes. Players were 50-80% likely to sustain a preseason injury within the training load range of 3,000-5,000 units. These training load 'thresholds' were considerably reduced (1,700-3,000 units) in the late-competition phase of the season. A total of 159 noncontact, soft-tissue injuries were sustained over the latter 2 seasons. The percentage of true positive predictions was 62.3% (n = 121), whereas the total number of false positive and false negative predictions was 20 and 18, respectively. Players that exceeded the training load threshold were 70 times more likely to test positive for noncontact, soft-tissue injury, whereas players that did not exceed the training load threshold were injured 1/10 as often. 
These findings provide information on the training dose-response relationship and a scientific method of monitoring and regulating training load in elite collision sport athletes.
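The logistic dose-response model and the threshold alert rule described above can be sketched as follows; the coefficients and the 3,000-unit preseason threshold are illustrative placeholders, not the fitted values from the study:

```python
import math

def injury_probability(load, intercept=-6.0, slope=0.0015):
    """Logistic dose-response: P(injury) = 1 / (1 + exp(-(b0 + b1*load))).
    Coefficients here are invented placeholders, not the study's fit."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * load)))

def flag_session(planned_load, threshold=3000):
    """Alert rule: flag a planned session whose load exceeds the current
    phase threshold (the study reports lower thresholds, 1,700-3,000
    units, in the late-competition phase than in preseason)."""
    return planned_load > threshold
```

Planned loads can be screened before each session, so a flagged session can be modified before it is performed rather than after an injury occurs.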

  19. Study on De-noising Technology of Radar Life Signal

    NASA Astrophysics Data System (ADS)

    Yang, Xiu-Fang; Wang, Lian-Huan; Ma, Jiang-Fei; Wang, Pei-Pei

    2016-05-01

    Radar detection is a novel life detection technology that can be applied to medical monitoring, anti-terrorism, street fighting, disaster relief, etc. As the radar life signal is very weak, it is often submerged in noise. Because of the non-stationarity and randomness of these clutter signals, it is necessary to de-noise efficiently before extracting and separating the useful signal. This paper improves the theoretical continuous-wave model of the radar life signal, performs de-noising by introducing the lifting wavelet transform, and determines the best threshold function by comparing the de-noising effects of different threshold functions. The results indicate that both the SNR and MSE of the signal are better than with traditional methods when the lifting wavelet transform and a new improved soft-threshold de-noising function are used.
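The paper's improved threshold function is not specified in the abstract; for reference, the classic soft-threshold rule it builds on shrinks each detail coefficient toward zero and zeroes anything at or below the threshold:

```python
def soft_threshold(coeffs, t):
    """Soft-threshold rule: coefficients with |c| <= t are zeroed, the
    rest are shrunk toward zero by t. In a full pipeline this is applied
    to the detail coefficients of the (lifting) wavelet decomposition
    before reconstruction."""
    return [0.0 if abs(c) <= t else (c - t if c > 0 else c + t)
            for c in coeffs]
```

Unlike hard thresholding, the shrinkage keeps the de-noised signal continuous in the coefficients, which is why soft-style rules are the usual starting point for "improved" variants.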

  20. Integration of community structure data reveals observable effects below sediment guideline thresholds in a large estuary.

    PubMed

    Tremblay, Louis A; Clark, Dana; Sinner, Jim; Ellis, Joanne I

    2017-09-20

    The sustainable management of estuarine and coastal ecosystems requires robust frameworks due to the presence of multiple physical and chemical stressors. In this study, we assessed whether ecological health decline, based on community structure composition changes along a pollution gradient, occurred at levels below guideline threshold values for copper, zinc and lead. Canonical analysis of principal coordinates (CAP) was used to characterise benthic communities along a metal contamination gradient. The analysis revealed changes in benthic community distribution at levels below the individual guideline values for the three metals. These results suggest that field-based measures of ecological health analysed with multivariate tools can provide additional information to single metal guideline threshold values to monitor large systems exposed to multiple stressors.

  1. An approach to derive groundwater and stream threshold values for total nitrogen and ensure good ecological status of associated aquatic ecosystems - example from a coastal catchment to a vulnerable Danish estuary.

    NASA Astrophysics Data System (ADS)

    Hinsby, Klaus; Markager, Stiig; Kronvang, Brian; Windolf, Jørgen; Sonnenborg, Torben; Sørensen, Lærke

    2015-04-01

    Nitrate, which typically makes up the major part (~>90%) of dissolved inorganic nitrogen in groundwater and surface water, is the pollutant most frequently responsible for European groundwater bodies failing to meet the good-status objectives of the European Water Framework Directive, generally judged by comparing groundwater monitoring data with the nitrate quality standard of the Groundwater Directive (50 mg/l, the WHO drinking water standard). Still, while more than 50% of European surface water bodies do not meet the objective of good ecological status, "only" 25% of groundwater bodies do not meet the objective of good chemical status according to the river basin management plans reported by the EU member states. However, based on a study of interactions between groundwater, streams and a Danish estuary, we argue that nitrate threshold values for aerobic groundwater often need to be significantly below the nitrate quality standard to ensure good ecological status of associated surface water bodies, and hence that the chemical status of European groundwater is worse than indicated by the present assessments. Here we suggest a methodology for deriving groundwater and stream threshold values for total nitrogen ("nitrate") in a coastal catchment based on an assessment of the maximum acceptable nitrogen loadings (thresholds) to the associated vulnerable estuary. The applied method uses existing information on agricultural practices and point-source emissions in the catchment together with groundwater and stream quantity and quality monitoring data, all of which feed an integrated groundwater and surface-water modelling tool. This enables an assessment of total nitrogen loads and the derivation of threshold concentrations that ensure or restore good ecological status of the investigated estuary. For the catchment of the Horsens estuary in Denmark, we estimate the stream and groundwater thresholds for total nitrogen to be about 13 and 27 mg/l (~12 and 25 mg/l of nitrate), respectively. 
The example shown derives nitrogen threshold concentrations for groundwater and streams in a coastal catchment discharging to a vulnerable estuary in Denmark, but the principles may be applied to large river basins with sub-catchments in several countries, such as the Danube or the Rhine. In such cases the relevant countries need to collaborate on the derivation of nitrogen thresholds based on, for example, maximum acceptable nitrogen loadings to the Black Sea or the North Sea, and finally agree on thresholds for different parts of the river basin. Phosphorus is another nutrient which frequently causes or contributes to the eutrophication of surface waters. The transport and retention processes of total phosphorus (TP) are more complex than for nitrate (or, alternatively, total N), and presently we are able to establish TP thresholds for streams but not for groundwater. Derivation of TP thresholds is covered in an accompanying paper by Kronvang et al.
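The core conversion, turning a maximum acceptable load to the estuary into a flow-weighted concentration threshold for the catchment, can be sketched as below. The function name, unit handling, and example numbers are illustrative only and are not the study's integrated model:

```python
def threshold_concentration(max_load_t_per_yr, mean_discharge_m3_per_s):
    """Convert a maximum acceptable annual nitrogen load to an estuary
    (tonnes N/year) into a flow-weighted mean concentration threshold
    (mg/l) for the stream discharging to it."""
    seconds_per_year = 365.25 * 24 * 3600
    annual_flow_litres = mean_discharge_m3_per_s * seconds_per_year * 1000.0
    load_mg = max_load_t_per_yr * 1e9  # tonnes -> mg
    return load_mg / annual_flow_litres
```

A real application would replace the single mean discharge with modelled groundwater and stream fluxes, and would partition the threshold between diffuse and point sources, as the integrated modelling tool described above does.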

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savi, Daniel, E-mail: d.savi@umweltchemie.ch; Kasser, Ueli; Ott, Thomas

    Highlights: • We analysed data on the dismantling of electronic and electrical appliances. • Ten years of mass balance data from more than 30 recycling companies have been considered. • Percentages of dismantled batteries, capacitors and PWBs have been studied. • Threshold values and benchmarks for batteries and capacitors have been identified. • No benchmark for the dismantling of printed wiring boards should be set. - Abstract: The article compiles and analyses sample data for toxic components removed from waste electronic and electrical equipment (WEEE) from more than 30 recycling companies in Switzerland over the past ten years. According to European and Swiss legislation, toxic components like batteries, capacitors and printed wiring boards have to be removed from WEEE. The control bodies of the Swiss take-back schemes have been monitoring the activities of WEEE recyclers in Switzerland for about 15 years. All recyclers have to provide annual mass balance data for every year of operation. From these data, percentage shares of removed batteries and capacitors are calculated in relation to the amount of each respective WEEE category treated. A rationale is developed for why such an indicator should not be calculated for printed wiring boards. The distributions of these de-pollution indicators are analysed, and their suitability for defining lower threshold values and benchmarks for the de-pollution of WEEE is discussed. Recommendations for benchmarks and threshold values for the removal of capacitors and batteries are given.

  3. Biomechanical properties of concussions in high school football.

    PubMed

    Broglio, Steven P; Schnebel, Brock; Sosnoff, Jacob J; Shin, Sunghoon; Fend, Xingdong; He, Xuming; Zimmerman, Jerrad

    2010-11-01

    Sport concussion represents the majority of brain injuries occurring in the United States, with 1.6–3.8 million cases annually. Understanding the biomechanical properties of this injury will support the development of better diagnostics and preventative techniques. We monitored all football-related head impacts in 78 high school athletes (mean age = 16.7 yr) from 2005 to 2008 to better understand the biomechanical characteristics of concussive impacts. Using the Head Impact Telemetry System, a total of 54,247 impacts were recorded, and 13 concussive episodes were captured for analysis. A classification and regression tree analysis of impacts indicated that rotational acceleration (5582.3 rad·s⁻²), linear acceleration (96.1g), and impact location (front, top, and back) yielded the highest predictive value for concussion. These threshold values are nearly identical to those reported at the collegiate and professional level. If the Head Impact Telemetry System were implemented for medical use, sideline personnel could expect to diagnose one of every five athletes with a concussion when an impact exceeds these tolerance levels. Why all athletes did not sustain a concussion when impacts generated values in excess of our threshold criteria is not entirely clear, although individual differences between participants may play a role. A threshold for concussion in adolescent athletes similar to that of their collegiate and professional counterparts suggests an equal concussion risk at all levels of play.
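The classification-tree result reduces to a conjunctive threshold rule over the three variables named above. A minimal sketch follows; the thresholds are passed in as parameters rather than hard-coded, since this is an illustration of the rule's form, not a validated clinical tool:

```python
def flag_impact(linear_g, rotational, location,
                linear_thr, rotational_thr,
                risk_locations=("front", "top", "back")):
    """Flag an impact for concussion assessment only when all three
    tree-derived conditions hold simultaneously: linear acceleration,
    rotational acceleration, and a high-risk impact location."""
    return (linear_g >= linear_thr
            and rotational >= rotational_thr
            and location in risk_locations)
```

As the abstract notes, only about one in five impacts exceeding such thresholds corresponded to a diagnosed concussion, so a rule like this would trigger assessment, not diagnosis.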

  4. Infrared Instrument for Detecting Hydrogen Fires

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert; Ihlefeld, Curtis; Immer, Christopher; Oostdyk, Rebecca; Cox, Robert; Taylor, John

    2006-01-01

    The figure shows an instrument incorporating an infrared camera for detecting small hydrogen fires. The instrument has been developed as an improved replacement for prior infrared and ultraviolet instruments used to detect hydrogen fires. The need for this or any such instrument arises because hydrogen fires (e.g., those associated with leaks from tanks, valves, and ducts) pose a great danger, yet they emit so little visible light that they are mostly undetectable by the unaided human eye. The main performance advantage offered by the present instrument over prior hydrogen-fire-detecting instruments lies in its greater ability to avoid false alarms by discriminating against reflected infrared light, including that originating in (1) the Sun, (2) welding torches, and (3) deliberately ignited hydrogen flames (e.g., ullage-burn-off flames) that are nearby but outside the field of view intended to be monitored by the instrument. Like prior such instruments, this instrument is based mostly on the principle of detecting infrared emission above a threshold level. However, in addition, this instrument utilizes information on the spatial distribution of infrared light from a source that it detects. Because the combination of spatial and threshold information about a flame tends to constitute a unique signature that differs from that of reflected infrared light originating in a source not in the field of view, the incidence of false alarms is reduced substantially below that of related prior threshold- based instruments.

  5. The Natural History of IgE-Mediated Food Allergy: Can Skin Prick Tests and Serum-Specific IgE Predict the Resolution of Food Allergy?

    PubMed Central

    Peters, Rachel L.; Gurrin, Lyle C.; Dharmage, Shyamali C.; Koplin, Jennifer J.; Allen, Katrina J.

    2013-01-01

    IgE-mediated food allergy is a transient condition for some children; however, there are few indices to predict when and in whom food allergy will resolve. Skin prick test (SPT) and serum-specific IgE (sIgE) levels are usually monitored in the management of food allergy and are used to predict the development of tolerance or the persistence of food allergy. The aim of this article is to review the published literature investigating the predictive value of SPT and sIgE for the development of tolerance in children with a previous diagnosis of peanut, egg or milk allergy. A systematic search identified twenty-six studies, most of which reported SPT or sIgE thresholds that predicted persistent or resolved allergy. However, results were inconsistent between studies. Previous research was hampered by several limitations, including the absence of a gold-standard test to diagnose food allergy or tolerance, biased samples in retrospective audits, and a lack of systematic protocols for triggering re-challenges. There is a need for population-based, prospective studies that use the gold-standard oral food challenge (OFC) to diagnose food allergy at baseline and follow-up in order to develop SPT and sIgE thresholds that predict the course of food allergy. PMID:24132133

  6. Mathematics of quantitative kinetic PCR and the application of standard curves.

    PubMed

    Rutledge, R G; Côté, C

    2003-08-15

    Fluorescent monitoring of DNA amplification is the basis of real-time PCR, in which target DNA concentration is determined from the fractional cycle at which a threshold amount of amplicon DNA is produced. Absolute quantification can be achieved using a standard curve constructed by amplifying known amounts of target DNA. In this study, the mathematics of quantitative PCR are examined in detail, and several fundamental aspects of the threshold method and the application of standard curves are illustrated. The construction of five replicate standard curves for two pairs of nested primers was used to examine the reproducibility and degree of quantitative variation using SYBR Green I fluorescence. Based upon this analysis, the application of a single, well-constructed standard curve could provide an estimated precision of ±6-21%, depending on the number of cycles required to reach threshold. A simplified method for absolute quantification is also proposed, in which quantitative scale is determined by DNA mass at threshold.
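The threshold method with a standard curve amounts to a linear fit of threshold cycle (Ct) against log10 of the known input DNA, then inversion of that line for unknowns. A minimal sketch, using plain least squares with no qPCR-specific library assumed:

```python
def fit_standard_curve(log10_n0, ct):
    """Least-squares fit of Ct = slope * log10(N0) + intercept from
    amplifications of known target amounts (the standard curve)."""
    n = len(ct)
    mx = sum(log10_n0) / n
    my = sum(ct) / n
    sxx = sum((x - mx) ** 2 for x in log10_n0)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_n0, ct))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the standard curve: N0 = 10 ** ((Ct - intercept) / slope)."""
    return 10.0 ** ((ct - intercept) / slope)
```

A slope near -3.32 cycles per decade corresponds to a doubling of amplicon each cycle (100% amplification efficiency), which is why standard-curve slopes are routinely inspected as a quality check.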

  7. Ground-Water Quality Data in the Southern Sierra Study Unit, 2006 - Results from the California GAMA Program

    USGS Publications Warehouse

    Fram, Miranda S.; Belitz, Kenneth

    2007-01-01

    Ground-water quality in the approximately 1,800 square-mile Southern Sierra study unit (SOSA) was investigated in June 2006 as part of the Statewide Basin Assessment Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Statewide Basin Assessment Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The Southern Sierra study was designed to provide a spatially unbiased assessment of raw ground-water quality within SOSA, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from fifty wells in Kern and Tulare Counties. Thirty-five of the wells were selected using a randomized grid-based method to provide statistical representation of the study area, and fifteen were selected to evaluate changes in water chemistry along ground-water flow paths. The ground-water samples were analyzed for a large number of synthetic organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates, pharmaceutical compounds, and wastewater-indicator compounds], constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), and 1,2,3-trichloropropane (1,2,3-TCP)], naturally occurring inorganic constituents [nutrients, major and minor ions, and trace elements], radioactive constituents, and microbial indicators. Naturally occurring isotopes [tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water] and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks, replicates, and samples for matrix spikes) were collected for approximately one-eighth of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. 
Assessment of the quality-control information resulted in censoring of less than 0.2 percent of the data collected for ground-water samples. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs and pesticides were detected in less than one-third of the grid wells, and all detections in samples from SOSA wells were below health-based thresholds. All detections of trace elements and nutrients in samples from SOSA wells were below health-based thresholds, with the exception of four detections of arsenic that were above the USEPA maximum contaminant level (MCL-US) and one detection of boron that was above the CDPH notification level (NL-CA). All detections of radioactive constituents were below health-based thresholds, although four samples had activities of radon-222 above the proposed MCL-US. Most of the samples from SOSA wells had concentrations of major elements, total dissolved solids, and trace elements below the non-enforceable thresholds set for aesthetic concerns. A few samples contained iron, manganese, or total dissolved solids at concentrations above the SMCL-CA thresholds.
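Comparing raw-water results against health-based and aesthetic thresholds, as done throughout these GAMA reports, can be sketched as below. The threshold table is illustrative only (an arsenic MCL-US of 10 µg/L and an iron SMCL of 300 µg/L are the commonly cited values, but current USEPA/CDPH tables should be consulted):

```python
# Illustrative threshold table, concentrations in micrograms per litre.
THRESHOLDS = {
    "arsenic": {"MCL-US": 10.0},
    "iron": {"SMCL-CA": 300.0},
}

def exceedances(sample, thresholds):
    """List (constituent, threshold_name) pairs that a sample exceeds.
    As in the report, this provides context only; it does not judge
    regulatory compliance, which applies to treated delivered water."""
    hits = []
    for constituent, value in sample.items():
        for name, limit in thresholds.get(constituent, {}).items():
            if value > limit:
                hits.append((constituent, name))
    return hits
```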

  8. Groundwater Quality Data for the Northern Sacramento Valley, 2007: Results from the California GAMA Program

    USGS Publications Warehouse

    Bennett, Peter A.; Bennett, George L.; Belitz, Kenneth

    2009-01-01

    Groundwater quality in the approximately 1,180-square-mile Northern Sacramento Valley study unit (REDSAC) was investigated in October 2007 through January 2008 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within REDSAC and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 66 wells in Shasta and Tehama Counties. Forty-three of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and 23 were selected to aid in evaluation of specific water-quality issues (understanding wells). The groundwater samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (perchlorate and N-nitrosodimethylamine [NDMA]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial constituents. Naturally occurring isotopes (tritium, carbon-14, stable isotopes of nitrogen and oxygen in nitrate, and stable isotopes of hydrogen and oxygen of water) and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, over 275 constituents and field water-quality indicators were investigated. 
Three types of quality-control samples (blanks, replicates, and samples for matrix spikes) were collected at approximately 8 to 11 percent of the wells, and the results for these samples were used to evaluate the quality of the data obtained from the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a noticeable source of bias in the data for the groundwater samples. Differences between replicate samples were within acceptable ranges for nearly all compounds, indicating acceptably low variability. Matrix-spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw groundwater typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared with regulatory and nonregulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and with aesthetic and technical thresholds established by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and do not indicate compliance or noncompliance with those thresholds. The concentrations of most constituents detected in groundwater samples from REDSAC were below drinking-water thresholds. Volatile organic compounds (VOC) and pesticides were detected in less than one-quarter of the samples, generally at concentrations less than one-hundredth of health-based thresholds. NDMA was detected in one grid well at a concentration above the CDPH notification level (NL-CA). 
Concentrations of all nutrients and trace elements in samples from REDSAC wells were below the health-based thresholds except those of arsenic in three samples, which were above the USEPA maximum contaminant level (MCL-US). However

  9. Groundwater Quality Data for the Tahoe-Martis Study Unit, 2007: Results from the California GAMA Program

    USGS Publications Warehouse

    Fram, Miranda S.; Munday, Cathy; Belitz, Kenneth

    2009-01-01

    Groundwater quality in the approximately 460-square-mile Tahoe-Martis study unit was investigated in June through September 2007 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within the Tahoe-Martis study unit (Tahoe-Martis) and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 52 wells in El Dorado, Placer, and Nevada Counties. Forty-one of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and 11 were selected to aid in evaluation of specific water-quality issues (understanding wells). The groundwater samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (perchlorate and N-nitrosodimethylamine [NDMA]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14, strontium isotope ratio, and stable isotopes of hydrogen and oxygen of water), and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, 240 constituents and water-quality indicators were investigated. 
Three types of quality-control samples (blanks, replicates, and samples for matrix spikes) each were collected at 12 percent of the wells, and the results obtained from these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that data for the groundwater samples were not compromised by possible contamination during sample collection, handling, or analysis. Differences between replicate samples were within acceptable ranges. Matrix-spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw water typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared with regulatory and nonregulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH), and with aesthetic and technical thresholds established by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and do not indicate compliance or noncompliance with regulatory thresholds. The concentrations of most constituents detected in groundwater samples from the Tahoe-Martis wells were below drinking-water thresholds. Organic compounds (VOCs and pesticides) were detected in about 40 percent of the samples from grid wells, and most concentrations were less than 1/100th of regulatory and nonregulatory health-based thresholds, although the concentration of perchloroethene in one sample was above the USEPA maximum contaminant level (MCL-US). 
Concentrations of all trace elements and nutrients in samples from grid wells were below regulatory and nonregulatory health-based thresholds, with five exceptions. Concentra

  10. Utilization of Satellite Data to Identify and Monitor Changes in Frequency of Meteorological Events

    NASA Astrophysics Data System (ADS)

    Mast, J. C.; Dessler, A. E.

    2017-12-01

    Increases in temperature and climate variability due to human-induced climate change are increasing the frequency and magnitude of extreme heat events (i.e., heatwaves). This will have a detrimental impact on the health of human populations and the habitability of certain land locations. Here we seek to utilize satellite data records to identify and monitor extreme heat events. We analyze satellite data sets (MODIS and AIRS land surface temperatures (LST) and water vapor profiles (WV)) because of their global coverage and stable calibration. Heat waves are identified based on the frequency of maximum daily temperatures above a threshold, determined as follows. Land surface temperatures are gridded into uniform latitude/longitude bins. Maximum daily temperatures per bin are determined, and probability density functions (PDF) of these maxima are constructed monthly and seasonally. For each bin, a threshold is calculated at the 95th percentile of the PDF of maximum temperatures. For each bin, an extreme heat event is defined based on the frequency of monthly and seasonal days exceeding the threshold. To account for the decreased ability of the human body to thermoregulate with increasing moisture, and to assess the lethality of the heat events, we determine the wet-bulb temperature at the locations of extreme heat events. Preliminary results will be presented.
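The per-bin threshold step described above can be sketched as follows; this uses a plain-Python percentile with linear interpolation, and the function names are illustrative:

```python
def percentile(values, p):
    """p-th percentile (0-100) of a list, with linear interpolation
    between order statistics."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def heat_threshold(daily_max_lst):
    """Per-bin threshold: the 95th percentile of daily maximum LSTs."""
    return percentile(daily_max_lst, 95.0)

def exceedance_days(daily_max_lst, threshold):
    """Count the days in a bin whose maximum LST exceeds the threshold;
    an extreme heat event is defined from the frequency of such days."""
    return sum(1 for t in daily_max_lst if t > threshold)
```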

  11. The Development of the Text Reception Threshold Test: A Visual Analogue of the Speech Reception Threshold Test

    ERIC Educational Resources Information Center

    Zekveld, Adriana A.; George, Erwin L. J.; Kramer, Sophia E.; Goverts, S. Theo; Houtgast, Tammo

    2007-01-01

    Purpose: In this study, the authors aimed to develop a visual analogue of the widely used Speech Reception Threshold (SRT; R. Plomp & A. M. Mimpen, 1979b) test. The Text Reception Threshold (TRT) test, in which visually presented sentences are masked by a bar pattern, enables the quantification of modality-aspecific variance in speech-in-noise…

  12. The warning-sign hierarchy between quantitative subcortical motor mapping and continuous motor evoked potential monitoring during resection of supratentorial brain tumors.

    PubMed

    Seidel, Kathleen; Beck, Jürgen; Stieglitz, Lennart; Schucht, Philippe; Raabe, Andreas

    2013-02-01

    Mapping and monitoring are believed to provide an early warning sign to determine when to stop tumor removal to avoid mechanical damage to the corticospinal tract (CST). The objective of this study was to systematically compare subcortical monopolar stimulation thresholds (1-20 mA) with direct cortical stimulation (DCS)-motor evoked potential (MEP) monitoring signal abnormalities and to correlate both with new postoperative motor deficits. The authors sought to define a mapping threshold and DCS-MEP monitoring signal changes indicating a minimal safe distance from the CST. A consecutive cohort of 100 patients underwent tumor surgery adjacent to the CST while simultaneous subcortical motor mapping and DCS-MEP monitoring was used. Evaluation was done regarding the lowest subcortical mapping threshold (monopolar stimulation, train of 5 stimuli, interstimulus interval 4.0 msec, pulse duration 500 μsec) and signal changes in DCS-MEPs (same parameters, 4 contact strip electrode). Motor function was assessed 1 day after surgery, at discharge, and at 3 months postoperatively. The lowest individual motor thresholds (MTs) were as follows (MT in mA, number of patients): > 20 mA, n = 12; 11-20 mA, n = 13; 6-10 mA, n = 20; 4-5 mA, n = 30; and 1-3 mA, n = 25. Direct cortical stimulation showed stable signals in 70 patients, unspecific changes in 18, irreversible alterations in 8, and irreversible loss in 4 patients. At 3 months, 5 patients had a postoperative new or worsened motor deficit (lowest mapping MT 20 mA, 13 mA, 6 mA, 3 mA, and 1 mA). In all 5 patients DCS-MEP monitoring alterations were documented (2 sudden irreversible threshold increases and 3 sudden irreversible MEP losses). Of these 5 patients, 2 had vascular ischemic lesions (MT 20 mA, 13 mA) and 3 had mechanical CST damage (MT 1 mA, 3 mA, and 6 mA; in the latter 2 cases the resection continued after mapping and severe DCS-MEP alterations occurred thereafter). 
In 80% of patients with a mapping MT of 1-3 mA and in 75% of patients with a mapping MT of 1 mA, DCS-MEPs were stable or showed unspecific reversible changes, and none had a permanent motor worsening at 3 months. In contrast, 25% of patients with irreversible DCS-MEP changes and 75% of patients with irreversible DCS-MEP loss had permanent motor deficits. Mapping should primarily guide tumor resection adjacent to the CST. DCS-MEP is a useful predictor of deficits, but its value as a warning sign is limited because signal alterations were reversible in only approximately 60% of the present cases and irreversibility is a post hoc definition. The true safe mapping MT is lower than previously thought. The authors postulate a mapping MT of 1 mA or less where irreversible DCS-MEP changes and motor deficits regularly occur. Therefore, they recommend stopping tumor resection at an MT of 2 mA at the latest. The limited spatial and temporal coverage of contemporary mapping may increase error and may contribute to false, higher MTs.

  13. Threshold Assessment of Gear Diagnostic Tools on Flight and Test Rig Data

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Mosher, Marianne; Huff, Edward M.

    2003-01-01

    A method for defining thresholds for vibration-based algorithms that provides the minimum number of false alarms while maintaining sensitivity to gear damage was developed. This analysis focused on two vibration-based gear-damage detection algorithms, FM4 and MSA. The method was developed using vibration data collected during surface fatigue tests performed in a spur gearbox rig. The thresholds were defined based on damage progression during tests with damage. The thresholds' false-alarm rates were then evaluated on spur gear tests without damage. Next, the same thresholds were applied to flight data from an OH-58 helicopter transmission. Results showed that thresholds defined in test rigs can be used to define thresholds in flight that correctly classify the transmission operation as normal.
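FM4, one of the two indicators thresholded here, is conventionally computed as the normalized kurtosis of the gear's difference signal. A minimal sketch follows; the construction of the difference signal itself (removal of shaft and mesh harmonics from the time-synchronous average) is assumed to have been done upstream:

```python
def fm4(difference_signal):
    """FM4 condition indicator: the fourth central moment of the
    difference signal normalized by the squared variance (kurtosis).
    It sits near 3 for a healthy, roughly Gaussian residual and rises
    with localized tooth damage."""
    n = len(difference_signal)
    mean = sum(difference_signal) / n
    m2 = sum((d - mean) ** 2 for d in difference_signal) / n
    m4 = sum((d - mean) ** 4 for d in difference_signal) / n
    return m4 / (m2 ** 2)

def exceeds(indicator_value, threshold):
    """Threshold check of the kind tuned in this study to minimize
    false alarms while staying sensitive to damage."""
    return indicator_value > threshold
```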

  14. Monitoring the development of volcanic eruptions through volcanic lightning - Using a lightning mapping array, seismic and infrasound array, and visual plume analysis

    NASA Astrophysics Data System (ADS)

    Smith, C. M.; Thompson, G.; McNutt, S. R.; Behnke, S. A.; Edens, H. E.; Van Eaton, A. R.; Gaudin, D.; Thomas, R. J.

    2017-12-01

    The period of 28 May - 7 June 2015 at Sakurajima Volcano, Japan witnessed a multitude of Vulcanian eruptive events, which resulted in plumes reaching 500-3000 m above the vent. These plumes varied from white, gas-rich plumes to dark grey and black ash-rich plumes, and were recorded on lowlight and infrared cameras. A nine-station lightning mapping array (LMA) was deployed to locate sources of VHF (67-73 MHz) radiation produced by lightning flashes and other types of electrical activity such as 'continuous RF (radio frequency)'. Two Nanometrics Trillium broadband seismometers and six BSU infrasound sensors were deployed. Over this ten-day period we recorded 1556 events that consisted of both seismic and infrasound signals, indicating explosive activity. An additional 1222 events were recorded as only seismic or only infrasound signals, which may be a result of precursory seismic signals or noise contamination. Plume discharge types included both distinct lightning flashes and 'continuous RF'. The LMA ran continuously for the duration of the experiment. On 30 May 2015 at least seven lightning flashes were also detected by the Vaisala Global Lightning Detection 360 network, which detects VLF (3-30 kHz) radiation. However, the University of Washington's World Wide Lightning Location Network, which also detects VLF radiation, detected no volcanic lightning flashes in this time period. This indicates that the electrical activity in Sakurajima's plume occurs near the lower limits of the VLF detection threshold. We investigate relationships between the plume dynamics, the geophysical signals and the corresponding electrical activity through: plume velocity and height; event waveform cross-correlation; volcano acoustic-seismic ratios; overall geophysical energy; RSAM records; and VHF sources detected by the LMA. By investigating these relationships we hope to determine the seismic/infrasound energy threshold required to generate measurable electrical activity. 
Seismic and infrasound measurements are two of the most common volcano monitoring methods. By developing the relationships between plume electrification and these geophysical signals, we hope to expand the use of lightning for active volcano monitoring.
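The volcano acoustic-seismic ratio (VASR) mentioned in the abstract compares the energy carried by infrasound with that carried by the seismic record. A minimal sketch, using summed squared amplitudes as an energy proxy on hypothetical synthetic records (real VASR estimates include density and impedance terms omitted here):

```python
import math

def signal_energy(samples, dt):
    """Energy proxy: integral of the squared signal (amplitude^2 * s)."""
    return sum(s * s for s in samples) * dt

def vasr(infrasound, seismic, dt):
    """Volcano acoustic-seismic ratio: acoustic energy over seismic energy.

    Simplified proxy; full VASR estimates convert pressure and ground
    velocity records to radiated energies with medium-property terms.
    """
    ea = signal_energy(infrasound, dt)
    es = signal_energy(seismic, dt)
    return ea / es if es > 0 else float("inf")

# Hypothetical 1-s records sampled at 100 Hz
dt = 0.01
infra = [math.sin(2 * math.pi * 5 * i * dt) for i in range(100)]       # 5 Hz tone
seis = [0.5 * math.sin(2 * math.pi * 5 * i * dt) for i in range(100)]  # half amplitude

print(round(vasr(infra, seis, dt), 2))  # amplitude ratio 2 -> energy ratio 4.0
```

A VASR well above one would indicate an event that couples preferentially into the atmosphere rather than the ground.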

  15. Novel Insights in the Fecal Egg Count Reduction Test for Monitoring Drug Efficacy against Soil-Transmitted Helminths in Large-Scale Treatment Programs

    PubMed Central

    Levecke, Bruno; Speybroeck, Niko; Dobson, Robert J.; Vercruysse, Jozef; Charlier, Johannes

    2011-01-01

    Background The fecal egg count reduction test (FECRT) is recommended to monitor drug efficacy against soil-transmitted helminths (STHs) in public health. However, the impact of factors inherent to study design (sample size and detection limit of the fecal egg count (FEC) method) and host-parasite interactions (mean baseline FEC and aggregation of FEC across host population) on the reliability of FECRT is poorly understood. Methodology/Principal Findings A simulation study was performed in which FECRT was assessed under varying conditions of the aforementioned factors. Classification trees were built to explore critical values for these factors required to obtain conclusive FECRT results. The outcome of this analysis was subsequently validated on five efficacy trials across Africa, Asia, and Latin America. Unsatisfactory (<85.0%) sensitivity and specificity results to detect reduced efficacy were found if sample sizes were small (<10) or if sample sizes were moderate (10–49) combined with highly aggregated FEC (k<0.25). FECRT remained inconclusive under any evaluated condition for drug efficacies ranging from 87.5% to 92.5% for a reduced-efficacy-threshold of 90% and from 92.5% to 97.5% for a threshold of 95%. The most discriminatory study design required 200 subjects independent of STH status (including subjects who are not excreting eggs). For this sample size, the detection limit of the FEC method and the level of aggregation of the FEC did not affect the interpretation of the FECRT. Only for a threshold of 90%, mean baseline FEC <150 eggs per gram of stool led to a reduced discriminatory power. Conclusions/Significance This study confirms that the interpretation of FECRT is affected by a complex interplay of factors inherent to both study design and host-parasite interactions. The results also highlight that revision of the current World Health Organization guidelines to monitor drug efficacy is indicated. 
We, therefore, propose novel guidelines to support future monitoring programs. PMID:22180801
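The FECRT outcome at the center of this study is the percentage reduction in group-mean fecal egg counts. A minimal sketch with hypothetical eggs-per-gram counts (this is the standard group-mean formula; the paper's simulations layer sample size, detection limit, and aggregation effects on top of it):

```python
def fecrt(pre_counts, post_counts):
    """Fecal egg count reduction (%), based on arithmetic group means:
    FECR = 100 * (1 - mean(post) / mean(pre))."""
    pre_mean = sum(pre_counts) / len(pre_counts)
    post_mean = sum(post_counts) / len(post_counts)
    if pre_mean == 0:
        raise ValueError("mean baseline FEC is zero")
    return 100.0 * (1.0 - post_mean / pre_mean)

# Hypothetical eggs-per-gram counts for 5 subjects before/after treatment
pre = [480, 120, 960, 240, 600]    # mean 480
post = [24, 0, 96, 12, 48]         # mean 36
print(round(fecrt(pre, post), 1))  # 92.5
```

A result of 92.5% falls exactly in the zone the simulations flag as inconclusive for a 95% reduced-efficacy threshold.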

  16. The Global Detection Capability of the IMS Seismic Network in 2013 Inferred from Ambient Seismic Noise Measurements

    NASA Astrophysics Data System (ADS)

    Gaebler, P. J.; Ceranna, L.

    2016-12-01

    All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosions at any time, by anyone and anywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit of a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold can be observed between the northern and southern hemispheres. Including the seismic stations of the auxiliary IMS network results in a slight improvement of the global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability; in particular, it improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and the percentage of operational IMS stations.
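The threshold-monitoring idea can be sketched as follows: each station's ambient noise amplitude is converted to the largest body-wave magnitude that could remain hidden in that noise, mb = log10(A/T) + Q(delta, h), and the network threshold follows from requiring detection at several stations. The noise amplitudes and Q corrections below are hypothetical, and the three-station detection rule is an assumption:

```python
import math

def station_threshold_mb(noise_amp_nm, period_s, q_correction):
    """Upper mb limit hidden in noise at one station: mb = log10(A/T) + Q."""
    return math.log10(noise_amp_nm / period_s) + q_correction

def network_threshold(station_thresholds, n_required=3):
    """Network-wide threshold: the n-th smallest station threshold, assuming
    an event must exceed noise at n_required stations to be declared."""
    return sorted(station_thresholds)[n_required - 1]

# Hypothetical stations: (noise amplitude in nm, period in s, Q correction)
stations = [(1.0, 1.0, 3.5), (2.0, 1.0, 3.3), (0.5, 1.0, 3.9), (4.0, 1.0, 3.2)]
thresholds = [station_threshold_mb(a, t, q) for a, t, q in stations]
print(round(network_threshold(thresholds, n_required=3), 2))  # 3.6
```

Raising the noise level at the quietest stations, or removing them from operation, pushes the network threshold up in exactly the way the study quantifies.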

  17. Research on snow cover monitoring of Northeast China using Fengyun Geostationary Satellite

    NASA Astrophysics Data System (ADS)

    Wu, Tong; Gu, Lingjia; Ren, Ruizhi; Zhou, Tingting

    2017-09-01

    Snow cover information has great significance for monitoring and preventing snowstorms. With the development of satellite technology, geostationary satellites are playing more important roles in snow monitoring. Currently, cloud interference is a serious problem for obtaining accurate snow cover information. Therefore, cloud pixels in the MODIS snow products are usually replaced with cloud-free pixels from adjacent days, which ignores snow cover dynamics. FengYun-2 (FY-2) is China's first generation of geostationary meteorological satellites, which complements the polar-orbiting satellites. The snow cover monitoring of Northeast China using FY-2G data in January and February 2016 is introduced in this paper. First, geometric and radiometric corrections are carried out for the visible and infrared channels. Second, snow cover information is extracted according to its characteristics in the different channels. Multi-threshold judgment methods for the different land types and similarity separation techniques are combined to discriminate snow from cloud. Furthermore, multi-temporal data are used to eliminate cloud effects. Finally, the experimental results are compared with the MOD10A1 and MYD10A1 (MODIS daily snow cover) products. The MODIS products can provide higher-resolution snow cover information in cloudless conditions. Multi-temporal FY-2G data can provide more accurate snow cover information in cloudy conditions, which is beneficial for monitoring snowstorms and climate changes.
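The multi-threshold snow/cloud discrimination step can be illustrated with a toy classifier. The channel thresholds and land types below are purely illustrative assumptions (the actual FY-2G thresholds are not given in the abstract); the idea is that snow is bright in the visible, while cloud tops are typically colder in the thermal infrared:

```python
def classify_pixel(vis_reflectance, ir_brightness_temp_k, land_type):
    """Toy multi-threshold snow/cloud classifier (illustrative values only).

    Different land types get different visible-reflectance cutoffs; bright
    pixels are then split into snow vs cloud by IR brightness temperature.
    """
    vis_cut = {"forest": 0.25, "farmland": 0.35, "grassland": 0.30}[land_type]
    if vis_reflectance < vis_cut:
        return "land"
    # Bright pixel: very cold tops are treated as cloud, warmer ones as snow
    return "cloud" if ir_brightness_temp_k < 245.0 else "snow"

print(classify_pixel(0.5, 260.0, "farmland"))  # snow
print(classify_pixel(0.5, 235.0, "farmland"))  # cloud
print(classify_pixel(0.1, 270.0, "forest"))    # land
```

A multi-temporal pass, as in the paper, would then overwrite "cloud" pixels with recent clear-sky classifications.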

  18. 24 CFR 570.405 - The insular areas.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) Threshold requirements. HUD shall review each grantee's progress on outstanding grants made under this..., HUD will consider program achievements and the applicant's effectiveness in using program funds. Effectiveness in using program funds shall be measured by reviewing audit, monitoring and performance reports...

  19. Depollution benchmarks for capacitors, batteries and printed wiring boards from waste electrical and electronic equipment (WEEE).

    PubMed

    Savi, Daniel; Kasser, Ueli; Ott, Thomas

    2013-12-01

    The article compiles and analyses sample data for toxic components removed from waste electrical and electronic equipment (WEEE) from more than 30 recycling companies in Switzerland over the past ten years. According to European and Swiss legislation, toxic components like batteries, capacitors and printed wiring boards have to be removed from WEEE. The control bodies of the Swiss take-back schemes have been monitoring the activities of WEEE recyclers in Switzerland for about 15 years. All recyclers have to provide annual mass balance data for every year of operation. From these data, percentage shares of removed batteries and capacitors are calculated in relation to the amount of each respective WEEE category treated. A rationale is developed for why such an indicator should not be calculated for printed wiring boards. The distributions of these depollution indicators are analysed and their suitability for defining lower threshold values and benchmarks for the depollution of WEEE is discussed. Recommendations for benchmarks and threshold values for the removal of capacitors and batteries are given. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Liquid impact and fracture of free-standing CVD diamond

    NASA Astrophysics Data System (ADS)

    Kennedy, Claire F.; Telling, Robert H.; Field, John E.

    1999-07-01

    The Cavendish Laboratory has developed extensive facilities for studies of liquid and solid particle erosion. This paper describes the high-speed liquid impact erosion of thin CVD diamond discs and the variation with grain size of the absolute damage threshold velocity (ADTV), viz., the threshold below which the specimen shows no damage. All specimens fail by rear surface cracking and there is shown to be a shallow dependence of rear surface ADTV on grain size. Fracture propagation in CVD diamond has also been monitored using a specially designed double-torsion apparatus and data for KIC are presented. Tentatively, the results suggest that finer-grained CVD diamond exhibits a higher fracture toughness, although the differences are slight even over a fourfold variation in the mean grain size. No preference for intergranular fracture was observed and one may conclude from this that the grain boundaries themselves do not seriously weaken the material. The large pre-existing flaws, both within and between grains, whose size varies with the grain size, are believed to be the dominant source of weakness.

  1. Tuning Piezo ion channels to detect molecular-scale movements relevant for fine touch

    PubMed Central

    Poole, Kate; Herget, Regina; Lapatsina, Liudmila; Ngo, Ha-Duong; Lewin, Gary R.

    2014-01-01

    In sensory neurons, mechanotransduction is sensitive, fast and requires mechanosensitive ion channels. Here we develop a new method to directly monitor mechanotransduction at defined regions of the cell-substrate interface. We show that molecular-scale (~13 nm) displacements are sufficient to gate mechanosensitive currents in mouse touch receptors. Using neurons from knockout mice, we show that displacement thresholds increase by one order of magnitude in the absence of stomatin-like protein 3 (STOML3). Piezo1 is the founding member of a class of mammalian stretch-activated ion channels, and we show that STOML3, but not other stomatin-domain proteins, brings the activation threshold for Piezo1 and Piezo2 currents down to ~10 nm. Structure–function experiments localize the Piezo modulatory activity of STOML3 to the stomatin domain, and higher-order scaffolds are a prerequisite for function. STOML3 is the first potent modulator of Piezo channels that tunes the sensitivity of mechanically gated channels to detect molecular-scale stimuli relevant for fine touch. PMID:24662763

  2. Detecting Intra-Fraction Motion in Patients Undergoing Radiation Treatment Using a Low-Cost Wireless Accelerometer

    PubMed Central

    Farahmand, Farid; Khadivi, Kevin O.; Rodrigues, Joel J. P. C.

    2009-01-01

    The utility of a novel, high-precision, non-intrusive, wireless, accelerometer-based patient orientation monitoring system (APOMS) in determining orientation change in patients undergoing radiation treatment is reported here. Using this system a small wireless accelerometer sensor is placed on a patient’s skin, broadcasting its orientation to the receiving station connected to a PC in the control area. A threshold-based algorithm is developed to identify the exact amount of the patient’s head orientation change. Through real-time measurements, an audible alarm can alert the radiation therapist if the user-defined orientation threshold is violated. Our results indicate that, in spite of its low cost and simplicity, the APOMS is highly sensitive and offers accurate measurements. Furthermore, the APOMS is patient friendly, vendor neutral, and requires minimal user training. The versatile architecture of the APOMS makes it potentially suitable for a variety of applications, including the study of correlation between external and internal markers during Image-Guided Radiation Therapy (IGRT), with no major changes in hardware setup or algorithm. PMID:22423196
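The threshold-based alarm logic of the APOMS can be sketched in a few lines. This is a simplified stand-in (the real system filters the accelerometer stream and tracks full orientation; the baseline, tolerance, and sample values below are hypothetical):

```python
def orientation_alarm(angles_deg, baseline_deg, threshold_deg):
    """Flag samples whose deviation from the setup baseline exceeds the
    user-defined orientation threshold (simplified APOMS-style check)."""
    return [abs(a - baseline_deg) > threshold_deg for a in angles_deg]

# Hypothetical head-pitch samples (degrees) with a 2-degree tolerance
stream = [0.1, 0.4, -0.3, 2.6, 3.1, 0.2]
flags = orientation_alarm(stream, baseline_deg=0.0, threshold_deg=2.0)
print(flags)  # [False, False, False, True, True, False]
```

Any True entry would trigger the audible alarm so the therapist can re-position the patient.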

  3. Variability of space climate and its extremes with successive solar cycles

    NASA Astrophysics Data System (ADS)

    Chapman, Sandra; Hush, Phillip; Tindale, Elisabeth; Dunlop, Malcolm; Watkins, Nicholas

    2016-04-01

    Auroral geomagnetic indices coupled with in situ solar wind monitors provide a comprehensive data set, spanning several solar cycles. Space climate can be considered as the distribution of space weather. We can then characterize these observations in terms of changing space climate by quantifying how the statistical properties of ensembles of these observed variables vary between different phases of the solar cycle. We first consider the AE index burst distribution. Bursts are constructed by thresholding the AE time series; the size of a burst is the sum of the excess in the time series for each time interval over which the threshold is exceeded. The distribution of burst sizes is two-component, with a crossover in behaviour at thresholds ≈ 1000 nT. Above this threshold, we find [1] a range over which the mean burst size is almost constant with threshold for both solar maxima and minima. The burst size distribution of the largest events has a functional form which is exponential. The relative likelihood of these large events varies from one solar maximum and minimum to the next. If the relative overall activity of a solar maximum/minimum can be estimated, these results then constrain the likelihood of extreme events of a given size for that solar maximum/minimum. We next develop and apply a methodology to quantify how the full distribution of geomagnetic indices and upstream solar wind observables changes between and across different solar cycles. This methodology [2] estimates how different quantiles of the distribution, or equivalently, how the return times of events of a given size, are changing. [1] Hush, P., S. C. Chapman, M. W. Dunlop, and N. W. Watkins (2015), Robust statistical properties of the size of large burst events in AE, Geophys. Res. Lett., 42, doi:10.1002/2015GL066277. [2] Chapman, S. C., D. A. Stainforth, and N. W. Watkins (2013), On estimating long term local climate trends, Phil. Trans. R. Soc. A, 371, 20120287, doi:10.1098/rsta.2012.0287.
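The burst construction described above (threshold the AE time series, then sum the excess over each maximal run of samples above the threshold) can be sketched directly. The toy trace is hypothetical; the 1000 nT threshold follows the abstract's crossover value:

```python
def burst_sizes(series, threshold):
    """Sizes of bursts above a threshold: each burst is a maximal run of
    samples exceeding the threshold, and its size is the summed excess."""
    sizes, current, in_burst = [], 0.0, False
    for x in series:
        if x > threshold:
            current += x - threshold
            in_burst = True
        elif in_burst:
            sizes.append(current)
            current, in_burst = 0.0, False
    if in_burst:  # trace ended mid-burst
        sizes.append(current)
    return sizes

# Toy AE-index trace (nT) with the 1000 nT crossover threshold
ae = [800, 1200, 1500, 900, 1100, 1050, 700]
print(burst_sizes(ae, 1000.0))  # [700.0, 150.0]
```

The empirical distribution of these sizes, accumulated over a solar maximum or minimum, is the object whose tail behaviour the study characterizes.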

  4. Orion MPCV Touchdown Detection Threshold Development and Testing

    NASA Technical Reports Server (NTRS)

    Daum, Jared; Gay, Robert

    2013-01-01

    A robust method of detecting Orion Multi-Purpose Crew Vehicle (MPCV) splashdown is necessary to ensure crew and hardware safety during descent and after touchdown. The proposed method uses a triple redundant system to inhibit Reaction Control System (RCS) thruster firings, detach parachute risers from the vehicle, and transition to the post-landing segment of the Flight Software (FSW). An in-depth trade study was completed to determine optimal characteristics of the touchdown detection method, resulting in an algorithm that monitors filtered, lever-arm-corrected, 200 Hz Inertial Measurement Unit (IMU) vehicle acceleration magnitude data against a tunable threshold using persistence counter logic. Following the design of the algorithm, high fidelity environment and vehicle simulations, coupled with the actual vehicle FSW, were used to tune the acceleration threshold and persistence counter value to achieve adequate performance in detecting touchdown and sufficient safety margin against early detection while descending under parachutes. An analytical approach including Kriging and adaptive sampling allowed a sufficient number of finite element analysis (FEA) impact simulations to be completed using minimal computation time. The combination of a persistence counter of 10 and an acceleration threshold of approximately 57.3 ft/s^2 resulted in an impact performance factor of safety (FOS) of 1.0 and a safety FOS of approximately 2.6 for touchdown declaration. An RCS termination acceleration threshold of approximately 53.1 ft/s^2 with a persistence counter of 10 resulted in an increased impact performance FOS of 1.2 at the expense of a lowered under-parachutes safety factor of 2.2. The resulting tuned algorithm was then tested on data from eight Capsule Parachute Assembly System (CPAS) flight tests, showing an experimental minimum safety FOS of 6.1.
The formulated touchdown detection algorithm will be flown on the Orion MPCV FSW during the Exploration Flight Test 1 (EFT-1) mission in the second half of 2014.
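The persistence-counter logic can be sketched as below: touchdown is declared only after the acceleration magnitude has exceeded the threshold for a fixed number of consecutive samples, which guards against transient spikes. The sample stream is hypothetical; the 57.3 ft/s^2 threshold and counter of 10 follow the abstract:

```python
def touchdown_detected(accel_mag, threshold, persistence):
    """Declare touchdown once the (filtered) acceleration magnitude has
    exceeded `threshold` for `persistence` consecutive samples; a single
    sub-threshold sample resets the counter."""
    count = 0
    for i, a in enumerate(accel_mag):
        count = count + 1 if a >= threshold else 0
        if count >= persistence:
            return i  # sample index at which touchdown is declared
    return None

# Hypothetical 200 Hz |accel| samples (ft/s^2) around splashdown:
# a 9-sample spike (too short), a dip, then a sustained exceedance
samples = [33.0] * 5 + [60.0] * 9 + [40.0] + [60.0] * 10
print(touchdown_detected(samples, threshold=57.3, persistence=10))  # 24
```

Note how the 9-sample spike is rejected; only the second, 10-sample run satisfies the counter.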

  5. Modeling applications for precision agriculture in the California Central Valley

    NASA Astrophysics Data System (ADS)

    Marklein, A. R.; Riley, W. J.; Grant, R. F.; Mezbahuddin, S.; Mekonnen, Z. A.; Liu, Y.; Ying, S.

    2017-12-01

    Drought in California has increased the motivation to develop precision agriculture, which uses observations to make site-specific management decisions throughout the growing season. In agricultural systems that are prone to drought, these efforts often focus on irrigation efficiency. Recent improvements in soil sensor technology allow the monitoring of plant and soil status in real time, which can then inform models aimed at improving irrigation management. But even on farms with the resources to deploy soil sensors across the landscape, leveraging that sensor data to design an efficient irrigation scheme remains a challenge. We conduct a modeling experiment aimed at simulating precision agriculture to address several questions: (1) how, when, and where does irrigation lead to optimal yield? and (2) what are the impacts of different precision irrigation schemes on yields, soil organic carbon (SOC), and total water use? We use the ecosys model to simulate precision agriculture in a conventional tomato-corn rotation in the California Central Valley with varying soil water content thresholds for irrigation and soil water sensor depths. This model is ideal for our question because it includes explicit process-based functions for plant growth, plant water use, soil hydrology, and SOC, and has been tested extensively in agricultural ecosystems. A low irrigation threshold allows the soil to become drier before irrigation than a high threshold; accordingly, we found that the high irrigation thresholds use more irrigation water over the course of the season, produce higher yields, and have lower water use efficiency. The irrigation threshold did not affect SOC. Yields and water use were highest at sensor depths of 0.5 to 0.15 m, but water use efficiency was also lowest at these depths. We found SOC to be significantly affected by sensor depth, with the highest SOC at the shallowest sensor depths.
These results will help regulate irrigation water while maintaining yield in California, especially with uncertain precipitation regimes.
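The effect of the irrigation threshold can be illustrated with a single-bucket soil water model. This is a toy stand-in for the ecosys simulations (the capacity, refill target, and evapotranspiration series below are hypothetical assumptions):

```python
def simulate_irrigation(et_demand, rain, threshold, capacity=100.0, refill=100.0):
    """Daily bucket model of soil water with threshold-triggered irrigation.

    When soil water (mm) drops below `threshold`, irrigate back up to
    `refill`. Returns total irrigation applied over the season.
    """
    water, irrigation = capacity, 0.0
    for et, r in zip(et_demand, rain):
        water = min(capacity, water + r) - et  # rain in, ET out
        if water < threshold:
            irrigation += refill - water
            water = refill
    return irrigation

# Hypothetical 10-day dry spell: 6 mm/day ET, no rain
et = [6.0] * 10
rain = [0.0] * 10
low = simulate_irrigation(et, rain, threshold=40.0)
high = simulate_irrigation(et, rain, threshold=70.0)
print(low < high)  # a higher trigger threshold uses more water: True
```

Even this crude model reproduces the qualitative result: raising the trigger threshold increases total water applied.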

  6. 24 CFR 954.104 - Performance thresholds.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Performance thresholds. 954.104... DEVELOPMENT INDIAN HOME PROGRAM Applying for Assistance § 954.104 Performance thresholds. Applicants must have... HOME program must have performed adequately. In cases of previously documented deficient performance...

  7. 24 CFR 954.104 - Performance thresholds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Performance thresholds. 954.104... DEVELOPMENT INDIAN HOME PROGRAM Applying for Assistance § 954.104 Performance thresholds. Applicants must have... HOME program must have performed adequately. In cases of previously documented deficient performance...

  8. A new qualitative acoustic emission parameter based on Shannon's entropy for damage monitoring

    NASA Astrophysics Data System (ADS)

    Chai, Mengyu; Zhang, Zaoxiao; Duan, Quan

    2018-02-01

    An important objective of acoustic emission (AE) non-destructive monitoring is to accurately identify approaching critical damage and to avoid premature failure by means of the evolution of AE parameters. One major drawback of most parameters, such as count and rise time, is that they are strongly dependent on the threshold and other settings employed in the AE data acquisition system. This may hinder the correct reflection of the original waveform generated from AE sources and consequently make accurate identification of critical damage and early failure difficult. In this investigation, a new qualitative AE parameter based on Shannon's entropy, i.e. the AE entropy, is proposed for damage monitoring. Since it derives from the uncertainty of the amplitude distribution of each AE waveform, it is independent of the threshold and other time-driven parameters and can characterize the original micro-structural deformations. A fatigue crack growth test on CrMoV steel and a three-point bending test on a ductile material are conducted to validate the feasibility and effectiveness of the proposed parameter. The results show that the new parameter, compared to AE amplitude, is more effective in discriminating the different damage stages and identifying the critical damage.
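The proposed parameter is the Shannon entropy of a waveform's amplitude distribution. A minimal sketch (the bin count is an assumption; the abstract does not specify the discretization):

```python
import math

def ae_entropy(waveform, n_bins=16):
    """Shannon entropy (bits) of a waveform's amplitude histogram.

    Depends only on how amplitudes are distributed, not on acquisition
    settings such as the hit-detection threshold.
    """
    lo, hi = min(waveform), max(waveform)
    width = (hi - lo) / n_bins or 1.0  # guard against a constant waveform
    counts = [0] * n_bins
    for x in waveform:
        counts[min(int((x - lo) / width), n_bins - 1)] += 1
    n = len(waveform)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

flat = [i / 100 for i in range(100)]         # spread-out amplitudes
spiky = [0.0] * 95 + [1.0] * 5               # concentrated amplitudes
print(ae_entropy(flat) > ae_entropy(spiky))  # True: more uncertainty
```

Higher entropy corresponds to a richer amplitude distribution; the paper tracks how this value evolves as damage accumulates.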

  9. Acoustic Reflexes in Normal-Hearing Adults, Typically Developing Children, and Children with Suspected Auditory Processing Disorder: Thresholds, Real-Ear Corrections, and the Role of Static Compliance on Estimates.

    PubMed

    Saxena, Udit; Allan, Chris; Allen, Prudence

    2017-06-01

    Previous studies have suggested elevated reflex thresholds in children with auditory processing disorders (APDs). However, some aspects of the child's ear, such as ear canal volume and static compliance of the middle ear, could affect the measurement of reflex thresholds and thus impact its interpretation. Sound levels used to elicit reflexes in a child's ear may be higher than predicted by calibration in a standard 2-cc coupler, and lower static compliance could make visualization of the very small changes in impedance at threshold difficult. It is therefore important to evaluate threshold data with consideration of the differences between children and adults. A set of studies was conducted. The first compared reflex thresholds obtained using standard clinical procedures in children with suspected APD to those of typically developing children and adults, to test the replicability of previous studies. The second study examined the impact of ear canal volume on estimates of reflex thresholds by applying real-ear corrections. Lastly, the relationship between static compliance and reflex threshold estimates was explored. The research is a set of case-control studies with a repeated measures design. The first study included data from 20 normal-hearing adults, 28 typically developing children, and 66 children suspected of having an APD. The second study included 28 normal-hearing adults and 30 typically developing children. In the first study, crossed and uncrossed reflex thresholds were measured with a 5-dB step size. Reflex thresholds were analyzed using repeated measures analysis of variance (RM-ANOVA). In the second study, uncrossed reflex thresholds, real-ear corrections, ear canal volume, and static compliance were measured. Reflex thresholds were measured using a 1-dB step size. The effect of real-ear correction and static compliance on reflex threshold was examined using RM-ANOVA and the Pearson correlation coefficient, respectively.
Study 1 replicated previous studies showing elevated reflex thresholds in many children with suspected APD when compared to data from adults using standard clinical procedures, especially in the crossed condition. The thresholds measured in children with suspected APD tended to be higher than those measured in the typically developing children. There were no significant differences between the typically developing children and adults. However, when real-ear calibrated stimulus levels were used, it was found that children's thresholds were elicited at higher levels than in the adults. A significant relationship between reflex thresholds and static compliance was found in the adult data, showing a trend for higher thresholds in ears with lower static compliance, but no such relationship was found in the data from the children. This study suggests that reflex measures in children should be adjusted for real-ear-to-coupler differences before interpretation. The data in children with suspected APD support previous studies suggesting abnormalities in reflex thresholds. The absence in children of the correlation between threshold and static compliance that was observed in the adults may suggest a nonmechanical explanation for the age-related and clinically related effects. American Academy of Audiology

  10. Feeling the force: how pollen tubes deal with obstacles.

    PubMed

    Burri, Jan T; Vogler, Hannes; Läubli, Nino F; Hu, Chengzhi; Grossniklaus, Ueli; Nelson, Bradley J

    2018-06-15

    Physical forces are involved in the regulation of plant development and morphogenesis by translating mechanical stress into the modification of physiological processes, which, in turn, can affect cellular growth. Pollen tubes respond rapidly to external stimuli and provide an ideal system to study the effect of mechanical cues at the single-cell level. Here, pollen tubes were exposed to mechanical stress while monitoring the reconfiguration of their growth and recording the generated forces in real-time. We combined a lab-on-a-chip device with a microelectromechanical systems (MEMS)-based capacitive force sensor to mimic and quantify the forces that are involved in pollen tube navigation upon confronting mechanical obstacles. Several stages of obstacle avoidance were identified, including force perception, growth adjustment and penetration. We have experimentally determined the perceptive force threshold, which is the force threshold at which the pollen tube reacts to an obstacle, for Lilium longiflorum and Arabidopsis thaliana. In addition, the method we developed provides a way to calculate turgor pressure based on force and optical data. Pollen tubes sense physical barriers and actively adjust their growth behavior to overcome them. Furthermore, our system offers an ideal platform to investigate intracellular activity during force perception and growth adaption in tip growing cells. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  11. Exhaustive Thresholds and Resistance Checkpoints

    NASA Technical Reports Server (NTRS)

    Easton, Charles; Khuzadi, Mbuyi

    2008-01-01

    Once deployed, all intricate systems that operate for a long time (such as an airplane or a chemical processing plant) experience degraded performance over their operational lifetime. This can result from losses of integrity in subsystems and parts that generally do not materially impact the operation of the vehicle (e.g., the light behind the button that opens the sliding door of a minivan). Or it can result from loss of more critical parts or subsystems. Such losses need to be handled quickly in order to avoid loss of personnel, the mission, or part of the system itself. In order to manage degraded systems, knowledge of their potential problem areas and the means by which these problems are detected should be developed during the initial development of the system. Once determined, a web of sensors is employed and their outputs are monitored along with other system parameters while the system is in preparation or operation. Gathering the data, however, is only part of the story. The interpretation of the data and the response of the system must be carefully developed as well to avoid a mishap. Typically, systems use a test-threshold-response paradigm to process potential system faults. However, such processing subsystems can suffer from errors and oversights of a consistent type, causing aberrant system behavior instead of the expected fault-handling and recovery operations. In our study, we developed a complete checklist for determining the completeness of a fault system and its robustness to common processing and response difficulties.
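The test-threshold-response paradigm mentioned above can be sketched as a simple monitoring loop; the sensor names, limits, and responses below are purely illustrative:

```python
def evaluate_faults(readings, limits):
    """Toy test-threshold-response loop: test each sensor reading against
    its limits and map any violation to a predefined response."""
    responses = []
    for sensor, value in readings.items():
        low, high, action = limits[sensor]
        if not (low <= value <= high):
            responses.append((sensor, action))
    return responses

# Illustrative limits: (low, high, response on violation)
limits = {"cabin_pressure": (95.0, 105.0, "switch to backup regulator"),
          "pump_temp": (0.0, 80.0, "shut down pump")}
readings = {"cabin_pressure": 101.3, "pump_temp": 91.5}
print(evaluate_faults(readings, limits))  # [('pump_temp', 'shut down pump')]
```

The checklist the study develops targets exactly the failure modes of such loops: missing tests, wrong thresholds, and responses that themselves misbehave.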

  12. A survey of topsoil arsenic and mercury concentrations across France.

    PubMed

    Marchant, B P; Saby, N P A; Arrouays, D

    2017-08-01

    Even at low concentrations, the presence of arsenic and mercury in soils can lead to ecological and health impacts. The recent European-wide LUCAS Topsoil Survey found that the arsenic concentration of a large proportion of French soils exceeded a threshold which indicated that further investigation was required. A much smaller proportion of soils exceeded the corresponding threshold for mercury but the impacts of mining and industrial activities on mercury concentrations are not well understood. We use samples from the French national soil monitoring network (RMQS: Réseau de Mesures de la Qualité des Sols) to explore the variation of topsoil arsenic and mercury concentrations across mainland France at a finer spatial resolution than was reported by LUCAS Topsoil. We use geostatistical methods to map the expected concentrations of these elements in the topsoil and the probabilities that the legislative thresholds are exceeded. We find that, with the exception of some areas where the geogenic concentrations and soil adsorption capacities are very low, arsenic concentrations are generally larger than the threshold which indicates that further assessment of the area is required. The lower of two other guideline values indicating risks to ecology or health is exceeded in fewer than 5% of RMQS samples. These exceedances occur in localised hot-spots primarily associated with mining and mineralization. The probabilities of mercury concentrations exceeding the further assessment threshold value are everywhere less than 0.01 and none of the RMQS samples exceed either of the ecological and health risk thresholds. However, there are some regions with elevated concentrations which can be related to volcanic material, natural mineralizations and industrial contamination. These regions are more diffuse than the hot-spots of arsenic reflecting the greater volatility of mercury and therefore the greater ease with which it can be transported and redeposited. 
The maps provide a baseline against which future phases of the RMQS can be compared and highlight regions where the threat of soil contamination and its impacts should be more closely monitored. Copyright © 2017 Elsevier Ltd. All rights reserved.
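Mapping the probability that a legislative threshold is exceeded follows directly from a geostatistical prediction and its uncertainty. A minimal sketch assuming a Gaussian prediction distribution (analyses of skewed concentration data would typically transform first); the arsenic values below are hypothetical:

```python
import math

def exceedance_probability(pred_mean, pred_sd, threshold):
    """P(concentration > threshold) for a Gaussian kriging prediction
    with the given mean and standard deviation."""
    z = (threshold - pred_mean) / pred_sd
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

# Hypothetical arsenic prediction at one grid node (mg/kg)
p = exceedance_probability(pred_mean=28.0, pred_sd=6.0, threshold=25.0)
print(round(p, 3))  # predicted mean above the threshold, so p > 0.5
```

Evaluating this at every grid node, with the kriging variance as `pred_sd` squared, yields the exceedance-probability maps the survey reports.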

  13. Monitoring Conformance and Containment for Geological Carbon Storage: Can Technology Meet Policy and Public Requirements?

    NASA Astrophysics Data System (ADS)

    Lawton, D. C.; Osadetz, K.

    2014-12-01

    The Province of Alberta, Canada identified carbon capture and storage (CCS) as a key element of its 2008 Climate Change strategy. The target is a reduction in CO2 emissions of 139 Mt/year by 2050. To encourage uptake of CCS by industry, the province has provided partial funding to two demonstration-scale projects, namely the Quest Project by Shell and partners (CCS), and the Alberta Carbon Trunk Line Project (pipeline and CO2-EOR). Important to commercial-scale implementation of CCS will be the requirement to prove conformance and containment of the CO2 plume injected during the lifetime of the CCS project. This will be a challenge for monitoring programs. The Containment and Monitoring Institute (CaMI) is developing a Field Research Station (FRS) to calibrate various monitoring technologies for CO2 detection thresholds at relatively shallow depths. The objective assessed with the FRS is the sensitivity of these technologies for early detection of loss of containment from a deeper CO2 storage project. In this project, two injection wells will be drilled to sandstone reservoir targets at depths of 300 m and 700 m. Up to four observation wells will be drilled with monitoring instruments installed. Time-lapse surface and borehole monitoring surveys will be undertaken to evaluate the movement and fate of the CO2 plume. These will include seismic, microseismic, cross-well, electrical resistivity, electromagnetic, gravity, geodetic and geomechanical surveys. Initial baseline seismic data from the FRS will be presented.

  14. Optimal Design of Air Quality Monitoring Network and its Application in an Oil Refinery Plant: An Approach to Keep Health Status of Workers.

    PubMed

    ZoroufchiBenis, Khaled; Fatehifar, Esmaeil; Ahmadi, Javad; Rouhi, Alireza

    2015-01-01

    Industrial air pollution is a growing challenge to human health, especially in developing countries, where there is no systematic monitoring of air pollution. Given the importance of valid information on population exposure to air pollutants, it is important to design an optimal Air Quality Monitoring Network (AQMN) for assessing population exposure to air pollution and predicting the magnitude of the health risks to the population. A multi-pollutant method (implemented as a MATLAB program) was explored for configuring an AQMN to detect the highest level of pollution around an oil refinery plant. The method ranks potential monitoring sites (grids) according to their ability to represent the ambient concentration. Clusters of contiguous grids that exceed a threshold value were used to calculate the station dosage. The best configuration of the AQMN was selected based on the ratio of a station's dosage to the total dosage in the network. Six monitoring stations were needed to detect pollutant concentrations around the study area and to estimate the level and distribution of exposure in the population, with a total network efficiency of about 99%. An analysis of the design procedure showed that wind regimes have the greatest effect on the location of monitoring stations. The optimal AQMN enables authorities to implement an effective program of air quality management for protecting human health.

  15. Optimal Design of Air Quality Monitoring Network and its Application in an Oil Refinery Plant: An Approach to Keep Health Status of Workers

    PubMed Central

    ZoroufchiBenis, Khaled; Fatehifar, Esmaeil; Ahmadi, Javad; Rouhi, Alireza

    2015-01-01

    Background: Industrial air pollution is a growing challenge to human health, especially in developing countries, where there is no systematic monitoring of air pollution. Given the importance of valid information on population exposure to air pollutants, it is important to design an optimal Air Quality Monitoring Network (AQMN) for assessing population exposure to air pollution and predicting the magnitude of the health risks to the population. Methods: A multi-pollutant method (implemented as a MATLAB program) was explored for configuring an AQMN to detect the highest level of pollution around an oil refinery plant. The method ranks potential monitoring sites (grids) according to their ability to represent the ambient concentration. Clusters of contiguous grids that exceed a threshold value were used to calculate the station dosage. The best configuration of the AQMN was selected based on the ratio of a station's dosage to the total dosage in the network. Results: Six monitoring stations were needed to detect pollutant concentrations around the study area and to estimate the level and distribution of exposure in the population, with a total network efficiency of about 99%. An analysis of the design procedure showed that wind regimes have the greatest effect on the location of monitoring stations. Conclusion: The optimal AQMN enables authorities to implement an effective program of air quality management for protecting human health. PMID:26933646
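
The dosage-ratio selection step can be sketched as a greedy ranking. The paper's actual procedure works on clusters of contiguous exceedance grids; the toy version below simplifies a candidate's "dosage" to the sum of its above-threshold concentrations over simulated episodes, and keeps adding the highest-dosage candidates until the target share of total network dosage is captured. All grid names and numbers are illustrative.

```python
def select_stations(conc, threshold, target_efficiency=0.99):
    """Greedy sketch: rank candidate grid cells by dosage (sum of
    above-threshold concentrations) and add stations until they capture
    the target share of the total dosage in the network."""
    dosage = {g: sum(c - threshold for c in series if c > threshold)
              for g, series in conc.items()}
    total = sum(dosage.values())
    chosen, captured = [], 0.0
    for g in sorted(dosage, key=dosage.get, reverse=True):
        if total == 0 or captured / total >= target_efficiency:
            break
        chosen.append(g)
        captured += dosage[g]
    return chosen, (captured / total if total else 0.0)

# Hypothetical 4-cell grid with per-episode concentrations (arbitrary units).
conc = {"g1": [5, 9, 2], "g2": [1, 1, 1], "g3": [8, 8, 8], "g4": [2, 3, 2]}
stations, efficiency = select_stations(conc, threshold=4.0)
```

Here two of the four candidate cells already capture all of the above-threshold dosage, mirroring how a small number of well-placed stations can reach a ~99% network efficiency.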

  16. Landscape and bio- geochemical strategy for monitoring transformation and reclamation of the soil mining sites

    NASA Astrophysics Data System (ADS)

    Korobova, Elena

    2010-05-01

    Sites of active or abandoned mining represent areas of considerable technogenic impact and require scientifically grounded organization of their monitoring and reclamation. The strategy for monitoring and reclamation depends on the scale and character of the physical, chemical and biological consequences of the disturbance. Geochemical studies for monitoring and rehabilitation of quarry-dump complexes should methodically account for the formation of new landforms and for changes in the circulation of the remobilized elements of the soil cover. The general strategy, however, should account for both the initial and the transformed landscape geochemical structure of the area, with due regard to the natural and the new content of chemical elements in the environmental components. For example, tailings and waste rock present new geochemical fields with distinctly different concentrations of chemical elements, which cause the formation of new geochemical barriers and landscapes. Colonization of the newly formed landscapes depends upon the new geochemical features of the technogenic environment and the adaptive ability of local and introduced flora. Newly formed biogeochemical anomalies require permanent monitoring not only of the anomaly itself but also of its impact zones. Spatial landscape geochemical monitoring, combined with biogeochemical criteria for threshold concentrations, is a helpful tool for decision making on the reclamation and operation of soil mining sites, providing long-term, ecologically sustainable development of the impact zone as a whole.

  17. Extending helium partial pressure measurement technology to JET DTE2 and ITER.

    PubMed

    Klepper, C C; Biewer, T M; Kruezi, U; Vartanian, S; Douai, D; Hillis, D L; Marcus, C

    2016-11-01

    The detection limit for helium (He) partial pressure monitoring via the Penning discharge optical emission diagnostic, mainly used for tokamak divertor effluent gas analysis, is shown here to extend to He concentrations down to 0.1% in predominantly deuterium effluents. This result from a dedicated laboratory study means that the technique can now be extended to intrinsic (non-injected) He produced as fusion reaction ash in deuterium-tritium experiments. The paper also examines threshold ionization mass spectroscopy as a potential backup to the optical technique, but finds that further development is needed to attain plasma pulse-relevant response times. Both studies are presented in the context of the continuing development of plasma pulse-resolving residual gas analysis for the upcoming JET deuterium-tritium campaign (DTE2) and for ITER.

  18. Getting the message across: using ecological integrity to communicate with resource managers

    USGS Publications Warehouse

    Mitchell, Brian R.; Tierney, Geraldine L.; Schweiger, E. William; Miller, Kathryn M.; Faber-Langendoen, Don; Grace, James B.

    2014-01-01

    This chapter describes and illustrates how concepts of ecological integrity, thresholds, and reference conditions can be integrated into a research and monitoring framework for natural resource management. Ecological integrity has been defined as a measure of the composition, structure, and function of an ecosystem in relation to the system’s natural or historical range of variation, as well as perturbations caused by natural or anthropogenic agents of change. Using ecological integrity to communicate with managers requires five steps, often implemented iteratively: (1) document the scale of the project and the current conceptual understanding and reference conditions of the ecosystem, (2) select appropriate metrics representing integrity, (3) define externally verified assessment points (metric values that signify an ecological change or need for management action) for the metrics, (4) collect data and calculate metric scores, and (5) summarize the status of the ecosystem using a variety of reporting methods. While we present the steps linearly for conceptual clarity, actual implementation of this approach may require addressing the steps in a different order or revisiting steps (such as metric selection) multiple times as data are collected. Knowledge of relevant ecological thresholds is important when metrics are selected, because thresholds identify where small changes in an environmental driver produce large responses in the ecosystem. Metrics with thresholds at or just beyond the limits of a system’s range of natural variability can be excellent, since moving beyond the normal range produces a marked change in their values. Alternatively, metrics with thresholds within but near the edge of the range of natural variability can serve as harbingers of potential change. Identifying thresholds also contributes to decisions about selection of assessment points. 
In particular, if there is a significant resistance to perturbation in an ecosystem, with threshold behavior not occurring until well beyond the historical range of variation, this may provide a scientific basis for shifting an ecological assessment point beyond the historical range. We present two case studies using ongoing monitoring by the US National Park Service Vital Signs program that illustrate the use of an ecological integrity approach to communicate ecosystem status to resource managers. The Wetland Ecological Integrity in Rocky Mountain National Park case study uses an analytical approach that specifically incorporates threshold detection into the process of establishing assessment points. The Forest Ecological Integrity of Northeastern National Parks case study describes a method for reporting ecological integrity to resource managers and other decision makers. We believe our approach has the potential for wide applicability for natural resource management.

  19. APPLICATION OF BAYESIAN AND GEOSTATISTICAL MODELING TO THE ENVIRONMENTAL MONITORING OF CS-137 AT THE IDAHO NATIONAL LABORATORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kara G. Eby

    2010-08-01

    At the Idaho National Laboratory (INL) Cs-137 concentrations above the U.S. Environmental Protection Agency risk-based threshold of 0.23 pCi/g may increase the risk of human mortality due to cancer. As a leader in nuclear research, the INL has been conducting nuclear activities for decades. Elevated anthropogenic radionuclide levels including Cs-137 are a result of atmospheric weapons testing, the Chernobyl accident, and nuclear activities occurring at the INL site. Therefore environmental monitoring and long-term surveillance of Cs-137 is required to evaluate risk. However, due to the large land area involved, frequent and comprehensive monitoring is limited. Developing a spatial model that predicts Cs-137 concentrations at unsampled locations will enhance the spatial characterization of Cs-137 in surface soils, provide guidance for an efficient monitoring program, and pinpoint areas requiring mitigation strategies. The predictive model presented herein is based on applied geostatistics using a Bayesian analysis of environmental characteristics across the INL site, which provides kriging spatial maps of both Cs-137 estimates and prediction errors. Comparisons are presented of two different kriging methods, showing that the use of secondary information (i.e., environmental characteristics) can provide improved prediction performance in some areas of the INL site.
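
The kriging machinery behind such prediction and error maps can be illustrated with a minimal ordinary-kriging solver. This is a generic sketch with an assumed exponential covariance model, not the study's Bayesian formulation with secondary information; one defining property it reproduces is exact interpolation, with zero kriging variance, at sampled locations.

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, rng=50.0):
    """Minimal ordinary-kriging sketch with an exponential covariance model
    C(h) = sill * exp(-h / rng); returns (prediction, kriging variance)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    C = sill * np.exp(-d / rng)
    c0 = sill * np.exp(-np.linalg.norm(xy - xy0, axis=1) / rng)
    # Ordinary-kriging system with a Lagrange multiplier for unbiasedness.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = C
    A[n, n] = 0.0
    b = np.append(c0, 1.0)
    w = np.linalg.solve(A, b)
    pred = float(w[:n] @ z)
    var = float(sill - w[:n] @ c0 - w[n])
    return pred, var

# Demo: predicting at a sampled location reproduces the sample exactly.
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
z = np.array([1.0, 2.0, 3.0])
pred, var = ordinary_kriging(xy, z, np.array([0.0, 0.0]))
```

The kriging variance grows with distance from the data, which is exactly the "prediction error" surface that guides where additional monitoring samples are most valuable.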

  20. Statistical models for fever forecasting based on advanced body temperature monitoring.

    PubMed

    Jordan, Jorge; Miro-Martinez, Pau; Vargas, Borja; Varela-Entrecanales, Manuel; Cuesta-Frau, David

    2017-02-01

    Body temperature monitoring provides health carers with key clinical information about the physiological status of patients. Temperature readings are taken periodically to detect febrile episodes and consequently implement the appropriate medical countermeasures. However, fever is often difficult to assess at early stages, or remains undetected until the next reading, possibly a few hours later. The objective of this article is to develop a statistical model to forecast fever before a temperature threshold is exceeded, to improve the therapeutic approach to the subjects involved. To this end, temperature series of 9 patients admitted to a general internal medicine ward were obtained with a continuous monitoring Holter device, collecting measurements of peripheral and core temperature once per minute. These series were used to develop different statistical models that could quantify the probability of a fever spike in the following 60 minutes. A validation series was collected to assess the accuracy of the models. Finally, the results were compared with the analysis of some series by experienced clinicians. Two different models were developed: a logistic regression model and a linear discriminant analysis model. Both exhibited a fever peak forecasting accuracy greater than 84%. When compared with the experts' assessment, both models identified 35 (97.2%) of 36 fever spikes. The models proposed are highly accurate in forecasting the appearance of fever spikes within a short period in patients with suspected or confirmed febrile-related illnesses. Copyright © 2016 Elsevier Inc. All rights reserved.
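
As a rough illustration of the first model type, a logistic regression can be fit to features extracted from a temperature series, here a centred temperature level and a short-term warming slope. The data, features, generative rule, and fever threshold below are synthetic stand-ins, not the study's series or its actual feature set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for minute-by-minute monitoring: each sample carries a
# centred temperature level (deg C above 37.0) and a warming slope; the label
# marks whether a fever spike follows within the next hour (toy rule).
n = 400
level = rng.uniform(-1.0, 1.0, n)          # temperature - 37.0
slope = rng.uniform(-0.3, 0.8, n)          # deg C per 30 min
y = ((level + 2.0 * slope) > 1.2).astype(float)

X = np.column_stack([np.ones(n), level, slope])   # bias + two features
w = np.zeros(3)
for _ in range(5000):                              # plain batch gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / n

pred = (1.0 / (1.0 + np.exp(-X @ w))) > 0.5
accuracy = float((pred == (y > 0.5)).mean())
```

The fitted model outputs a probability of a fever spike in the next window, which is the quantity that would be compared against a clinical alerting threshold.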

  1. Diffraction measurements using the LHC Beam Loss Monitoring System

    NASA Astrophysics Data System (ADS)

    Kalliokoski, Matti

    2017-03-01

    The Beam Loss Monitoring (BLM) system of the Large Hadron Collider protects the machine from beam-induced damage by measuring the absorbed dose rates of beam losses, and by triggering a beam dump if the rates increase above the allowed threshold limits. Although the detection time scales are optimized for multi-turn losses, information on fast losses can be recovered from the loss data. In this paper, methods for using the BLM system in diffraction studies are discussed.
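
The multi-timescale thresholding such a system relies on can be sketched with running sums: the same loss signal is integrated over several window lengths, each with its own abort limit, so that both fast spikes and slow accumulations can raise a dump request. The window lengths and limits below are arbitrary illustrations, not LHC settings.

```python
from collections import deque

class RunningSumMonitor:
    """Sketch of a beam-loss interlock: keep running sums of the loss signal
    over several window lengths (in samples) and trip when any window's sum
    exceeds its own threshold."""
    def __init__(self, thresholds):
        # thresholds: {window_length_in_samples: max_allowed_sum}
        self.windows = {w: deque(maxlen=w) for w in thresholds}
        self.thresholds = thresholds

    def add_sample(self, loss):
        tripped = []
        for w, buf in self.windows.items():
            buf.append(loss)
            if sum(buf) > self.thresholds[w]:
                tripped.append(w)
        return tripped  # empty list means no dump request

mon = RunningSumMonitor({2: 5.0, 8: 12.0})
trips = [mon.add_sample(x) for x in [1, 1, 1, 4, 4, 1, 1, 1]]
```

In the trace above, the short window catches the fast two-sample spike while the long window later trips on the slow accumulation, even though no single sample is alarming on its own.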

  2. Using Color, Texture and Object-Based Image Analysis of Multi-Temporal Camera Data to Monitor Soil Aggregate Breakdown

    PubMed Central

    Ymeti, Irena; van der Werff, Harald; Shrestha, Dhruba Pikha; Jetten, Victor G.; Lievens, Caroline; van der Meer, Freek

    2017-01-01

    Remote sensing has shown its potential to assess soil properties and is a fast and non-destructive method for monitoring soil surface changes. In this paper, we monitor soil aggregate breakdown under natural conditions. From November 2014 to February 2015, images and weather data were collected on a daily basis from five soils susceptible to detachment (Silty Loam with various organic matter content, Loam and Sandy Loam). Three techniques that vary in image processing complexity and user interaction were tested for the ability of monitoring aggregate breakdown. Considering that the soil surface roughness causes shadow cast, the blue/red band ratio is utilized to observe the soil aggregate changes. Dealing with images with high spatial resolution, image texture entropy, which reflects the process of soil aggregate breakdown, is used. In addition, the Huang thresholding technique, which allows estimation of the image area occupied by soil aggregate, is performed. Our results show that all three techniques indicate soil aggregate breakdown over time. The shadow ratio shows a gradual change over time with no details related to weather conditions. Both the entropy and the Huang thresholding technique show variations of soil aggregate breakdown responding to weather conditions. Using data obtained with a regular camera, we found that freezing–thawing cycles are the cause of soil aggregate breakdown. PMID:28556803

  3. Using Color, Texture and Object-Based Image Analysis of Multi-Temporal Camera Data to Monitor Soil Aggregate Breakdown.

    PubMed

    Ymeti, Irena; van der Werff, Harald; Shrestha, Dhruba Pikha; Jetten, Victor G; Lievens, Caroline; van der Meer, Freek

    2017-05-30

    Remote sensing has shown its potential to assess soil properties and is a fast and non-destructive method for monitoring soil surface changes. In this paper, we monitor soil aggregate breakdown under natural conditions. From November 2014 to February 2015, images and weather data were collected on a daily basis from five soils susceptible to detachment (Silty Loam with various organic matter content, Loam and Sandy Loam). Three techniques that vary in image processing complexity and user interaction were tested for the ability of monitoring aggregate breakdown. Considering that the soil surface roughness causes shadow cast, the blue/red band ratio is utilized to observe the soil aggregate changes. Dealing with images with high spatial resolution, image texture entropy, which reflects the process of soil aggregate breakdown, is used. In addition, the Huang thresholding technique, which allows estimation of the image area occupied by soil aggregate, is performed. Our results show that all three techniques indicate soil aggregate breakdown over time. The shadow ratio shows a gradual change over time with no details related to weather conditions. Both the entropy and the Huang thresholding technique show variations of soil aggregate breakdown responding to weather conditions. Using data obtained with a regular camera, we found that freezing-thawing cycles are the cause of soil aggregate breakdown.
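
Of the three techniques, the texture measure is the easiest to reproduce: the Shannon entropy of the grey-level histogram is zero for a perfectly uniform surface and rises as aggregates break down into a mixture of tones. A minimal sketch (not the paper's exact implementation):

```python
import numpy as np

def shannon_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram: 0 for a uniform
    surface, higher for more broken-up, textured surfaces."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

flat = np.full((32, 32), 100)                                  # one tone
mixed = np.concatenate([np.full((32, 16), 40),                 # two tones,
                        np.full((32, 16), 200)], axis=1)       # half and half
e_flat, e_mixed = shannon_entropy(flat), shannon_entropy(mixed)
```

Tracking this value over the daily image series gives a single number per day whose rise mirrors the progress of aggregate breakdown.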

  4. Pacing threshold changes after transvenous catheter countershock.

    PubMed

    Yee, R; Jones, D L; Klein, G J

    1984-02-01

    The serial changes in pacing threshold and R-wave amplitude were examined after insertion of a countershock catheter in 12 patients referred for management of recurrent ventricular tachyarrhythmias. In 6 patients, values before and immediately after catheter countershock were monitored. Pacing threshold increased (from 1.4 +/- 0.2 to 2.4 +/- 0.5 V, mean +/- standard error of the mean, p less than 0.05) while the R-wave amplitude decreased (bipolar R wave from 5.9 +/- 1.1 to 3.4 +/- 0.7 mV, p less than 0.01; unipolar R wave recorded from the distal ventricular electrode from 8.9 +/- 1.8 to 4.6 +/- 1.2 mV, p less than 0.01; and proximal ventricular electrode from 7.7 +/- 1.5 to 5.0 +/- 1.0 mV, p less than 0.01). A return to control values occurred within 10 minutes. In all patients, pacing threshold increased by 154 +/- 30% (p less than 0.001) during the first 7 days that the catheter was in place. It is concluded that catheter countershock causes an acute increase in pacing threshold and decrease in R-wave amplitude. A catheter used for countershock may not be acceptable as a backup pacing catheter.

  5. Estimating phonation threshold pressure.

    PubMed

    Fisher, K V; Swank, P R

    1997-10-01

    Phonation threshold pressure (PTP) is the minimum subglottal pressure required to initiate vocal fold oscillation. Although potentially useful clinically, PTP is difficult to estimate noninvasively because of limitations to vocal motor control near the threshold of soft phonation. Previous investigators observed, for example, that trained subjects were unable to produce flat, consistent oral pressure peaks during /pae/ syllable strings when they attempted to phonate as softly as possible (Verdolini-Marston, Titze, & Druker, 1990). The present study aimed to determine if nasal airflow or vowel context affected phonation threshold pressure as estimated from oral pressure (Smitheran & Hixon, 1981) in 5 untrained female speakers with normal velopharyngeal and voice function. Nasal airflow during /p/ occlusion was observed for 3 of 5 participants when they attempted to phonate near threshold pressure. When the nose was occluded, nasal airflow was reduced or eliminated during /p/; however, individuals then evidenced compensatory changes in glottal adduction and/or respiratory effort that may be expected to alter PTP estimates. Results demonstrate the importance of monitoring nasal flow (or the flow zero point in undivided masks) when obtaining PTP measurements noninvasively. Results also highlight the need to pursue improved methods for noninvasive estimation of PTP.

  6. A classification model of Hyperion image base on SAM combined decision tree

    NASA Astrophysics Data System (ADS)

    Wang, Zhenghai; Hu, Guangdao; Zhou, YongZhang; Liu, Xin

    2009-10-01

    Monitoring the Earth using imaging spectrometers has necessitated more accurate analyses and new applications of remote sensing. A very high-dimensional input space requires an exponentially large amount of data to adequately and reliably represent the classes in that space; moreover, as the input dimensionality increases, the hypothesis space grows exponentially, which makes classification performance highly unreliable. Classification of hyperspectral images with traditional algorithms is therefore challenging, and new algorithms have to be developed for hyperspectral data classification. The Spectral Angle Mapper (SAM) is a physically based spectral classification that uses an n-dimensional angle to match pixels to reference spectra. The algorithm determines the spectral similarity between two spectra by calculating the angle between them, treating them as vectors in a space with dimensionality equal to the number of bands. The key difficulty is that the SAM threshold must be defined manually, and the classification precision depends on how well that threshold is chosen. To resolve this problem, this paper proposes a new automatic classification model for remote sensing images using SAM combined with a decision tree. It automatically chooses an appropriate SAM threshold and improves the classification precision of SAM based on analysis of field spectra. The test area, located in Heqing, Yunnan, was imaged by the EO-1 Hyperion imaging spectrometer using 224 bands in the visible and near infrared. The area included limestone areas, rock fields, soil and forests, and was classified into four different vegetation and soil types. The results show that this method chooses an appropriate SAM threshold and effectively eliminates the disturbance and influence of unwanted objects, improving the classification precision. Compared with the likelihood classification based on field survey data, the classification precision of this model is 9.9% higher.
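
The SAM similarity measure itself is a short calculation: the angle between a pixel spectrum and a reference spectrum, treated as vectors in band space, with a pixel assigned to the closest reference only if the angle falls below the threshold. The reference names, band values, and threshold below are illustrative, not the paper's Hyperion data.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between a pixel spectrum and a
    reference spectrum, treated as vectors in band space."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sam_classify(pixel, references, threshold):
    """Assign the pixel to the closest reference class if its angle is below
    the threshold, else leave it unclassified (None)."""
    angles = {name: spectral_angle(pixel, ref) for name, ref in references.items()}
    best = min(angles, key=angles.get)
    return best if angles[best] < threshold else None

refs = {"limestone": np.array([0.30, 0.35, 0.40, 0.45]),
        "forest":    np.array([0.05, 0.08, 0.40, 0.45])}
label = sam_classify(np.array([0.29, 0.36, 0.41, 0.44]), refs, threshold=0.10)
```

Because the angle depends only on spectral shape, not magnitude, SAM is insensitive to overall illumination differences; everything then hinges on the threshold, which is the quantity the paper's decision-tree step chooses automatically.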

  7. Validation of a novel wearable, wireless technology to estimate oxygen levels and lactate threshold power in the exercising muscle.

    PubMed

    Farzam, Parisa; Starkweather, Zack; Franceschini, Maria A

    2018-04-01

    There is a growing interest in monitoring muscle oxygen saturation (SmO2), which is a localized measure of muscle oxidative metabolism and can be acquired continuously and noninvasively using near-infrared spectroscopy (NIRS) methods. Most NIRS systems are cumbersome, expensive, fiber-coupled devices, with use limited to lab settings. A novel, low-cost, wireless, wearable device has been developed for use in athletic training. In this study, we evaluate the advantages and limitations of this new simple continuous-wave (CW) NIRS device with respect to a benchtop, frequency-domain near-infrared spectroscopy (FDNIRS) system. Oxygen saturation and hemoglobin/myoglobin concentration in the exercising muscles of 17 athletic individuals were measured simultaneously with the two systems, while subjects performed an incremental test on a stationary cycle ergometer. In addition, blood lactate concentration was measured at the end of each increment with a lactate analyzer. During exercise, the correlation coefficients of the SmO2 and hemoglobin/myoglobin concentrations between the two systems were over 0.70. We also found both systems were insensitive to the presence of thin layers of varying absorption, mimicking different skin colors. Neither system was able to predict the athletes' lactate threshold power accurately by simply using SmO2 thresholds. Instead, the proprietary software of the wearable device was able to predict the athletes' lactate threshold power within half of one power increment of the cycling test. These results indicate this novel wearable device may provide a physiological indicator of an athlete's exertion. © 2018 The Authors. Physiological Reports published by Wiley Periodicals, Inc. on behalf of The Physiological Society and the American Physiological Society.

  8. A Novel approach for monitoring cyanobacterial blooms using an ensemble based system from MODIS imagery downscaled to 250 metres spatial resolution

    NASA Astrophysics Data System (ADS)

    El Alem, A.; Chokmani, K.; Laurion, I.; El-Adlouni, S. E.

    2014-12-01

    Because inland freshwaters are sensitive to harmful algal bloom (HAB) development and standard monitoring programs have limited coverage, remote sensing data have become increasingly used for monitoring HAB extent. Usually, HAB monitoring using remote sensing data is based on empirical and semi-empirical models. Development of such models requires a great number of continuous in situ measurements to reach an acceptable accuracy. However, ministries and water management organizations often use two thresholds, established by the World Health Organization, to determine water quality. Consequently, the available data are ordinal («semi-qualitative») and mostly unexploited. Use of such databases with remote sensing data and statistical classification algorithms can produce hazard management maps linked to the presence of cyanobacteria. Unlike standard classification algorithms, which are generally unstable, classifiers based on ensemble systems are more general and stable. In the present study, an ensemble-based classifier was developed and compared to a standard classification method called CART (Classification and Regression Tree) in a context of HAB monitoring in freshwaters using MODIS images downscaled to 250 m spatial resolution and ordinal in situ data. Calibration and validation data on cyanobacteria densities were collected by the Ministère du Développement durable, de l'Environnement et de la Lutte contre les changements climatiques on 22 water bodies between 2000 and 2010. These data comprise three density classes: waters poorly (< 20,000 cells mL-1), moderately (20,000 - 100,000 cells mL-1), and highly (> 100,000 cells mL-1) loaded with cyanobacteria. The results highlighted that inland waters exhibit distinct spectral responses, allowing them to be classified into the three classes above for water quality monitoring. Although the accuracy (Kappa index = 0.86) of the proposed approach is slightly lower than that of the CART algorithm (Kappa index = 0.87), its robustness is higher, with a standard deviation of 0.05 versus 0.06, specifically when applied to MODIS images. A new accurate, robust, and quick approach is thus proposed for daily near-real-time monitoring of HAB in southern Quebec freshwaters.
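
The Kappa index used to compare the two classifiers is Cohen's kappa: observed agreement from a confusion matrix, corrected for the agreement expected by chance. A generic sketch with toy counts for the three density classes (not the study's data):

```python
import numpy as np

def cohens_kappa(confusion):
    """Kappa index of agreement from a confusion matrix (rows = reference
    classes, columns = predicted classes)."""
    m = np.asarray(confusion, dtype=float)
    total = m.sum()
    po = np.trace(m) / total                       # observed agreement
    pe = (m.sum(0) * m.sum(1)).sum() / total**2    # chance agreement
    return float((po - pe) / (1.0 - pe))

# Toy 3-class example (poorly / moderately / highly loaded waters).
kappa = cohens_kappa([[40, 5, 0],
                      [4, 30, 6],
                      [0, 5, 10]])
```

Because kappa discounts chance agreement, it is a fairer score than raw accuracy when, as here, one class (low-density waters) dominates the samples.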

  9. Monitoring Architectural Heritage by Wireless Sensors Networks: San Gimignano — A Case Study

    PubMed Central

    Mecocci, Alessandro; Abrardo, Andrea

    2014-01-01

    This paper describes a wireless sensor network (WSN) used to monitor the health state of architectural heritage in real-time. The WSN has been deployed and tested on the “Rognosa” tower in the medieval village of San Gimignano, Tuscany, Italy. This technology, being non-invasive, mimetic, and long lasting, is particularly well suited for long term monitoring and on-line diagnosis of the conservation state of heritage buildings. The proposed monitoring system comprises radio-equipped nodes linked to suitable sensors capable of monitoring crucial parameters like: temperature, humidity, masonry cracks, pouring rain, and visual light. The access to data is granted by a user interface for remote control. The WSN can autonomously send remote alarms when predefined thresholds are reached. PMID:24394600

  10. Pool desiccation and developmental thresholds in the common frog, Rana temporaria.

    PubMed

    Lind, Martin I; Persbo, Frida; Johansson, Frank

    2008-05-07

    The developmental threshold is the minimum size or condition that a developing organism must have reached in order for a life-history transition to occur. Although developmental thresholds have been observed for many organisms, inter-population variation among natural populations has not been examined. Since isolated populations can be subjected to strong divergent selection, population divergence in developmental thresholds can be predicted if environmental conditions favour fast or slow developmental time in different populations. Amphibian metamorphosis is a well-studied life-history transition, and using a common garden approach we compared the development time and the developmental threshold of metamorphosis in four island populations of the common frog Rana temporaria: two populations originating from islands with only temporary breeding pools and two from islands with permanent pools. As predicted, tadpoles from time-constrained temporary pools had a genetically shorter development time than those from permanent pools. Furthermore, the variation in development time among females from temporary pools was low, consistent with the action of selection on rapid development in this environment. However, there were no clear differences in the developmental thresholds between the populations, indicating that the main response to life in a temporary pool is to shorten the development time.

  11. 24 CFR 1003.302 - Project specific threshold requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Project specific threshold requirements. 1003.302 Section 1003.302 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... Purpose Grant Application and Selection Process § 1003.302 Project specific threshold requirements. (a...

  12. Methods, apparatus and system for selective duplication of subtasks

    DOEpatents

    Andrade Costa, Carlos H.; Cher, Chen-Yong; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.

    2016-03-29

    A method for selective duplication of subtasks in a high-performance computing system includes: monitoring a health status of one or more nodes in a high-performance computing system, where one or more subtasks of a parallel task execute on the one or more nodes; identifying one or more nodes as having a likelihood of failure which exceeds a first prescribed threshold; selectively duplicating the one or more subtasks that execute on the one or more nodes having a likelihood of failure which exceeds the first prescribed threshold; and notifying a messaging library that one or more subtasks were duplicated.
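
The selection logic in the claim can be sketched in a few lines: nodes whose estimated failure likelihood exceeds the prescribed threshold are flagged, and the subtasks running on them are listed for duplication (after which a messaging library would be notified). The node names, probabilities, and one-task-per-node mapping below are hypothetical simplifications.

```python
def plan_duplication(node_health, subtask_map, failure_threshold=0.2):
    """Sketch of the selection step: flag nodes whose failure likelihood
    exceeds the threshold and list their subtasks for duplication."""
    risky = {n for n, p_fail in node_health.items() if p_fail > failure_threshold}
    return [task for node, task in subtask_map.items() if node in risky]

# Hypothetical health estimates (probability of failure) for four nodes.
health = {"n0": 0.05, "n1": 0.35, "n2": 0.10, "n3": 0.60}
to_duplicate = plan_duplication(health, {"n0": "t0", "n1": "t1",
                                         "n2": "t2", "n3": "t3"})
```

Duplicating only the at-risk subtasks, rather than the whole parallel task, is what makes the scheme "selective": the resilience cost scales with the number of unhealthy nodes.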

  13. Ontogenetic improvement of visual function in the medaka Oryzias latipes based on an optomotor testing system for larval and adult fish

    USGS Publications Warehouse

    Carvalho, Paulo S. M.; Noltie, Douglas B.; Tillitt, D.E.

    2002-01-01

    We developed a system for evaluation of visual function in larval and adult fish. Both optomotor (swimming) and optokinetic (eye movement) responses were monitored and recorded using a system of rotating stripes. The system allowed manipulation of factors such as width of the stripes used, rotation speed of the striped drum, and light illuminance levels within both the scotopic and photopic ranges. Precise control of these factors allowed quantitative measurements of visual acuity and motion detection. Using this apparatus, we tested the hypothesis that significant posthatch ontogenetic improvements in visual function occur in the medaka Oryzias latipes, and also that this species shows significant in ovo neuronal development. Significant improvements in the acuity angle alpha (ability to discriminate detail) were observed from approximately 5 degrees at hatch to 1 degree in the oldest adult stages. In addition, we measured a significant improvement in flicker fusion thresholds (motion detection skills) between larval and adult life stages within both the scotopic and photopic ranges of light illuminance. Ranges of flicker fusion thresholds (X±SD) at log I=1.96 (photopic) varied from 37.2±1.6 cycles/s in young adults to 18.6±1.6 cycles/s in young larvae 10 days posthatch. At log I=−2.54 (scotopic), flicker fusion thresholds varied from 5.8±0.7 cycles/s in young adults to 1.7±0.4 cycles/s in young larvae 10 days posthatch. Light sensitivity increased approximately 2.9 log units from early hatched larval stages to adults. The demonstrated ontogenetic improvements in visual function probably enable the fish to explore new resources, thereby enlarging their fundamental niche.

  14. A New Vegetation Segmentation Approach for Cropped Fields Based on Threshold Detection from Hue Histograms

    PubMed Central

    Hassanein, Mohamed; El-Sheimy, Naser

    2018-01-01

Over the last decade, the use of unmanned aerial vehicle (UAV) technology has evolved significantly in different applications, as it provides a platform capable of combining the benefits of terrestrial and aerial remote sensing. Such technology has therefore been established as an important source of data collection for different precision agriculture (PA) applications such as crop health monitoring and weed management. Generally, these PA applications depend on performing a vegetation segmentation process as an initial step, which aims to detect the vegetation objects in collected agricultural field images. The main result of the vegetation segmentation process is a binary image, where vegetation is presented in white and the remaining objects are presented in black. Such a process can easily be performed using different vegetation indexes derived from multispectral imagery. Recently, to expand the use of UAV imagery systems for PA applications, it has become important to reduce the cost of such systems by using low-cost RGB cameras. Thus, developing vegetation segmentation techniques for RGB images is a challenging problem. This paper introduces a new vegetation segmentation methodology for low-cost UAV RGB images, which depends on the hue color channel. The proposed methodology follows the assumption that the colors in any agricultural field image can be divided into vegetation and non-vegetation colors. Therefore, four main steps are developed to detect five different threshold values from the hue histogram of the RGB image; these thresholds are able to discriminate the dominant color, either vegetation or non-vegetation, within the agricultural field image. The achieved results show that the proposed methodology generates accurate and stable vegetation segmentation, with a mean accuracy of 87.29% and a standard deviation of 12.5%. PMID:29670055
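The core idea of hue-based vegetation masking can be sketched as follows; this is a simplified single-band stand-in, not the paper's four-step, five-threshold method, and the green hue band chosen here is an illustrative assumption:

```python
import numpy as np

def vegetation_mask_from_hue(hue, low=0.17, high=0.50):
    """Mark pixels whose hue falls inside a green band as vegetation.
    `hue` is in [0, 1); `low`/`high` are illustrative thresholds, not the
    five thresholds derived from the hue histogram in the paper."""
    return (hue >= low) & (hue < high)

# Toy 2x2 hue image: two greenish pixels, one red, one blue.
hue = np.array([[0.05, 0.33],
                [0.30, 0.66]])
mask = vegetation_mask_from_hue(hue)   # binary image: True = vegetation
coverage = mask.mean()                 # fraction classified as vegetation
```

In the paper's method, the band edges would instead be located automatically from peaks and valleys of the image's hue histogram, which adapts the segmentation to whichever color (vegetation or soil) dominates the field image.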

  15. Dose-response effects of corneal anesthetics.

    PubMed

    Polse, K A; Keener, R J; Jauregui, M J

    1978-01-01

Using double-masked procedures, the dose-response curves for 0.1, 0.2, and 0.4% benoxinate and 0.125, 0.25, and 0.50% proparacaine hydrochloride were determined by monitoring changes in corneal touch threshold after applying each anesthetic. The level of corneal anesthesia necessary for applanation tonometry was also determined. The maximum increase in threshold that could be measured following instillation of 50 μl of the drug was 200 mg/mm². All six anesthetic solutions produced this degree of decreased corneal sensitivity. Recovery from the anesthetic was exponential for all concentrations; however, the lower doses had the shortest duration. For applanation tonometry, the corneal touch threshold must be 75 mg/mm² or higher. We conclude that a quarter to a half of the commonly used anesthetic dose is sufficient for routine tonometric evaluation.

  16. Monitoring of HIV viral load, CD4 cell count, and clinical assessment versus clinical monitoring alone for antiretroviral therapy in low-resource settings (Stratall ANRS 12110/ESTHER): a cost-effectiveness analysis.

    PubMed

    Boyer, Sylvie; March, Laura; Kouanfack, Charles; Laborde-Balen, Gabrièle; Marino, Patricia; Aghokeng, Avelin Fobang; Mpoudi-Ngole, Eitel; Koulla-Shiro, Sinata; Delaporte, Eric; Carrieri, Maria Patrizia; Spire, Bruno; Laurent, Christian; Moatti, Jean-Paul

    2013-07-01

In low-income countries, the use of laboratory monitoring of patients taking antiretroviral therapy (ART) remains controversial in view of persistent resource constraints. The Stratall trial did not show that clinical monitoring alone was non-inferior to laboratory and clinical monitoring in terms of immunological recovery. We aimed to evaluate the costs and cost-effectiveness of the ART monitoring approaches assessed in the Stratall trial. The randomised, controlled, non-inferiority Stratall trial was done in a decentralised setting in Cameroon. Between May 23, 2006, and Jan 31, 2008, ART-naive adults were randomly assigned (1:1) to clinical monitoring (CLIN) or viral load and CD4 cell count plus clinical monitoring (LAB) and followed up for 24 months. We calculated costs, number of life-years saved (LYS), and incremental cost-effectiveness ratios (ICERs) with data from patients who had been followed up for at least 6 months. We considered two cost scenarios in which viral load plus CD4 cell count tests cost either US$95 (scenario 1; Abbott RealTime HIV-1 assay) or $63 (scenario 2; generic assay). We compared ICERs with a WHO-recommended threshold of three times the per-person gross domestic product (GDP) for Cameroon ($3670-3800) and an alternative lower threshold of $2385 to determine cost-effectiveness. We assessed uncertainty with one-way sensitivity analyses and cost-effectiveness acceptability curves. 188 participants who underwent LAB and 197 who underwent CLIN were followed up for at least 6 months. In scenario 1, LAB increased costs by a mean of $489 (SD 430) per patient and saved 0.103 life-years compared with CLIN (ICER of $4768 [95% CI 3926-5613] per LYS). In scenario 2, the incremental mean cost of LAB was $343 (SD 425), i.e., an ICER of $3339 (95% CI 2507-4173) per LYS. A combined strategy in which LAB would only be used in patients starting ART with a CD4 count of 200 cells per μL or fewer suggests that 0.120 life-years would be saved at an additional cost of $259 per patient in scenario 1 (ICER of $2167 [95% CI 1314-3020] per LYS) and $181 in scenario 2 (ICER of $1510 [95% CI 692-2329] per LYS) when compared with CLIN. Laboratory monitoring was not cost effective in 2006-10 compared with clinical monitoring when the Abbott RealTime HIV-1 assay was used, according to the $3670 cost-effectiveness threshold (three times per-person GDP in Cameroon), but it might be cost effective if a generic in-house assay is used. Funded by the French National Agency for Research on AIDS and Viral Hepatitis (ANRS) and Ensemble pour une Solidarité Thérapeutique Hospitalière En Réseau (ESTHER). Copyright © 2013 Elsevier Ltd. All rights reserved.
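The ICERs above are simply incremental cost divided by incremental life-years saved; a minimal check against the abstract's rounded means (the small discrepancies with the published ICERs reflect rounding of the reported figures, which were computed from the full patient-level data):

```python
def icer(delta_cost, delta_lys):
    """Incremental cost-effectiveness ratio: extra cost per life-year saved."""
    return delta_cost / delta_lys

# Means quoted in the abstract (US$ per patient, life-years saved).
scenario1 = icer(489, 0.103)   # ~$4748/LYS; the paper reports $4768
scenario2 = icer(343, 0.103)   # ~$3330/LYS; the paper reports $3339
combined1 = icer(259, 0.120)   # ~$2158/LYS; the paper reports $2167

# Scenario 1 exceeds the 3x-GDP threshold ($3670-3800): not cost effective.
not_cost_effective = scenario1 > 3800
```

The same comparison against the $2385 alternative threshold shows why only the combined CD4-restricted strategy clears the stricter bar.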

  17. Ozone levels in the Spanish Sierra de Guadarrama mountain range are above the thresholds for plant protection: analysis at 2262, 1850, and 995 m a.s.l.

    PubMed

    Elvira, S; González-Fernández, I; Alonso, R; Sanz, J; Bermejo-Bermejo, V

    2016-10-01

The Sierra de Guadarrama mountain range, located 60 km from the city of Madrid (Spain), includes highly valuable ecosystems along an altitude gradient, some of them protected within the Sierra de Guadarrama National Park. The characteristic Mediterranean climatic conditions and the precursors emitted from Madrid favor high photochemical production of ozone (O3) in the region. However, very little information is available about the patterns and levels of O3 and other air pollutants in the high-elevation areas and their potential effects on vegetation. Ozone levels were monitored at three altitudes (2262, 1850, and 995 m a.s.l.) for at least 3 years within the 2005-2011 period. NOx and SO2 were also recorded at the highest- and lowest-altitude sites. Despite the inter-annual and seasonal variations detected in O3 concentrations, the study revealed that the Sierra de Guadarrama is exposed to chronic O3 pollution. The two high-elevation sites showed high O3 levels even in winter and at nighttime, with low correlation with local meteorological variables. At the lower-elevation site, O3 levels were more closely related to local meteorological and pollution conditions. Ozone concentrations at the three sites exceeded the thresholds for the protection of human health and vegetation according to the European Air Quality Directive (EU/50/2008) and the thresholds for vegetation protection of the CLRTAP. Ozone should therefore be considered a stress factor for the health of the Sierra de Guadarrama mountain ecosystems. Furthermore, since O3 levels at the foothills differ from concentrations at high elevations, monitoring stations in mountain ranges should be incorporated into regional air quality monitoring networks.
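Counting exceedance days against an 8-hour-mean ozone threshold, as used in directive-style assessments, can be sketched as follows; the 120 μg/m³ value is the EU directive's long-term objective for human-health protection, and this simplified version ignores data-capture rules and 8-hour windows that cross midnight:

```python
import numpy as np

def exceedance_days(hourly, threshold=120.0):
    """Count days whose maximum 8-hour running-mean O3 concentration
    (ug/m3) exceeds the threshold. Simplified: whole days only."""
    hourly = np.asarray(hourly, dtype=float)
    days = 0
    for start in range(0, len(hourly) - 23, 24):
        day = hourly[start:start + 24]
        running = np.convolve(day, np.ones(8) / 8, mode="valid")
        if running.max() > threshold:
            days += 1
    return days

# Two synthetic days: one clean, one with an afternoon O3 peak.
clean = [60.0] * 24
peak = [60.0] * 12 + [150.0] * 8 + [60.0] * 4
n = exceedance_days(clean + peak)   # the peak day exceeds -> 1
```

Vegetation-protection metrics such as the CLRTAP's AOT40 accumulate hourly excesses over 40 ppb during daylight hours instead, so the two thresholds can be exceeded independently.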

  18. Therapeutic drug monitoring of thiopurine metabolites in adult thiopurine tolerant IBD patients on maintenance therapy.

    PubMed

    Gilissen, Lennard P L; Wong, Dennis R; Engels, Leopold G J B; Bierau, Jörgen; Bakker, Jaap A; Paulussen, Aimée D C; Romberg-Camps, Mariëlle J; Stronkhorst, Arnold; Bus, Paul; Bos, Laurens P; Hooymans, Piet M; Stockbrügger, Reinhold W; Neef, Cees; Masclee, Ad A M

    2012-07-01

Therapeutic drug monitoring of the active metabolites of the thiopurines azathioprine and 6-mercaptopurine is relatively new. The proposed therapeutic threshold level of the active 6-thioguanine nucleotides (6-TGN) is ≥235 pmol/8×10⁸ erythrocytes. The aim of this prospective cross-sectional study was to compare 6-TGN levels in adult thiopurine-tolerant IBD patients with an exacerbation with those in remission, and to determine the therapeutic 6-TGN cut-off level. One hundred IBD patients were included. Outcome measures were thiopurine metabolite levels, the calculated therapeutic 6-TGN cut-off level, CDAI/CAI scores, thiopurine dose, and TPMT enzyme activity. Forty-one patients had an exacerbation; 59 patients were in remission. In 17% of all patients, 6-TGN levels were compatible with non-compliance. The median 6-TGN levels were not significantly different between the exacerbation and remission groups (227 versus 263 pmol/8×10⁸ erythrocytes, p=0.29). The previously reported therapeutic 6-TGN cut-off level of 235 pmol/8×10⁸ erythrocytes was confirmed in this study. Twenty-six of the 41 patients (63%) with active disease had 6-TGN levels below this threshold, as did 24 of 59 IBD patients (41%) in clinical remission (p=0.04). Thiopurine non-compliance occurs frequently in both active and quiescent disease. 6-TGN levels below or above the therapeutic threshold are associated with a significantly higher chance of IBD exacerbation and remission, respectively. These data support the role of therapeutic drug monitoring in thiopurine maintenance therapy in IBD to reveal non-compliance or underdosing, and it can be used as a practical tool to optimize thiopurine therapy, especially in case of thiopurine non-response. Copyright © 2011 European Crohn's and Colitis Organisation. Published by Elsevier B.V. All rights reserved.

  19. Methods to achieve high interrater reliability in data collection from primary care medical records.

    PubMed

    Liddy, Clare; Wiens, Miriam; Hogg, William

    2011-01-01

    We assessed interrater reliability (IRR) of chart abstractors within a randomized trial of cardiovascular care in primary care. We report our findings, and outline issues and provide recommendations related to determining sample size, frequency of verification, and minimum thresholds for 2 measures of IRR: the κ statistic and percent agreement. We designed a data quality monitoring procedure having 4 parts: use of standardized protocols and forms, extensive training, continuous monitoring of IRR, and a quality improvement feedback mechanism. Four abstractors checked a 5% sample of charts at 3 time points for a predefined set of indicators of the quality of care. We set our quality threshold for IRR at a κ of 0.75, a percent agreement of 95%, or both. Abstractors reabstracted a sample of charts in 16 of 27 primary care practices, checking a total of 132 charts with 38 indicators per chart. The overall κ across all items was 0.91 (95% confidence interval, 0.90-0.92) and the overall percent agreement was 94.3%, signifying excellent agreement between abstractors. We gave feedback to the abstractors to highlight items that had a κ of less than 0.70 or a percent agreement less than 95%. No practice had to have its charts abstracted again because of poor quality. A 5% sampling of charts for quality control using IRR analysis yielded κ and agreement levels that met or exceeded our quality thresholds. Using 3 time points during the chart audit phase allows for early quality control as well as ongoing quality monitoring. Our results can be used as a guide and benchmark for other medical chart review studies in primary care.
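The two IRR measures used in the study can be computed as follows; the toy ratings below are illustrative, not study data:

```python
def percent_agreement(a, b):
    """Fraction of items on which two abstractors agree."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohen_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement
    expected from each rater's marginal category frequencies."""
    po = percent_agreement(a, b)
    cats = set(a) | set(b)
    pe = sum((a.count(c) / len(a)) * (b.count(c) / len(b)) for c in cats)
    return (po - pe) / (1 - pe)

# Toy reabstraction check: 10 binary chart indicators, one disagreement.
r1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
r2 = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
pa = percent_agreement(r1, r2)   # 0.9
k = cohen_kappa(r1, r2)          # ~0.78
```

This toy pair illustrates the study's "κ of 0.75, a percent agreement of 95%, or both" criterion: the κ of about 0.78 passes the first bar while the 90% agreement misses the second, so quality is still met via the κ arm.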

  20. Seasonal variations in body composition, maximal oxygen uptake, and gas exchange threshold in cross-country skiers.

    PubMed

    Polat, Metin; Korkmaz Eryılmaz, Selcen; Aydoğan, Sami

    2018-01-01

In order to ensure that athletes achieve their highest performance levels during competitive seasons, monitoring their long-term performance data is crucial for understanding the impact of ongoing training programs and evaluating training strategies. The present study was thus designed to investigate the variations in body composition, maximal oxygen uptake (VO2max), and gas exchange threshold values of cross-country skiers across training phases throughout a season. In total, 15 athletes who participate in international cross-country ski competitions voluntarily took part in this study. The athletes underwent incremental treadmill running tests at 3 different time points over a period of 1 year. The first measurements were obtained in July, during the first preparation period; the second in October, during the second preparation period; and the third in February, during the competition period. Body weight, body mass index (BMI), and body fat (%), as well as VO2max and gas exchange threshold, measured using the V-slope method during the incremental running tests, were assessed at all 3 time points. The collected data were analyzed using SPSS 20 software. Significant differences between the measurements were assessed using Friedman's two-way analysis of variance with a post hoc option. The athletes' body weights and BMI measurements at the third time point were significantly lower than at the second measurement (p<0.001). Moreover, the incremental running test time was significantly higher at the third measurement than at both the first (p<0.05) and the second (p<0.01) measurements. Similarly, the running speed during the test was significantly higher at the third measurement time point than at the first (p<0.05). Body fat (%), time to reach the gas exchange threshold, running speed at the gas exchange threshold, VO2max, oxygen consumption at the gas exchange threshold (VO2GET), maximal heart rate (HRmax), and heart rate at the gas exchange threshold (HRGET) did not significantly differ between the measurement time points (p>0.05). VO2max and gas exchange threshold values recorded during the third measurements, the timing of which coincided with the competitive season of the cross-country skiers, did not significantly change, but incremental running test time and running speed significantly increased while body weight and BMI significantly decreased. These results indicate that the cross-country skiers developed a tolerance for high-intensity exercise and reached their highest level of athletic performance during the competitive season.

  1. How is environmental conflict addressed by SIA?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrow, C.J., E-mail: c.j.barrow@swansea.ac.u

    2010-09-15

The fields of Environmental Conflict Management (ECM), Environmental Conflict Resolution (ECR), and Peace and Conflict Impact Assessment (PCIA) have become well established; however, as yet there has not been much use of Social Impact Assessment (SIA) to manage environmental conflicts. ECM, ECR, and PCIA are mainly undertaken when problems are advanced or, more likely, have run their course (post-conflict). This paper examines how conflict is addressed by SIA and whether there is potential to develop it for more proactive assessment of conflicts (pre-conflict or while things develop). SIA has the potential to identify and clarify the cause(s) of environmental and natural resources conflicts, and could possibly enable some avoidance or early mitigation. A promising approach may be for 'conflict-aware' SIA to watch for critical conflict stages or thresholds and to monitor stakeholders. Effective conflict-aware SIA might also significantly contribute to efforts to achieve sustainable development.

  2. Temporal and Spatial Variation in, and Population Exposure to, Summertime Ground-Level Ozone in Beijing

    PubMed Central

    Zheng, Youfei; Li, Ting; Wei, Li; Guan, Qing

    2018-01-01

Ground-level ozone pollution in Beijing has been causing concern among the public due to the risks posed to human health. This study analyzed the temporal and spatial distribution of, and investigated population exposure to, ground-level ozone. We analyzed hourly ground-level ozone data from 35 ambient air quality monitoring sites, including urban, suburban, background, and traffic monitoring sites, during the summer in Beijing from 2014 to 2017. The results showed that the four-year mean ozone concentrations for urban, suburban, background, and traffic monitoring sites were 95.1, 99.8, 95.9, and 74.2 μg/m³, respectively. A total of 44, 43, 45, and 43 days exceeded the Chinese National Ambient Air Quality Standards (NAAQS) threshold for ground-level ozone in 2014, 2015, 2016, and 2017, respectively. The mean ozone concentration was higher in suburban sites than in urban sites, and the traffic monitoring sites had the lowest concentration. The diurnal variation in ground-level ozone concentration at the four types of monitoring sites displayed a single-peak curve. The peak and valley values occurred at 3:00–4:00 p.m. and 7:00 a.m., respectively. Spatially, ground-level ozone concentrations decreased in gradient from the north to the south. Population exposure levels were calculated based on ground-level ozone concentrations and population data. Approximately 50.38%, 44.85%, and 48.49% of the total population of Beijing were exposed to ground-level ozone concentrations exceeding the Chinese NAAQS threshold in 2014, 2015, and 2016, respectively. PMID:29596366

  3. Alarm characterization for a continuous glucose monitor that replaces traditional blood glucose monitoring.

    PubMed

    McGarraugh, Geoffrey

    2010-01-01

Continuous glucose monitoring (CGM) devices available in the United States are approved for use as adjuncts to self-monitoring of blood glucose (SMBG); all CGM alarms require SMBG confirmation before treatment. In this report, an analysis method is proposed to determine the CGM threshold alarm accuracy required to eliminate SMBG confirmation. The proposed method builds on the Clinical and Laboratory Standards Institute (CLSI) guideline for evaluating CGM threshold alarms using data from an in-clinic study of subjects with type 1 diabetes. The CLSI method proposes a maximum time limit of ±30 minutes for the detection of hypo- and hyperglycemic events but does not include limits for glucose measurement accuracy. The International Standards Organization (ISO) standard for SMBG glucose measurement accuracy (ISO 15197) is ±15 mg/dl for glucose <75 mg/dl and ±20% for glucose ≥75 mg/dl. This standard was combined with the CLSI method to more completely characterize the accuracy of CGM alarms. Incorporating the ISO 15197 accuracy margins, FreeStyle Navigator CGM system alarms detected 70 mg/dl hypoglycemia within 30 minutes at a rate of 70.3%, with a false alarm rate of 11.4%. The device detected high glucose in the range of 140-300 mg/dl within 30 minutes at an average rate of 99.2%, with a false alarm rate of 2.1%. Self-monitoring of blood glucose confirmation is necessary for detecting and treating hypoglycemia with the FreeStyle Navigator CGM system, but at high glucose levels, SMBG confirmation adds little incremental value to CGM alarms. 2010 Diabetes Technology Society.
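A minimal sketch of how the CLSI ±30-minute timing criterion and the ISO 15197 accuracy margins combine; the event times and glucose readings below are hypothetical:

```python
def iso15197_margin(reference_mgdl):
    """ISO 15197 accuracy margin: +/-15 mg/dl below 75 mg/dl, else +/-20%."""
    return 15.0 if reference_mgdl < 75 else 0.20 * reference_mgdl

def alarm_detects_event(event_time_min, alarm_times_min, window_min=30):
    """CLSI-style timing criterion: some alarm within +/-30 min of the event."""
    return any(abs(t - event_time_min) <= window_min for t in alarm_times_min)

def within_iso_accuracy(cgm_mgdl, reference_mgdl):
    """Does the CGM reading agree with the reference within the ISO margin?"""
    return abs(cgm_mgdl - reference_mgdl) <= iso15197_margin(reference_mgdl)

# Hypothetical hypoglycemic event at t=100 min, reference glucose 68 mg/dl.
detected = alarm_detects_event(100, alarm_times_min=[75, 122])  # both within 30 min
accurate = within_iso_accuracy(58, 68)   # |58-68| = 10 <= 15 mg/dl margin
```

An alarm would only count as a confirmed true detection when both checks pass, which is how the report tightens the timing-only CLSI evaluation.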

  4. Temporal and Spatial Variation in, and Population Exposure to, Summertime Ground-Level Ozone in Beijing.

    PubMed

    Zhao, Hui; Zheng, Youfei; Li, Ting; Wei, Li; Guan, Qing

    2018-03-29

    Ground-level ozone pollution in Beijing has been causing concern among the public due to the risks posed to human health. This study analyzed the temporal and spatial distribution of, and investigated population exposure to, ground-level ozone. We analyzed hourly ground-level ozone data from 35 ambient air quality monitoring sites, including urban, suburban, background, and traffic monitoring sites, during the summer in Beijing from 2014 to 2017. The results showed that the four-year mean ozone concentrations for urban, suburban, background, and traffic monitoring sites were 95.1, 99.8, 95.9, and 74.2 μg/m³, respectively. A total of 44, 43, 45, and 43 days exceeded the Chinese National Ambient Air Quality Standards (NAAQS) threshold for ground-level ozone in 2014, 2015, 2016, and 2017, respectively. The mean ozone concentration was higher in suburban sites than in urban sites, and the traffic monitoring sites had the lowest concentration. The diurnal variation in ground-level ozone concentration at the four types of monitoring sites displayed a single-peak curve. The peak and valley values occurred at 3:00-4:00 p.m. and 7:00 a.m., respectively. Spatially, ground-level ozone concentrations decreased in gradient from the north to the south. Population exposure levels were calculated based on ground-level ozone concentrations and population data. Approximately 50.38%, 44.85%, and 48.49% of the total population of Beijing were exposed to ground-level ozone concentrations exceeding the Chinese NAAQS threshold in 2014, 2015, and 2016, respectively.

  5. Deriving a GPS Monitoring Time Recommendation for Physical Activity Studies of Adults.

    PubMed

    Holliday, Katelyn M; Howard, Annie Green; Emch, Michael; Rodríguez, Daniel A; Rosamond, Wayne D; Evenson, Kelly R

    2017-05-01

    Determining locations of physical activity (PA) is important for surveillance and intervention development, yet recommendations for using location recording tools like global positioning system (GPS) units are lacking. Specifically, no recommendation exists for the number of days study participants should wear a GPS to reliably estimate PA time spent in locations. This study used data from participants (N = 224, age = 18-85 yr) in five states who concurrently wore an ActiGraph GT1M accelerometer and a Qstarz BT-Q1000X GPS for three consecutive weeks to construct monitoring day recommendations through variance partitioning methods. PA bouts ≥10 min were constructed from accelerometer counts, and the location of GPS points was determined using a hand-coding protocol. Monitoring day recommendations varied by the type of location (e.g., participant homes vs parks) and the intensity of PA bouts considered (low and medium cut point moderate to vigorous PA [MVPA] bouts or high cut point vigorous PA [VPA] bouts). In general, minutes of all PA intensities spent in a given location could be measured with ≥80% reliability using 1-3 d of GPS monitoring for fitness facilities, schools, and footpaths. MVPA bout minutes in parks and roads required longer monitoring periods of 5-12 d. PA in homes and commercial areas required >19 d of monitoring. Twelve days of monitoring was found to reliably estimate minutes in both low and medium threshold MVPA as well as VPA bouts for many important built environment locations that can be targeted to increase PA at the population level. Minutes of PA in the home environment and commercial locations may be best assessed through other means given the lengthy estimated monitoring time required.

  6. Exploring applications of GPR methodology and uses in determining floodplain function of restored streams in the Gulf Coastal Plain, Alabama

    NASA Astrophysics Data System (ADS)

    Eckes, S. W.; Shepherd, S. L.

    2017-12-01

    Accurately characterizing subsurface structure and function of remediated floodplains is indispensable in understanding the success of stream restoration projects. Although many of these projects are designed to address increased storm water runoff due to urbanization, long term monitoring and assessment are often limited in scope and methodology. Common monitoring practices include geomorphic surveys, stream discharge, and suspended sediment loads. These data are comprehensive for stream monitoring but they do not address floodplain function in terms of infiltration and through flow. Developing noninvasive methods for monitoring floodplain moisture transfer and distribution will aid in current and future stream restoration endeavors. Ground penetrating radar (GPR) has been successfully used in other physiographic regions for noninvasive and continuous monitoring of (1) natural geomorphic environments including subsurface structure and landform change and (2) soil and turf management to monitor subsurface moisture content. We are testing the viability of these existing methods to expand upon the broad capabilities of GPR. Determining suitability will be done in three parts using GPR to (1) find known buried objects of typical materials used in remediation at measured depths, (2) understand GPR functionality in varying soil moisture content thresholds on turf plots, and (3) model reference, remediated, and impacted floodplains in a case study in the D'Olive Creek watershed located in Baldwin County, Alabama. We hypothesize that these methods will allow us to characterize moisture transfer from precipitation and runoff to the floodplain which is a direct function of floodplain health. The need for a methodology to monitor floodplains is widespread and with increased resolution and mobility, expanding GPR applications may help streamline remediation and monitoring practices.

  7. Periodic fluctuations in correlation-based connectivity density time series: Application to wind speed-monitoring network in Switzerland

    NASA Astrophysics Data System (ADS)

    Laib, Mohamed; Telesca, Luciano; Kanevski, Mikhail

    2018-02-01

In this paper, we study the periodic fluctuations of the connectivity density time series of a wind speed-monitoring network in Switzerland. Using the correlogram-based robust periodogram, annual periodic oscillations were found in the correlation-based network. The intensity of these annual oscillations is larger for lower correlation thresholds and smaller for higher ones. The annual periodicity in the connectivity density appears reasonably consistent with the seasonal meteo-climatic cycle.
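Connectivity density at a given correlation threshold can be sketched as follows (toy correlation matrix; in the study this quantity is computed repeatedly over time windows to form the analyzed time series):

```python
import numpy as np

def connectivity_density(corr, threshold):
    """Fraction of possible station pairs whose absolute correlation
    exceeds the threshold -- the density of the correlation-based network."""
    n = corr.shape[0]
    iu = np.triu_indices(n, k=1)          # each station pair counted once
    links = np.abs(corr[iu]) > threshold
    return links.mean()

# Toy correlation matrix for 4 wind-monitoring stations.
corr = np.array([[1.0, 0.9, 0.2, 0.4],
                 [0.9, 1.0, 0.3, 0.1],
                 [0.2, 0.3, 1.0, 0.8],
                 [0.4, 0.1, 0.8, 1.0]])
density_low = connectivity_density(corr, 0.15)   # lenient threshold: 5/6 pairs
density_high = connectivity_density(corr, 0.75)  # strict threshold: 2/6 pairs
```

Lower thresholds admit more links, so seasonal swings in pairwise correlations translate into larger oscillations of the density series, consistent with the threshold dependence reported above.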

  8. Wideband Communications Equipment, Ground Radio Communication, Space Comm Systems Equipment. 304X0/X4/X6. Appendix A - Task Analysis

    DTIC Science & Technology

    1990-05-01

    ALARM LAMPS A CHECK TWT POWER SUPPLY VOLTAGE AND CURRENT A ADJUST POWER ALARM THRESHOLD AND TRANSMITTER OUTPUT A CHECK HELIX MONITOR K INTERPRET AN/FRC...POWER SUPPLY A CHECK TRAVELING WAVE TUBE ( TWT ) POWER SUPPLY HELIX CURRENT AND BEAM CURRENT A CHECK TWT RF POWER OUTPUT A CHECK TRANSMITTER POWER...A ADJUST TRANSMITTER LINEARITY A CALIBRATE TRANSMIT DEVIATION AND ADJUST MODULATION AMPLIFIER A ADJUST TWT PERFORMANCE MONITOR A ADJUST TWT OUTPUT

  9. The sensitivity of normal brain and intracranially implanted VX2 tumour to interstitial photodynamic therapy.

    PubMed Central

    Lilge, L.; Olivo, M. C.; Schatz, S. W.; MaGuire, J. A.; Patterson, M. S.; Wilson, B. C.

    1996-01-01

The applicability and limitations of a photodynamic threshold model, used to describe quantitatively the in vivo response of tissues to photodynamic therapy, are currently being investigated in a variety of normal and malignant tumour tissues. The model states that tissue necrosis occurs when the number of photons absorbed by the photosensitiser per unit tissue volume exceeds a threshold. New Zealand White rabbits were sensitised with porphyrin-based photosensitisers. Normal brain or intracranially implanted VX2 tumours were illuminated via an optical fibre placed into the tissue at craniotomy. The light fluence distribution in the tissue was measured by multiple interstitial optical fibre detectors. The tissue concentration of the photosensitiser was determined post mortem by absorption spectroscopy. The derived photodynamic threshold values for normal brain are significantly lower than for VX2 tumour for all photosensitisers examined. Neuronal damage is evident beyond the zone of frank necrosis. For Photofrin the threshold decreases with the time delay between photosensitiser administration and light treatment. No significant difference in threshold is found between Photofrin and haematoporphyrin derivative. The threshold in normal brain (grey matter) is lowest for sensitisation by 5-aminolaevulinic acid (δ-ALA). The results confirm the very high sensitivity of normal brain to porphyrin photodynamic therapy and show the importance of in situ light fluence monitoring during photodynamic irradiation. PMID:8562339

  10. Analysis of start-of-takeoff roll aircraft noise levels at Baltimore/Washington International Airport

    DOT National Transportation Integrated Search

    1990-08-01

This report analyzes 34 days of near-continuous noise monitor data acquired at 851 Main Avenue, Linthicum, Maryland. The site is approximately 4000 feet north-northeast of the threshold of Runway 15R at Baltimore/Washington International Airpor...

  11. USE OF WATERSHED CLASSIFICATION IN MONITORING FRAMEWORKS FOR THE WESTERN LAKE SUPERIOR BASIN

    EPA Science Inventory

    In this case study we predicted stream sensitivity to nonpoint source pollution based on the nonlinear responses of hydrologic regimes and associated loadings of nonpoint source pollutants to catchment properties. We assessed two hydrologically-based thresholds of impairment, on...

  12. n-hexane polyneuropathy in Japan: a review of n-hexane poisoning and its preventive measures.

    PubMed

    Takeuchi, Y

    1993-07-01

n-Hexane is used in industry as a solvent for adhesives, dry cleaning, and vegetable oil extraction. In 1963, the first case of severe polyneuropathy suspected to be caused by n-hexane was referred to us. Case studies, animal experiments, and field surveys on n-hexane poisoning were conducted, and preventive measures such as threshold limit value revision and biological monitoring were also studied. Here I review a brief history of our investigations on n-hexane poisoning and its preventive measures in Japan. n-Hexane can cause overt polyneuropathy in workers exposed to more than 100 ppm as a time-weighted average (TWA) concentration. The present threshold limit value of 40 ppm in Japan is considered low enough to prevent subclinical impairment of the peripheral nerves caused by n-hexane. Urinary 2,5-hexanedione can serve as a good indicator for biological monitoring of n-hexane exposure. About 2.2 mg/liter of 2,5-hexanedione, measured by our improved method, corresponds to exposure to 40 ppm (TWA) of n-hexane.

  13. Ex vivo and in vivo label-free imaging of lymphatic vessels using OCT lymphangiography (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Gong, Peijun; Es'haghian, Shaghayegh; Karnowski, Karol; Rea, Suzanne; Wood, Fiona M.; Yu, Dao-Yi; McLaughlin, Robert A.; Sampson, David D.

    2017-02-01

    We have been developing an automated method to image lymphatic vessels both ex vivo and in vivo with optical coherence tomography (OCT), using their optical transparency. Our method compensates for the OCT signal attenuation for each A-scan in combination with the correction of the confocal function and sensitivity fall-off, enabling reliable thresholding of lymphatic vessels from the OCT scans. Morphological image processing with a segment-joining algorithm is also incorporated into the method to mitigate partial-volume artifacts, which are particularly evident with small lymphatic vessels. Our method is demonstrated for two different clinical application goals: the monitoring of conjunctival lymphatics for surgical guidance and assessment of glaucoma treatment; and the longitudinal monitoring of human burn scars undergoing laser ablation treatment. We present examples of OCT lymphangiography ex vivo on porcine conjunctivas and in vivo on human burn scars, showing the visualization of the lymphatic vessel network and their longitudinal changes due to treatment.

  14. Recurrence of stroke caused by nocturnal hypoxia-induced blood pressure surge in a young adult male with severe obstructive sleep apnea syndrome.

    PubMed

    Yoshida, Tetsuro; Kuwabara, Mitsuo; Hoshide, Satoshi; Kario, Kazuomi

    2016-03-01

    Obstructive sleep apnea syndrome (OSAS) causes resistant hypertension and a hypopnea-related nocturnal blood pressure (BP) surge. This could lead to an increase of not only the nocturnal BP level but also nocturnal BP variability, both of which increase an individual's cardiovascular risk. We recently developed a trigger sleep BP monitoring method that initiates BP measurement when an individual's oxygen desaturation falls below a variable threshold, and we demonstrated that it can detect a BP surge during apnea episodes. We here report the case of a 36-year-old man with severe OSAS who experienced the recurrence of stroke due to nocturnal hypoxia and a nocturnal BP surge measured by this trigger sleep BP monitoring device. A nocturnal BP surge during sleep in OSAS patients could be a strong trigger of cardiovascular events. Copyright © 2016 American Society of Hypertension. Published by Elsevier Inc. All rights reserved.

  15. Condition monitoring of 3G cellular networks through competitive neural models.

    PubMed

    Barreto, Guilherme A; Mota, João C M; Souza, Luis G M; Frota, Rewbenio A; Aguayo, Leonardo

    2005-09-01

    We develop an unsupervised approach to condition monitoring of cellular networks using competitive neural algorithms. Training is carried out with state vectors representing the normal functioning of a simulated CDMA2000 network. Once training is completed, global and local normality profiles (NPs) are built from the distribution of quantization errors of the training state vectors and their components, respectively. The global NP is used to evaluate the overall condition of the cellular system. If abnormal behavior is detected, local NPs are used in a component-wise fashion to find abnormal state variables. Anomaly detection tests are performed via percentile-based confidence intervals computed over the global and local NPs. We compared the performance of four competitive algorithms [winner-take-all (WTA), frequency-sensitive competitive learning (FSCL), self-organizing map (SOM), and neural-gas algorithm (NGA)] and the results suggest that the joint use of global and local NPs is more efficient and more robust than current single-threshold methods.
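    The global normality-profile (NP) idea above can be sketched with a tiny winner-take-all quantizer. This is an illustrative stand-in only: the state vectors, learning rate, codebook size, and 99th-percentile bound are all assumptions, not the authors' CDMA2000 setup.

```python
import math, random

random.seed(0)
# Nominal "state vectors" (synthetic stand-in for healthy-network KPIs).
train = [[random.gauss(0, 1) for _ in range(4)] for _ in range(500)]

def dist(a, b):
    return math.dist(a, b)

# Winner-take-all learning: move the nearest prototype toward each sample.
protos = [list(v) for v in train[:8]]
for epoch in range(5):
    for v in train:
        w = min(range(len(protos)), key=lambda j: dist(v, protos[j]))
        protos[w] = [p + 0.1 * (x - p) for p, x in zip(protos[w], v)]

# Global normality profile: percentile bound on nominal quantization errors.
qe = sorted(min(dist(v, p) for p in protos) for v in train)
threshold = qe[int(0.99 * len(qe))]

def is_abnormal(x):
    """Flag a state vector whose quantization error exceeds the global NP bound."""
    return min(dist(x, p) for p in protos) > threshold

print(is_abnormal(protos[0]), is_abnormal([10.0] * 4))  # → False True
```

    The component-wise local NPs in the paper would repeat the same percentile construction per state variable, localizing which variable drives an anomaly.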

  16. A Muscle Fibre Conduction Velocity Tracking ASIC for Local Fatigue Monitoring.

    PubMed

    Koutsos, Ermis; Cretu, Vlad; Georgiou, Pantelis

    2016-12-01

    Electromyography analysis can provide information about a muscle's fatigue state by estimating Muscle Fibre Conduction Velocity (MFCV), a measure of the travelling speed of Motor Unit Action Potentials (MUAPs) in muscle tissue. MFCV better represents the physical manifestations of muscle fatigue than the progressive compression of the myoelectric Power Spectral Density, and hence is more suitable for a muscle fatigue tracking system. This paper presents a novel algorithm for the estimation of MFCV using single-threshold bit-stream conversion and a dedicated application-specific integrated circuit (ASIC) for its implementation, suitable for a compact, wearable and easy-to-use muscle fatigue monitor. The presented ASIC is implemented in a commercially available AMS 0.35 [Formula: see text] CMOS technology and utilizes a bit-stream cross-correlator that estimates the conduction velocity of the myoelectric signal in real time. A test group of 20 subjects was used to evaluate the performance of the developed ASIC, which achieved good accuracy with an error of only 3.2% compared with a MATLAB reference.
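    The single-threshold bit-stream principle can be illustrated in software: binarize two electrode channels with one threshold, cross-correlate the bit-streams to find the propagation delay, and convert delay to velocity. The sampling rate, electrode spacing, delay, and synthetic signal below are assumptions for illustration; the ASIC's actual correlator is not reproduced here.

```python
import math

fs = 10_000.0     # sampling rate in Hz (assumed)
d = 0.01          # inter-electrode distance in m (assumed)
true_delay = 25   # samples; implies MFCV = d / (25 / fs) = 4 m/s

# Synthetic EMG-like signal and its delayed copy at the second electrode.
n = 2000
sig = [math.sin(0.07 * i) * math.sin(0.011 * i) for i in range(n)]
ch1 = sig
ch2 = [0.0] * true_delay + sig[:n - true_delay]

# Single-threshold bit-stream conversion (1-bit quantization).
thr = 0.0
b1 = [1 if x > thr else 0 for x in ch1]
b2 = [1 if x > thr else 0 for x in ch2]

def xcorr_bits(a, b, lag):
    """Bit-stream cross-correlation: count agreements at a candidate lag."""
    return sum(1 for i in range(len(a) - lag) if a[i] == b[i + lag])

best_lag = max(range(1, 100), key=lambda L: xcorr_bits(b1, b2, L))
mfcv = d / (best_lag / fs)
print(best_lag, mfcv)  # best lag recovers the 25-sample delay (MFCV = 4 m/s)
```

    Comparing 1-bit streams replaces multiplications with equality checks, which is what makes the approach attractive for a low-power ASIC.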

  17. Spectral information enhancement using wavelet-based iterative filtering for in vivo gamma spectrometry.

    PubMed

    Paul, Sabyasachi; Sarkar, P K

    2013-04-01

    Use of wavelet transformation in stationary signal processing has been demonstrated for denoising the measured spectra and characterisation of radionuclides in the in vivo monitoring analysis, where difficulties arise due to very low activity level to be estimated in biological systems. The large statistical fluctuations often make the identification of characteristic gammas from radionuclides highly uncertain, particularly when interferences from progenies are also present. A new wavelet-based noise filtering methodology has been developed for better detection of gamma peaks in noisy data. This sequential, iterative filtering method uses the wavelet multi-resolution approach for noise rejection and an inverse transform after soft 'thresholding' over the generated coefficients. Analyses of in vivo monitoring data of (235)U and (238)U were carried out using this method without disturbing the peak position and amplitude while achieving a 3-fold improvement in the signal-to-noise ratio, compared with the original measured spectrum. When compared with other data-filtering techniques, the wavelet-based method shows the best results.
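    The full iterative multi-resolution method is more elaborate, but its core step (transform, soft-threshold the noise-dominated detail coefficients, inverse transform) can be sketched with a one-level Haar transform. The peak shape, noise level, and threshold value below are invented for illustration.

```python
import math, random

random.seed(1)

def haar_fwd(x):
    """One-level Haar transform: (approximation, detail) coefficients."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_inv(a, d):
    x = []
    for ai, di in zip(a, d):
        x += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return x

def soft(coeffs, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return [math.copysign(max(abs(v) - t, 0.0), v) for v in coeffs]

# Noisy "spectrum": a Gaussian peak plus random counting noise.
n = 256
clean = [math.exp(-((i - 128) / 6.0) ** 2) for i in range(n)]
noisy = [c + random.gauss(0, 0.1) for c in clean]

# Decompose, soft-threshold the detail band, reconstruct.
approx, detail = haar_fwd(noisy)
den = haar_inv(approx, soft(detail, 0.15))

mse = lambda u, v: sum((ui - vi) ** 2 for ui, vi in zip(u, v)) / len(u)
print(mse(den, clean) < mse(noisy, clean))  # denoised is closer to the truth
```

    The method in the abstract iterates this over multiple resolution levels, which preserves peak position and amplitude while suppressing statistical fluctuations.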

  18. Inductive monitoring system constructed from nominal system data and its use in real-time system monitoring

    NASA Technical Reports Server (NTRS)

    Iverson, David L. (Inventor)

    2008-01-01

    The present invention relates to an Inductive Monitoring System (IMS), its software implementations, hardware embodiments and applications. Training data is received, typically nominal system data acquired from sensors in normally operating systems or from detailed system simulations. The training data is formed into vectors that are used to generate a knowledge database having clusters of nominal operating regions therein. IMS monitors a system's performance or health by comparing cluster parameters in the knowledge database with incoming sensor data from the monitored system, also formed into vectors. Nominal performance is concluded when a monitored-system vector is determined to lie within a nominal operating region cluster or sufficiently close to such a cluster, as determined by a threshold value and a distance metric. Some embodiments of IMS include cluster indexing and retrieval methods that increase the execution speed of IMS.
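    One common way to realize this scheme (a sketch, not the patented implementation) stores each cluster as per-dimension low/high bounds grown from nominal training vectors, then monitors by computing the distance from a new vector to the nearest cluster. The merge tolerance `eps` and monitoring tolerance `tol` are assumed parameters.

```python
import math

def build_clusters(train, eps):
    """Grow hyper-box clusters of nominal operating regions from training vectors."""
    clusters = []
    for v in train:
        for c in clusters:
            if all(lo - eps <= x <= hi + eps for x, (lo, hi) in zip(v, c)):
                for i, x in enumerate(v):          # absorb v: widen the box
                    lo, hi = c[i]
                    c[i] = (min(lo, x), max(hi, x))
                break
        else:
            clusters.append([(x, x) for x in v])   # start a new cluster
    return clusters

def distance_to_cluster(v, c):
    """Euclidean distance from a vector to a hyper-box cluster (0 if inside)."""
    return math.sqrt(sum(max(lo - x, 0.0, x - hi) ** 2
                         for x, (lo, hi) in zip(v, c)))

def monitor(v, clusters, tol):
    """Nominal if v lies within (or within tol of) some nominal-region cluster."""
    return min(distance_to_cluster(v, c) for c in clusters) <= tol

train = [(1.0, 10.0), (1.1, 10.2), (0.9, 9.8), (5.0, 2.0), (5.2, 2.1)]
kb = build_clusters(train, eps=0.5)
print(monitor((1.05, 10.1), kb, tol=0.2))  # inside a nominal cluster -> True
print(monitor((3.0, 6.0), kb, tol=0.2))    # far from any cluster -> False
```

    The cluster indexing mentioned in the abstract would speed up the `min` over clusters for large knowledge databases.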

  19. [The application of cortical and subcortical stimulation threshold in identifying the motor pathway and guiding the resection of gliomas in the functional areas].

    PubMed

    Ren, X H; Yang, X C; Huang, W; Yang, K Y; Liu, L; Qiao, H; Guo, L J; Cui, Y; Lin, S

    2018-03-06

    Objective: This study aimed to analyze the application of cortical and subcortical stimulation thresholds in identifying the motor pathway and guiding the resection of gliomas in the functional areas, and to establish the minimal safe threshold by the ROC method. Methods: Fifty-seven patients with gliomas in the functional areas were enrolled in the study at Beijing Tiantan Hospital from 2015 to 2017. Anesthesia was maintained intravenously with propofol 10% and remifentanil. Throughout the resection process, the cortical or subcortical stimulation threshold was determined along the tumor border using monopolar or bipolar electrodes. The motor pathway was identified and protected from resection according to the stimulation threshold and transcranial MEPs. The minimal threshold in each case was recorded. Results: Total resection was achieved in 32 cases (56.1%), sub-total resection in 22 cases (38.6%), and partial resection in 3 cases (5.3%). Pre-operative motor disability was found in 9 cases. Compared with pre-operative motor scores, 19 patients exhibited impaired motor function on day 1 after surgery; 5 had quick recovery by day 7 after surgery, 7 had late recovery by 3 months after surgery, and 7 still had impaired motor function at 3 months. The frequency of intraoperative seizure was 1.8% (1/57). No other side effects were found during intraoperative electrophysiological monitoring. The ROC curve revealed that the minimal safe monopolar subcortical threshold was 5.70 mA for strength deterioration on day 1 and day 7 after surgery. Univariate analysis revealed that decreased transcranial MEPs and a minimal subcortical threshold ≤5.7 mA were correlated with postoperative strength deterioration. Conclusions: Cortical and subcortical stimulation thresholds have merit in identifying the motor pathway and guiding the resection of tumors within the functional areas. 5.7 mA can be used as the minimal safe threshold to protect the motor pathway from injury.

  20. β-Adrenergic stimulation increases the intra-sarcoplasmic reticulum Ca2+ threshold for Ca2+ wave generation

    PubMed Central

    Domeier, Timothy L; Maxwell, Joshua T; Blatter, Lothar A

    2012-01-01

    β-Adrenergic signalling induces positive inotropic effects on the heart that are associated with pro-arrhythmic spontaneous Ca2+ waves. A threshold level of sarcoplasmic reticulum (SR) Ca2+ ([Ca2+]SR) is necessary to trigger Ca2+ waves, and whether the increased incidence of Ca2+ waves during β-adrenergic stimulation is due to an alteration in this threshold remains controversial. Using the low-affinity Ca2+ indicator fluo-5N entrapped within the SR of rabbit ventricular myocytes, we addressed this controversy by directly monitoring [Ca2+]SR and Ca2+ waves during β-adrenergic stimulation. Electrical pacing in elevated extracellular Ca2+ ([Ca2+]o= 7 mm) was used to increase [Ca2+]SR to the threshold where Ca2+ waves were consistently observed. The β-adrenergic agonist isoproterenol (ISO; 1 μm) increased [Ca2+]SR well above the control threshold and consistently triggered Ca2+ waves. However, when [Ca2+]SR was subsequently lowered in the presence of ISO (by lowering [Ca2+]o to 1 mm and partially inhibiting sarcoplasmic/endoplasmic reticulum calcium ATPase with cyclopiazonic acid or thapsigargin), Ca2+ waves ceased to occur at a [Ca2+]SR that was higher than the control threshold. Furthermore, for a set [Ca2+]SR level the refractoriness of wave occurrence (Ca2+ wave latency) was prolonged during β-adrenergic stimulation, and was highly dependent on the extent to which [Ca2+]SR exceeded the wave threshold. These data show that acute β-adrenergic stimulation increases the [Ca2+]SR threshold for Ca2+ waves, and therefore the primary cause of Ca2+ waves is the robust increase in [Ca2+]SR above this higher threshold level. Elevation of the [Ca2+]SR wave threshold and prolongation of wave latency represent potentially protective mechanisms against pro-arrhythmogenic Ca2+ release during β-adrenergic stimulation. PMID:22988136

  1. Is undertransfusion a problem in modern clinical practice?

    PubMed

    Hibbs, Stephen; Miles, David; Staves, Julie; Murphy, Michael F

    2015-04-01

    Significant progress has been made in reducing inappropriate transfusion of blood products. However, there is also a need to monitor for their underutilization in patients who would benefit from transfusion. This study aimed to develop a method to monitor for undertransfusion and conduct a preliminary examination of whether it is a problem in modern clinical practice. All patients with a hemoglobin (Hb) concentration below 6 g/dL or platelet (PLT) count of fewer than 10 × 10(9) /L were identified during a 1-month period in an academic medical center in the United Kingdom. Patients who were transfused within 72 hours of the low reading were excluded from further analysis. For all other patients, records were examined against predefined criteria to ascertain whether the reason for nonadministration of transfusion was justified. During the study period there were 63 eligible Hb readings and 130 eligible PLT counts in 93 patients. Of these, 36 patients were not transfused within 72 hours of the low reading. The majority of nonadministration (n = 28) was justified by either an additional Hb or an additional PLT count on repeat sampling being above the transfusion threshold or the transfusion being medically inappropriate. No documentation was found to indicate that any cases of nonadministration of blood were unjustified. This study did not find that patients with low Hb readings or PLT counts were inappropriately undertransfused. However, systems similar to those described in this study should be developed to monitor for inappropriate undertransfusion as well as continuing efforts to monitor for and reduce inappropriate overtransfusion. © 2014 AABB.

  2. A fully integrated mixed-signal neural processor for implantable multichannel cortical recording.

    PubMed

    Sodagar, Amir M; Wise, Kensall D; Najafi, Khalil

    2007-06-01

    A 64-channel neural processor has been developed for use in an implantable neural recording microsystem. In the Scan Mode, the processor is capable of detecting neural spikes by programmable positive, negative, or window thresholding. Spikes are tagged with their associated channel addresses and formed into 18-bit data words that are sent serially to the external host. In the Monitor Mode, two channels can be selected and viewed at high resolution for studies where the entire signal is of interest. The processor runs from a 3-V supply and a 2-MHz clock, with a channel scan rate of 64 kS/s and an output bit rate of 2 Mbps.
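    The abstract specifies programmable positive, negative, or window thresholding and 18-bit words carrying channel addresses. A plausible packing, with a 6-bit address for the 64 channels and a 12-bit amplitude payload, can be sketched as follows; the exact bit layout is an assumption, not taken from the paper.

```python
def detect(sample, lo, hi):
    """Window thresholding: flag a spike when the sample falls in [lo, hi]."""
    return lo <= sample <= hi

def make_word(channel, amplitude):
    """Pack a 6-bit channel address and a 12-bit amplitude into an 18-bit word."""
    assert 0 <= channel < 64 and 0 <= amplitude < 4096
    return (channel << 12) | amplitude

word = make_word(channel=5, amplitude=0x2A7)
print(f"{word:018b}")  # 6 address bits (000101) followed by 12 data bits
assert word >> 12 == 5 and word & 0xFFF == 0x2A7
```

    Positive-only or negative-only thresholding follows from the same comparator by setting one window edge to the rail.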

  3. Extinction debt: a challenge for biodiversity conservation.

    PubMed

    Kuussaari, Mikko; Bommarco, Riccardo; Heikkinen, Risto K; Helm, Aveliina; Krauss, Jochen; Lindborg, Regina; Ockinger, Erik; Pärtel, Meelis; Pino, Joan; Rodà, Ferran; Stefanescu, Constantí; Teder, Tiit; Zobel, Martin; Steffan-Dewenter, Ingolf

    2009-10-01

    Local extinction of species can occur with a substantial delay following habitat loss or degradation. Accumulating evidence suggests that such extinction debts pose a significant but often unrecognized challenge for biodiversity conservation across a wide range of taxa and ecosystems. Species with long generation times and populations near their extinction threshold are most likely to have an extinction debt. However, as long as a species that is predicted to become extinct still persists, there is time for conservation measures such as habitat restoration and landscape management. Standardized long-term monitoring, more high-quality empirical studies on different taxa and ecosystems and further development of analytical methods will help to better quantify extinction debt and protect biodiversity.

  4. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    NASA Astrophysics Data System (ADS)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrivals times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: (1) it minimizes the use of individual seismic phase detections, since in traditional techniques errors in signal detection, timing, feature measurement, and initial phase identification compound and propagate into errors in event formation; (2) it has a formalized framework that utilizes information from non-detecting stations; (3) it has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest; (4) it is entirely model-based, i.e., it does not rely on a priori ground-truth events, which is particularly important for nuclear monitoring; and (5) it does not rely on individualized signal detection thresholds: it is the network solution that matters.
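    The Bayesian network-combination step can be sketched as a naive log-odds fusion of per-station conditional probabilities for one candidate event hypothesis. The probabilities, prior, and station-independence assumption below are illustrative only, not ProbDet's actual formulation.

```python
import math

# Per-station conditional probabilities P(data_i | event) and P(data_i | noise)
# for one candidate origin time/location (made-up numbers). Note that weakly
# informative stations (non-detections) still contribute to the combination.
p_event = [0.90, 0.75, 0.60, 0.55]
p_noise = [0.10, 0.30, 0.50, 0.45]
prior = 0.01   # prior probability that an event is present (assumed)

# Naive-Bayes fusion in log space for numerical stability.
log_odds = math.log(prior / (1 - prior))
for pe, pn in zip(p_event, p_noise):
    log_odds += math.log(pe / pn)
posterior = 1 / (1 + math.exp(-log_odds))
print(round(posterior, 3))  # → 0.25
```

    Working in log-odds avoids underflow when hundreds of stations are combined, and makes each station's evidence an additive term.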

  5. An Approach to Industrial Stormwater Benchmarks: Establishing and Using Site-Specific Threshold Criteria at Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, C G; Mathews, S

    2006-09-07

    Current regulatory schemes use generic or industrial-sector-specific benchmarks to evaluate the quality of industrial stormwater discharges. While benchmarks can be a useful tool for facility stormwater managers in evaluating the quality of stormwater runoff, they typically do not take into account site-specific conditions such as soil chemistry, atmospheric deposition, seasonal changes in water source, and upstream land use. Failing to account for these factors may lead to unnecessary costs to trace a source of natural variation, or to missing a significant local water quality problem. Site-specific water quality thresholds, established through statistical evaluation of historic data, take these factors into account; they are a better tool for the direct evaluation of runoff quality and a more cost-effective trigger for investigating anomalous results. Lawrence Livermore National Laboratory (LLNL), a federal facility, established stormwater monitoring programs to comply with the requirements of the industrial stormwater permit and Department of Energy orders, which require evaluation of the impact of effluent discharges on the environment. LLNL recognized the need for a tool to evaluate and manage stormwater quality that would allow analysts to identify trends in stormwater quality and recognize anomalous results so that trace-back and corrective actions could be initiated. LLNL created the site-specific water quality threshold tool to better understand the nature of the stormwater influent and effluent, to establish a technical basis for determining when facility operations might be impacting the quality of stormwater discharges, and to provide "action levels" that initiate follow-up to analytical results. The threshold criteria were based on a statistical analysis of the historic stormwater monitoring data and a review of relevant water quality objectives.

  6. Is it possible to shorten ambulatory blood pressure monitoring?

    PubMed

    Wolak, Talya; Wilk, Lior; Paran, Esther; Wolak, Arik; Gutmacher, Bella; Shleyfer, Elena; Friger, Michael

    2013-08-01

    The aim of this investigation was to find a time segment in which average blood pressure (BP) has the best correlation with 24-hour BP control. A total of 240 patients with full ambulatory BP monitoring (ABPM) were included; 120 had controlled BP (systolic BP [SBP] ≤135 mm Hg and diastolic BP [DBP] ≤85 mm Hg) and 120 had uncontrolled BP (SBP >135 mm Hg and/or DBP >85 mm Hg). Each ABPM was divided into 6- and 8-hour segments. The correlation between mean BP for each time segment and 24-hour BP control was evaluated using receiver operating characteristic (ROC) curve analysis, with Youden's index used to select the threshold with the best sensitivity and specificity. The mean BP in the following segments showed the highest area under the curve (AUC) compared with average controlled 24-hour BP: SBP 2 am to 8 am (AUC, 0.918; threshold value of 133.5 mm Hg; sensitivity, 0.752; specificity, 0.904); SBP 2 pm to 10 pm (AUC, 0.911; threshold value of 138.5 mm Hg; sensitivity, 0.803; specificity, 0.878); and SBP 6 am to 2 pm (AUC, 0.903; threshold value of 140.5 mm Hg; sensitivity, 0.778; specificity, 0.888). The time segment 2 pm to 10 pm was shown to have good correlation with 24-hour BP control (AUC >0.9; sensitivity and specificity >80%). This time segment might replace full ABPM as a screening measure for BP control or as abbreviated ABPM for patients with difficulty in performing full ABPM. © 2013 Wiley Periodicals, Inc.
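    The threshold-selection criterion named in the abstract, Youden's index (J = sensitivity + specificity - 1) maximized over an ROC sweep, can be sketched as follows. The SBP values are invented, perfectly separable toy data, not the study's measurements.

```python
controlled   = [118, 122, 125, 128, 130, 131, 133]   # label 0 (controlled BP)
uncontrolled = [134, 136, 138, 139, 141, 145, 150]   # label 1 (uncontrolled BP)

def youden_threshold(neg, pos):
    """Sweep candidate thresholds; return the one maximizing Youden's J."""
    best_t, best_j = None, -1.0
    for t in sorted(set(neg + pos)):
        sens = sum(x > t for x in pos) / len(pos)    # true positive rate
        spec = sum(x <= t for x in neg) / len(neg)   # true negative rate
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

t, j = youden_threshold(controlled, uncontrolled)
print(t, j)  # → 133 1.0 (toy data is perfectly separable)
```

    On real, overlapping distributions J is below 1, and the chosen threshold trades sensitivity against specificity exactly as the reported 133.5/138.5/140.5 mm Hg cut-offs do.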

  7. "What Is a Step?" Differences in How a Step Is Detected among Three Popular Activity Monitors That Have Impacted Physical Activity Research.

    PubMed

    John, Dinesh; Morton, Alvin; Arguello, Diego; Lyden, Kate; Bassett, David

    2018-04-15

    (1) Background: This study compared manually-counted treadmill walking steps from the hip-worn DigiwalkerSW200 and OmronHJ720ITC, and hip and wrist-worn ActiGraph GT3X+ and GT9X; determined brand-specific acceleration amplitude (g) and/or frequency (Hz) step-detection thresholds; and quantified key features of the acceleration signal during walking. (2) Methods: Twenty participants (Age: 26.7 ± 4.9 years) performed treadmill walking between 0.89-to-1.79 m/s (2-4 mph) while wearing a hip-worn DigiwalkerSW200, OmronHJ720ITC, GT3X+ and GT9X, and a wrist-worn GT3X+ and GT9X. A DigiwalkerSW200 and OmronHJ720ITC underwent shaker testing to determine device-specific frequency and amplitude step-detection thresholds. Simulated signal testing was used to determine thresholds for the ActiGraph step algorithm. Steps during human testing were compared using bias and confidence intervals. (3) Results: The OmronHJ720ITC was most accurate during treadmill walking. Hip and wrist-worn ActiGraph outputs were significantly different from the criterion. The DigiwalkerSW200 records steps for movements with a total acceleration of ≥1.21 g. The OmronHJ720ITC detects a step when movement has an acceleration ≥0.10 g with a dominant frequency of ≥1 Hz. The step-threshold for the ActiLife algorithm is variable based on signal frequency. Acceleration signals at the hip and wrist have distinctive patterns during treadmill walking. (4) Conclusions: Three common research-grade physical activity monitors employ different step-detection strategies, which causes variability in step output.

  8. “What Is a Step?” Differences in How a Step Is Detected among Three Popular Activity Monitors That Have Impacted Physical Activity Research

    PubMed Central

    John, Dinesh; Arguello, Diego; Lyden, Kate; Bassett, David

    2018-01-01

    (1) Background: This study compared manually-counted treadmill walking steps from the hip-worn DigiwalkerSW200 and OmronHJ720ITC, and hip and wrist-worn ActiGraph GT3X+ and GT9X; determined brand-specific acceleration amplitude (g) and/or frequency (Hz) step-detection thresholds; and quantified key features of the acceleration signal during walking. (2) Methods: Twenty participants (Age: 26.7 ± 4.9 years) performed treadmill walking between 0.89-to-1.79 m/s (2–4 mph) while wearing a hip-worn DigiwalkerSW200, OmronHJ720ITC, GT3X+ and GT9X, and a wrist-worn GT3X+ and GT9X. A DigiwalkerSW200 and OmronHJ720ITC underwent shaker testing to determine device-specific frequency and amplitude step-detection thresholds. Simulated signal testing was used to determine thresholds for the ActiGraph step algorithm. Steps during human testing were compared using bias and confidence intervals. (3) Results: The OmronHJ720ITC was most accurate during treadmill walking. Hip and wrist-worn ActiGraph outputs were significantly different from the criterion. The DigiwalkerSW200 records steps for movements with a total acceleration of ≥1.21 g. The OmronHJ720ITC detects a step when movement has an acceleration ≥0.10 g with a dominant frequency of ≥1 Hz. The step-threshold for the ActiLife algorithm is variable based on signal frequency. Acceleration signals at the hip and wrist have distinctive patterns during treadmill walking. (4) Conclusions: Three common research-grade physical activity monitors employ different step-detection strategies, which causes variability in step output. PMID:29662048
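    The amplitude-plus-frequency gate reported for the OmronHJ720ITC (acceleration ≥0.10 g with a dominant frequency ≥1 Hz) can be sketched as below. The sampling rate, window length, synthetic signals, and the mean-crossing frequency estimate are all assumptions; the device's internal algorithm is not public.

```python
import math

fs = 50          # samples per second (assumed wearable sampling rate)
amp_thr = 0.10   # g, amplitude threshold reported in the abstract
freq_thr = 1.0   # Hz, dominant-frequency threshold reported in the abstract

def walking_detected(window):
    """Amplitude + cadence gate on a 1-s window of acceleration samples (g)."""
    peak = max(abs(x) for x in window)
    # Estimate dominant frequency from mean-crossings (a cheap spectral proxy).
    m = sum(window) / len(window)
    crossings = sum(1 for a, b in zip(window, window[1:])
                    if (a - m) * (b - m) < 0)
    freq = crossings / 2 * fs / len(window)
    return peak >= amp_thr and freq >= freq_thr

walk = [0.3 * math.sin(2 * math.pi * 2.0 * i / fs) for i in range(fs)]   # 2 Hz, 0.3 g
idle = [0.02 * math.sin(2 * math.pi * 0.5 * i / fs) for i in range(fs)]  # sub-threshold
print(walking_detected(walk), walking_detected(idle))
```

    A purely amplitude-based device (the abstract's 1.21 g total-acceleration pedometer) would keep only the `peak` test, which is one concrete way such brand-specific strategies diverge.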

  9. Noninvasive determination of anaerobic threshold by monitoring the %SpO2 changes and respiratory gas exchange.

    PubMed

    Nikooie, Roohollah; Gharakhanlo, Reza; Rajabi, Hamid; Bahraminegad, Morteza; Ghafari, Ali

    2009-10-01

    The purpose of this study was to determine the validity of noninvasive anaerobic threshold (AT) estimation using %SpO2 (arterial oxyhemoglobin saturation) changes and respiratory gas exchange. Fifteen active, healthy males performed 2 graded exercise tests on a motor-driven treadmill in 2 separate sessions. Respiratory gas exchange, heart rate (HR), lactate concentration, and %SpO2 were measured continuously throughout the test. Anaerobic threshold was determined based on blood lactate concentration (lactate-AT), %SpO2 changes (%SpO2-AT), respiratory exchange ratio (RER-AT), the V-slope method (V-slope-AT), and the ventilatory equivalent for O2 (EqO2-AT). Blood lactate measurement was considered the gold-standard assessment of AT and was used to confirm the validity of the other, noninvasive methods. The mean VO2 corresponding to lactate-AT, %SpO2-AT, RER-AT, V-slope-AT, and EqO2-AT was 2176.6 +/- 206.4, 1909.5 +/- 221.4, 2141.2 +/- 245.6, 1933.7 +/- 216.4, and 1975 +/- 232.4, respectively. Intraclass correlation coefficient (ICC) analysis indicated a significant correlation between the 4 noninvasive methods and the criterion method. Bland-Altman plots showed good agreement between the VO2 corresponding to AT in each method and lactate-AT (95% confidence interval, CI). Our results indicate that the noninvasive and easy procedure of monitoring %SpO2 is a valid method for estimating AT. In the present study, the respiratory exchange ratio (RER) method appeared to be the best respiratory index for noninvasive estimation of the anaerobic threshold, and the heart rate corresponding to AT predicted by this method can be used by coaches and athletes to define training zones.

  10. Cardiovascular testing in Fabry disease: exercise capacity reduction, chronotropic incompetence and improved anaerobic threshold after enzyme replacement.

    PubMed

    Lobo, T; Morgan, J; Bjorksten, A; Nicholls, K; Grigg, L; Centra, E; Becker, G

    2008-06-01

    The aim of this study was to document exercise capacity and serial electrocardiogram and echocardiograph findings in a cohort of Australian patients with Fabry disease, in relation to their history of enzyme replacement therapy (ERT). Fabry disease has multifactorial effects on the cardiovascular system. Most previous studies have focused on electrocardiographic and echocardiographic parameters. Exercise capacity can be used as an integrated measure of cardiovascular function and allows the effects of treatment to be monitored. A total of 38 patients (30 men and 8 women) with Fabry disease were monitored by 12-lead electrocardiograms every 6-12 months and by annual standardized-protocol echocardiograms. Bicycle stress tests with VO(2) max measurement and once-only six-minute walk tests were also carried out in subsets of patients whose general health status allowed testing. Seventy per cent of patients met electrocardiogram criteria for left ventricular hypertrophy. Left ventricular hypertrophy on echocardiography was present in 64% of patients (80% of men). Exercise capacity was reduced in patients with Fabry disease compared with that predicted from normative population data. Mild improvement in anaerobic threshold was seen in the first year of ERT (14.1 +/- 3.0 to 15.8 +/- 3.0, P = 0.02), but no consistent further increase was seen beyond the first year. Most patients had resting bradycardia, with impaired ability to increase heart rate during exercise. Serial testing on ERT showed an improvement in anaerobic threshold but no significant change in VO(2) max. Male patients with Fabry disease were unable to attain predicted maximal heart rate on exercise or to achieve normal exercise levels. ERT was associated with a small improvement in anaerobic threshold over the first year.

  11. CONDITIONAL PROBABILITY ANALYSIS APPROACH FOR IDENTIFYING BIOLOGICAL THRESHOLD OF IMPACT FOR SEDIMENTATION: APPLICATION TO FRESHWATER STREAMS IN OREGON COAST RANGE ECOREGION

    EPA Science Inventory

    A conditional probability analysis (CPA) approach has been developed for identifying biological thresholds of impact for use in the development of geographic-specific water quality criteria for protection of aquatic life. This approach expresses the threshold as the likelihood ...

  12. From drought indicators to impacts: developing improved tools for monitoring and early warning with decision-makers in mind

    NASA Astrophysics Data System (ADS)

    Hannaford, Jamie; Barker, Lucy; Svensson, Cecilia; Tanguy, Maliko; Laize, Cedric; Bachmair, Sophie; Tijdeman, Erik; Stahl, Kerstin; Collins, Kevin

    2016-04-01

    Droughts pose a threat to water security in most climate zones and water use sectors. With projections suggesting that droughts will intensify in many parts of the globe, the magnitude of this threat is likely to increase in the future and thus vulnerability of society to drought must be reduced through better preparedness. While the occurrence of drought cannot be prevented in the short term, a number of actions can be taken to reduce vulnerability. Monitoring and early warning (M&EW) systems are often central to drought management strategies aimed at reducing vulnerability, but they are generally less developed than for other hazards. There are many drought indicators available for characterising the hazard but they have only rarely been tested for their ability to capture observed impacts on society or the environment. There is a pressing need to better integrate the physical and social vulnerability elements of drought to improve M&EW systems. The Belmont Forum project DrIVER (Drought Impacts: Vulnerability thresholds in monitoring and Early-warning Research, 2014 - 2016) aims to fill this gap by strengthening the link between natural (hydrometeorological) drought characterisation and ecological and socio-economic impacts on three continents (North America, Europe and Australia). The UK is a key DrIVER case study area. The UK has a well-developed hydrological monitoring programme, but there is currently no national drought focused M&EW system; different actors (water companies, regulators, farmers or industry) typically conduct M&EW for their own particular purposes. In this paper we present the early outcomes of an extensive programme of research designed to provide a scientific foundation for improved M&EW systems for the UK in future. The UK is used here as an example, and the findings could prove useful for other localities seeking to develop M&EW systems. 
Firstly, we present the results of stakeholder engagement exercises designed to ascertain current use of M&EW and future aspirations. Different stakeholders clearly have different goals for M&EW, but there are a number of common themes, including a desire to better understand the links between the outputs of large-scale M&EW systems (rainfall, river flow, etc), localised triggers used by decision-makers during drought episodes, and actual impacts of drought. Secondly, we present analyses designed to test the utility of a wide range of drought indicators for their use in UK applications. We demonstrate the suitability of standardised indicators (like the SPI) for use in the UK, addressing the suitability of statistical distributions and using these indicators for drought severity quantification and for understanding propagation from meteorological to hydrological drought; all of which are currently poorly understood aspects that are vital for future monitoring. We then address the extent to which these indicators can be used to predict drought impacts, focusing on several sectors (water supply, agriculture and ecosystems). These analyses test which indicators perform best at predicting drought impacts, and seek to identify indicator thresholds that trigger impact occurrence. Unsurprisingly, we found that no single indicator best predicts impacts, and results are domain, sector and season specific. However, we reveal important linkages between indicators and impacts that could enhance the design and delivery of monitoring and forecasting information and its uptake by decision-makers concerned with drought.

  13. A Bioinformatic Pipeline for Monitoring of the Mutational Stability of Viral Drug Targets with Deep-Sequencing Technology.

    PubMed

    Kravatsky, Yuri; Chechetkin, Vladimir; Fedoseeva, Daria; Gorbacheva, Maria; Kravatskaya, Galina; Kretova, Olga; Tchurikov, Nickolai

    2017-11-23

    The efficient development of antiviral drugs, including efficient antiviral small interfering RNAs (siRNAs), requires continuous monitoring of the strict correspondence between a drug and the related highly variable viral DNA/RNA target(s). Deep sequencing is able to provide an assessment of both the general target conservation and the frequency of particular mutations in the different target sites. The aim of this study was to develop a reliable bioinformatic pipeline for the analysis of millions of short, deep sequencing reads corresponding to selected highly variable viral sequences that are drug target(s). The suggested bioinformatic pipeline combines the available programs and the ad hoc scripts based on an original algorithm of the search for the conserved targets in the deep sequencing data. We also present the statistical criteria for the threshold of reliable mutation detection and for the assessment of variations between corresponding data sets. These criteria are robust against the possible sequencing errors in the reads. As an example, the bioinformatic pipeline is applied to the study of the conservation of RNA interference (RNAi) targets in human immunodeficiency virus 1 (HIV-1) subtype A. The developed pipeline is freely available to download at the website http://virmut.eimb.ru/. Brief comments and comparisons between VirMut and other pipelines are also presented.
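The statistical criteria themselves are not reproduced in the abstract; the sketch below illustrates one common way to set a threshold for reliable mutation detection, a binomial tail test against the per-base sequencing error rate (the function name, defaults, and numbers are assumptions for illustration, not the paper's exact criterion).

```python
from math import comb

def min_variant_count(depth, err_rate, alpha=1e-3):
    """Smallest read count k such that observing >= k non-reference reads
    out of `depth` is unlikely (P < alpha) to arise from sequencing error
    alone, modeled as Binomial(depth, err_rate)."""
    tail = 1.0  # P(X >= 0)
    for k in range(depth + 1):
        if tail < alpha:
            return k
        # subtract P(X == k) to obtain P(X >= k + 1)
        tail -= comb(depth, k) * err_rate**k * (1 - err_rate)**(depth - k)
    return depth + 1

# At 1000x depth with a 1% per-base error rate, roughly 20 supporting
# reads are needed before a mutation call is considered reliable:
print(min_variant_count(1000, 0.01))
```

Calls below this count are indistinguishable from sequencing noise at the chosen significance level, which is what makes the criterion robust against read errors.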

  14. Ground-Water Quality Data in the San Fernando-San Gabriel Study Unit, 2005 - Results from the California GAMA Program

    USGS Publications Warehouse

    Land, Michael; Belitz, Kenneth

    2008-01-01

    Ground-water quality in the approximately 460 square mile San Fernando-San Gabriel study unit (SFSG) was investigated between May and July 2005 as part of the Priority Basin Assessment Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Assessment Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The San Fernando-San Gabriel study was designed to provide a spatially unbiased assessment of raw ground-water quality within SFSG, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 52 wells in Los Angeles County. Thirty-five of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and seventeen wells were selected to aid in the evaluation of specific water-quality issues or changes in water chemistry along a historic ground-water flow path (understanding wells). The ground-water samples were analyzed for a large number of synthetic organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates], constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), 1,2,3-trichloropropane (1,2,3-TCP), and 1,4-dioxane], naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14, and stable isotopes of hydrogen, oxygen, and carbon) and dissolved noble gases also were measured to help identify the source and age of the sampled ground water.
Quality-control samples (blanks, replicates, samples for matrix spikes) were collected at approximately one-fifth (11 of 52) of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control results showed that the data had very little bias or variability and resulted in censoring of less than 0.7 percent (32 of 4,484 measurements) of the data collected for ground-water samples. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs were detected in more than 90 percent (33 of 35) of grid wells. Across all wells sampled in SFSG, nearly all VOC detections were below health-based thresholds, and most were less than one-tenth of the threshold values. Samples from seven wells had at least one detection of PCE, TCE, tetrachloromethane, NDMA, or 1,2,3-TCP at or above a health-based threshold. Pesticides were detected in about 90 percent (31 of 35) of grid wells, and all detections in samples from SFSG wells were below health-based thresholds. Major ions, trace elements, and nutrients in samples from 17 SFSG wells were all below health-based thresholds, with the exception of one detection of nitrate that was above the USEPA maximum contaminant level (MCL-US).
With the exception of 14 samples having radon-222 above the proposed MCL-US, radioactive constituents were below health-based thresholds for 16 of the SFSG wells sampled. Total dissolved solids in 6 of the 24 SFSG wells that were sampled ha

  15. Effects of intermittent theta burst stimulation on cerebral blood flow and cerebral vasomotor reactivity.

    PubMed

    Pichiorri, Floriana; Vicenzini, Edoardo; Gilio, Francesca; Giacomelli, Elena; Frasca, Vittorio; Cambieri, Chiara; Ceccanti, Marco; Di Piero, Vittorio; Inghilleri, Maurizio

    2012-08-01

    To determine whether intermittent theta burst stimulation influences cerebral hemodynamics, we investigated changes induced by intermittent theta burst stimulation on the middle cerebral artery blood flow velocity and vasomotor reactivity to carbon dioxide (CO2) in healthy participants. The middle cerebral artery flow velocity and vasomotor reactivity were monitored by continuous transcranial Doppler sonography. Changes in cortical excitability were tested by transcranial magnetic stimulation. In 11 healthy participants, before and immediately after delivering intermittent theta burst stimulation, we tested cortical excitability measured by the resting motor threshold and motor evoked potential amplitude over the stimulated hemisphere and vasomotor reactivity to CO2 bilaterally. The blood flow velocity was monitored in both middle cerebral arteries throughout the experimental session. In a separate session, we tested the effects of sham stimulation under the same experimental conditions. Whereas the resting motor threshold remained unchanged before and after stimulation, motor evoked potential amplitudes increased significantly (P = .04). During and after stimulation, middle cerebral artery blood flow velocities also remained bilaterally unchanged, whereas vasomotor reactivity to CO2 increased bilaterally (P = .04). The sham stimulation left all variables unchanged. The expected intermittent theta burst stimulation-induced changes in cortical excitability were not accompanied by changes in cerebral blood flow velocities; however, the bilateral increased vasomotor reactivity suggests that intermittent theta burst stimulation influences the cerebral microcirculation, possibly involving subcortical structures.
These findings provide useful information on hemodynamic phenomena accompanying intermittent theta burst stimulation, which should be considered in research aimed at developing this noninvasive, low-intensity stimulation technique for safe therapeutic applications.
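Vasomotor reactivity to CO2 is conventionally expressed as the percent change in mean flow velocity per mmHg change in end-tidal CO2; a minimal sketch with invented numbers (the abstract does not give its exact computation):

```python
def vasomotor_reactivity(v_base, v_stim, co2_base, co2_stim):
    """Percent change in middle cerebral artery flow velocity per mmHg
    change in end-tidal CO2: a standard vasomotor reactivity index."""
    pct_change = (v_stim - v_base) / v_base * 100.0
    return pct_change / (co2_stim - co2_base)

# e.g. velocity rising from 60 to 75 cm/s as end-tidal CO2 rises
# from 40 to 50 mmHg:
print(vasomotor_reactivity(60.0, 75.0, 40.0, 50.0))  # → 2.5 (%/mmHg)
```

A bilateral rise in this index after stimulation, with unchanged baseline velocity, is what points toward a microcirculatory rather than a large-vessel effect.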

  16. A versatile and interoperable network of sensors for water resources monitoring

    NASA Astrophysics Data System (ADS)

    Ortolani, Alberto; Brandini, Carlo; Costantini, Roberto; Costanza, Letizia; Innocenti, Lucia; Sabatini, Francesco; Gozzini, Bernardo

    2010-05-01

    Monitoring systems that assess water resource quantity and quality require extensive use of in-situ measurements, which have significant limitations: data can be difficult to access and share, and sensor networks are hard to customise and reconfigure to fulfil end-users' needs during monitoring or crisis phases. To address these limitations, Sensor Web Enablement (SWE) technologies for sensor management have been developed and applied in different environmental contexts under the EU-funded OSIRIS project (Open architecture for Smart and Interoperable networks in Risk management based on In-situ Sensors, www.osiris-fp6.eu). The main objective of OSIRIS was to create a monitoring system to manage different environmental crisis situations through an efficient data-processing chain in which in-situ sensors are connected via an intelligent and versatile network infrastructure (based on web technologies) that enables end-users to remotely access multi-domain sensor information. Among the project applications, one focused on underground fresh-water monitoring and management. With this aim, a monitoring system to continuously and automatically check water quality and quantity was designed and built in a pilot test, identified as a portion of the Amiata aquifer feeding the Santa Fiora springs (Grosseto, Italy). This aquifer has some characteristics that make it highly vulnerable under certain conditions. It is a volcanic aquifer with a fractured structure. The volcanic nature of the Santa Fiora area produces arsenic concentrations that are normally very close to the legal threshold but sometimes exceed it, for reasons still not fully understood. The presence of fractures makes the infiltration rate very inhomogeneous from place to place and very high near large fractures.
In case of liquid-pollutant spills (typically hydrocarbon spills from tanker accidents or leakage from household fuel-oil tanks), these fractures can act as shortcuts to the heart of the aquifer, contaminating the water much faster than average infiltration rates would suggest. A new system has been set up, upgrading a legacy sensor network with new sensors to address both monitoring and emergency-phase management. Where necessary, sensors were modified so that the whole sensor network can be managed through SWE services. The network manages sensors for water parameters (physical and chemical) and for atmospheric ones (to support the management of accidental crises). A main property of the developed architecture is that it can easily be reconfigured to pass from the monitoring phase to the alert phase, by changing the sampling frequencies of the parameters of interest or by deploying additional sensors at identified optimal positions (as in the case of a hydrocarbon spill). A hydrogeological model, coupled through a hydrological interface to the atmospheric forcing, has been implemented for the area. Model products (accessed through the same web interface as the sensors) add fundamental value to the upgraded sensor network (e.g., for data-merging procedures). Together with the available measurements, the model improves knowledge of the local hydrogeological system and gives fundamental support for reconfiguring the system (e.g., for positioning transportable sensors). The network, conceived primarily for real-time monitoring, allows an unprecedented amount of information about the aquifer to be accumulated. The availability of such a large set of data (continuously measured water levels, fluxes, precipitation, concentrations, etc.) from the system provides a unique opportunity for studying the influence of hydrogeological and geopedological parameters on arsenic and other chemicals that are naturally present in the water.

  17. Does navigated transcranial stimulation increase the accuracy of tractography? A prospective clinical trial based on intraoperative motor evoked potential monitoring during deep brain stimulation.

    PubMed

    Forster, Marie-Therese; Hoecker, Alexander Claudius; Kang, Jun-Suk; Quick, Johanna; Seifert, Volker; Hattingen, Elke; Hilker, Rüdiger; Weise, Lutz Martin

    2015-06-01

    Tractography based on diffusion tensor imaging has become a popular tool for delineating white matter tracts for neurosurgical procedures. To explore whether navigated transcranial magnetic stimulation (nTMS) might increase the accuracy of fiber tracking, tractography was performed according to both anatomic delineation of the motor cortex (n = 14) and nTMS results (n = 9). After implantation of the definitive electrode, stimulation via the electrode was performed, defining a stimulation threshold for eliciting motor evoked potentials of arm and leg muscles recorded during deep brain stimulation surgery. This threshold was correlated with the shortest distance between the active electrode contact and both fiber tracks. Results were evaluated by correlation to motor evoked potential monitoring during deep brain stimulation, a surgical procedure causing hardly any brain shift. Distances to fiber tracks clearly correlated with motor evoked potential thresholds. Tracks based on nTMS had a higher predictive value than tracks based on anatomic motor cortex definition (P < .001 and P = .005, respectively). However, target site, hemisphere, and active electrode contact did not influence this correlation. The implementation of tractography based on nTMS increases the accuracy of fiber tracking. Moreover, this combination of methods has the potential to become a supplemental tool for guiding electrode implantation.
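The reported relationship can be sketched as a simple correlation between electrode-to-track distance and MEP threshold; the paired values below are invented for illustration and the helper is our own (the study's statistics are more involved).

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired observations: shortest distance from the active
# electrode contact to the fiber track (mm) and the stimulation threshold
# (V) needed to elicit a motor evoked potential.
distance_mm = [1.2, 2.5, 3.1, 4.0, 5.6, 6.3, 7.8]
threshold_v = [1.0, 1.4, 1.9, 2.2, 3.0, 3.4, 4.1]
r = pearson_r(distance_mm, threshold_v)
print(round(r, 3))  # strongly positive: farther from the track, higher threshold
```

A tracking method whose distances correlate more tightly with the electrically measured thresholds (here, nTMS-seeded tractography) can be argued to delineate the tract more accurately.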

  18. Hypo- and hyperglycemia in relation to the mean, standard deviation, coefficient of variation, and nature of the glucose distribution.

    PubMed

    Rodbard, David

    2012-10-01

    We describe a new approach to estimate the risks of hypo- and hyperglycemia based on the mean and SD of the glucose distribution using optional transformations of the glucose scale to achieve a more nearly symmetrical and Gaussian distribution, if necessary. We examine the correlation of risks of hypo- and hyperglycemia calculated using different glucose thresholds and the relationships of these risks to the mean glucose, SD, and percentage coefficient of variation (%CV). Using representative continuous glucose monitoring datasets, one can predict the risk of glucose values above or below any arbitrary threshold if the glucose distribution is Gaussian or can be transformed to be Gaussian. Symmetry and gaussianness can be tested objectively and used to optimize the transformation. The method performs well with excellent correlation of predicted and observed risks of hypo- or hyperglycemia for individual subjects by time of day or for a specified range of dates. One can compare observed and calculated risks of hypo- and hyperglycemia for a series of thresholds considering their uncertainties. Thresholds such as 80 mg/dL can be used as surrogates for thresholds such as 50 mg/dL. We observe a high correlation of risk of hypoglycemia with %CV and illustrate the theoretical basis for that relationship. One can estimate the historical risks of hypo- and hyperglycemia by time of day, date, day of the week, or range of dates, using any specified thresholds. Risks of hypoglycemia with one threshold (e.g., 80 mg/dL) can be used as an effective surrogate marker for hypoglycemia at other thresholds (e.g., 50 mg/dL). These estimates of risk can be useful in research studies and in the clinical care of patients with diabetes.
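Under the Gaussian assumption described above, the risk calculation reduces to evaluating a normal CDF at the chosen thresholds. A minimal sketch (function name and example numbers are ours; thresholds of 70 and 180 mg/dL are common conventions):

```python
from statistics import NormalDist

def glycemic_risks(mean_mgdl, sd_mgdl, hypo=70.0, hyper=180.0):
    """Estimate risks of hypo- and hyperglycemia from the mean and SD of
    the glucose distribution, assuming it is (or has been transformed to
    be) Gaussian."""
    dist = NormalDist(mean_mgdl, sd_mgdl)
    return dist.cdf(hypo), 1.0 - dist.cdf(hyper)

# Mean 140 mg/dL, SD 40 mg/dL (%CV ~ 29%):
p_hypo, p_hyper = glycemic_risks(140.0, 40.0)
print(f"P(<70) = {p_hypo:.3f}, P(>180) = {p_hyper:.3f}")
```

Because the z-score (threshold − mean)/SD can be rewritten in terms of %CV = 100·SD/mean, the risk of hypoglycemia at a fixed mean rises with %CV, consistent with the correlation the abstract reports.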

  19. Sub-threshold depolarizing pre-pulses can enhance the efficiency of biphasic stimuli in transcutaneous neuromuscular electrical stimulation.

    PubMed

    Vargas Luna, Jose Luis; Mayr, Winfried; Cortés-Ramirez, Jorge-Armando

    2018-06-09

    There is ample evidence in the literature that a sub-threshold pre-pulse, delivered immediately prior to an electrical stimulation pulse, can alter the activation threshold of nerve fibers and motor unit recruitment characteristics. So far, previously published works combined monophasic stimuli with sub-threshold depolarizing pre-pulses (DPPs), with inconsistent findings: in some studies the DPPs decreased the activation threshold, while in others they increased it. This work aimed to evaluate the effect of DPPs during biphasic transcutaneous electrical stimulation and to study the possible mechanism underlying those differences. Sub-threshold DPPs between 0.5 and 15 ms immediately followed by biphasic or monophasic pulses were administered to the tibial nerve; the electrophysiological muscular responses (motor-wave, M-wave) were monitored via electromyogram (EMG) recording from the soleus muscle. The data show that, under the specific studied conditions, DPPs tend to lower the threshold for nerve fiber activation rather than elevating it. DPPs with the same polarity as the leading phase of biphasic stimuli are more effective at increasing sensitivity. This work assesses for the first time the effect of DPPs on biphasic pulses, which are required to achieve charge-balanced stimulation, and it provides guidance on the effect of polarity and intensity to take full advantage of this feature. Graphical abstract: In this work, the effect of sub-threshold depolarizing pre-pulses (DPPs) is investigated in a setup with transcutaneous electrical stimulation. We found that, within the tested 0-15 ms DPP duration range, the DPPs administered immediately before biphasic pulses proportionally increase the nerve excitability, as visible in the M-waves recorded from the soleus muscle. Interestingly, these findings oppose published results, where DPPs administered immediately before monophasic stimuli via implanted electrodes led to a decrease in nerve excitability.
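The stimulus structure under test can be sketched as a sampled waveform: a long, low-amplitude DPP followed by a charge-balanced biphasic pulse whose leading phase shares the DPP's polarity (amplitudes, durations, and the builder itself are illustrative, not the authors' stimulator code).

```python
def build_stimulus(dpp_amp, dpp_ms, pulse_amp, phase_ms, dt_ms=0.1):
    """Sample a sub-threshold depolarizing pre-pulse (DPP) followed by a
    charge-balanced biphasic pulse whose leading phase matches the DPP
    polarity. Returns a list of amplitude samples (arbitrary units)."""
    samples = []
    samples += [dpp_amp] * int(round(dpp_ms / dt_ms))      # sub-threshold DPP
    samples += [pulse_amp] * int(round(phase_ms / dt_ms))  # leading phase
    samples += [-pulse_amp] * int(round(phase_ms / dt_ms)) # balancing phase
    return samples

wave = build_stimulus(dpp_amp=0.2, dpp_ms=5.0, pulse_amp=1.0, phase_ms=0.5)
# The biphasic part carries zero net charge; only the DPP contributes:
net_charge = sum(wave) * 0.1
print(round(net_charge, 6))
```

Keeping the main pulse charge-balanced while varying DPP duration and polarity is exactly the manipulation whose effect on M-wave amplitude the study measures.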

  20. Field Tests of a Tractor Rollover Detection and Emergency Notification System.

    PubMed

    Liu, B; Koc, A B

    2015-04-01

    The objective of this research was to assess the feasibility of a rollover detection and emergency notification system for farm tractors using field tests. The emergency notification system was developed based on a tractor stability model and implemented on a mobile electronic device with the iOS operating system. A complementary filter was implemented to combine the data from the accelerometer and gyroscope sensors to improve their accuracies in calculating the roll and pitch angles and the roll and pitch rates. The system estimates a stability index value during tractor operation, displays feedback messages when the stability index is lower than a preset threshold value, and transmits emergency notification messages when an overturn happens. Ten tractor rollover tests were conducted on a field track. The developed system successfully monitored the stability of the tractor during all of the tests. The iOS application was able to detect rollover accidents and transmit emergency notifications in the form of a phone call and email when an accident was detected. The system can be a useful tool for training and education in safe tractor operation. The system also has potential for stability monitoring and emergency notification of other on-road and off-road motorized vehicles.
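The complementary filter mentioned above can be sketched in a few lines: the gyroscope is trusted at short time scales (it integrates cleanly but drifts), and the accelerometer at long ones (noisy but drift-free). The gain of 0.98 and the function itself are illustrative assumptions, not the authors' implementation.

```python
def complementary_filter(angle_deg, gyro_rate_dps, accel_angle_deg, dt, alpha=0.98):
    """One filter update: blend the gyro-integrated angle (weight alpha)
    with the accelerometer-derived angle (weight 1 - alpha)."""
    return alpha * (angle_deg + gyro_rate_dps * dt) + (1 - alpha) * accel_angle_deg

# With zero gyro rate, the estimate converges to the accelerometer angle:
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate_dps=0.0,
                                 accel_angle_deg=30.0, dt=0.01)
print(round(angle, 2))
```

The same update is applied per axis to obtain roll and pitch, which then feed the stability-index computation compared against the preset rollover threshold.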

  1. Classification criteria and probability risk maps: limitations and perspectives.

    PubMed

    Saisana, Michaela; Dubois, Gregoire; Chaloulakou, Archontoula; Spyrellis, Nikolas

    2004-03-01

    Delineation of polluted zones with respect to regulatory standards, accounting at the same time for the uncertainty of the estimated concentrations, relies on classification criteria that can lead to significantly different pollution risk maps, which, in turn, can depend on the regulatory standard itself. This paper reviews four popular classification criteria related to the violation of a probability threshold or a physical threshold, using annual (1996-2000) nitrogen dioxide concentrations from 40 air monitoring stations in Milan. The relative advantages and practical limitations of each criterion are discussed, and it is shown that some of the criteria are more appropriate for the problem at hand and that the choice of the criterion can be supported by the statistical distribution of the data and/or the regulatory standard. Finally, the polluted area is estimated over the different years and concentration thresholds using the appropriate risk maps as an additional source of uncertainty.
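One of the classification criteria the paper reviews, violation of a probability threshold, can be sketched as follows, assuming a Gaussian estimation error for the interpolated concentration (names and numbers are illustrative):

```python
from statistics import NormalDist

def exceedance_probability(estimate, std_err, standard):
    """Probability that the true concentration exceeds a regulatory
    standard, given an estimated value and its (assumed Gaussian)
    estimation error."""
    return 1.0 - NormalDist(estimate, std_err).cdf(standard)

def classify(estimate, std_err, standard, p_crit=0.5):
    """Mark a location 'polluted' when the exceedance probability passes
    a chosen probability threshold."""
    return exceedance_probability(estimate, std_err, standard) > p_crit

# NO2 standard of 40 ug/m3; estimated 45 +/- 10 ug/m3 at a grid node:
print(classify(45.0, 10.0, 40.0))
```

Changing either `p_crit` or the physical `standard` shifts the delineated polluted zone, which is why the resulting risk maps can differ significantly between criteria.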

  2. Effects of N-acetylcysteine on noise-induced temporary threshold shift and temporary emission shift

    NASA Astrophysics Data System (ADS)

    Robinette, Martin

    2004-05-01

    Animal research has shown that antioxidants can provide significant protection to the cochlea from traumatic noise exposure, with some benefit when given after the exposure. Similar results in humans would have a significant impact on both prevention and treatment of noise-induced hearing loss. The current study evaluates the effectiveness of N-acetylcysteine (NAC) on temporary threshold shift (TTS) by using both behavioral and physiological measures. Sixteen healthy, normal-hearing subjects were given NAC or a placebo prior to exposure to a 10-min, 102-dB narrow-band noise, centered at 2 kHz. This exposure was designed to induce a 10-15-dB TTS. Following the noise exposure, pure-tone thresholds (Bekesy) and transient-evoked otoacoustic emissions (TEOAE) were measured for 60 min to monitor the effects of NAC on TTS recovery. Postexposure measures were compared to baseline data. [Work supported by American BioHealth Group.]

  3. Image intensifier gain uniformity improvements in sealed tubes by selective scrubbing

    DOEpatents

    Thomas, S.W.

    1995-04-18

    The gain uniformity of sealed microchannel plate image intensifiers (MCPIs) is improved by selectively scrubbing the high gain sections with a controlled bright light source. Using the premise that ions returning to the cathode from the microchannel plate (MCP) damage the cathode and reduce its sensitivity, a HeNe laser beam light source is raster scanned across the cathode of a microchannel plate image intensifier (MCPI) tube. Cathode current is monitored and when it exceeds a preset threshold, the sweep rate is decreased 1000 times, giving 1000 times the exposure to cathode areas with sensitivity greater than the threshold. The threshold is set at the cathode current corresponding to the lowest sensitivity in the active cathode area so that sensitivity of the entire cathode is reduced to this level. This process reduces tube gain by between 10% and 30% in the high gain areas while gain reduction in low gain areas is negligible. 4 figs.
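The scan-control logic described in the patent can be sketched as a per-pixel dwell-time rule: 1000 times the exposure wherever the monitored cathode current exceeds the preset threshold (the function and units below are illustrative, not the patent's control code).

```python
def dwell_times(cathode_currents, threshold, slow_factor=1000):
    """Relative per-pixel exposure during the raster scan: where the
    monitored cathode current exceeds the threshold, the sweep rate is
    decreased by `slow_factor`, selectively scrubbing high-sensitivity
    (high-gain) regions down toward the threshold level."""
    return [slow_factor if i > threshold else 1 for i in cathode_currents]

# Currents along one scan line (arbitrary units); the threshold is set at
# the lowest sensitivity found in the active cathode area:
line = [1.0, 1.4, 2.1, 1.0, 1.8]
print(dwell_times(line, threshold=1.0))  # → [1, 1000, 1000, 1, 1000]
```

Because exposure in below-threshold areas is negligible by comparison, gain is reduced only where it is too high, flattening the tube's gain map.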

  4. Image intensifier gain uniformity improvements in sealed tubes by selective scrubbing

    DOEpatents

    Thomas, Stanley W.

    1995-01-01

    The gain uniformity of sealed microchannel plate image intensifiers (MCPIs) is improved by selectively scrubbing the high gain sections with a controlled bright light source. Using the premise that ions returning to the cathode from the microchannel plate (MCP) damage the cathode and reduce its sensitivity, a HeNe laser beam light source is raster scanned across the cathode of a microchannel plate image intensifier (MCPI) tube. Cathode current is monitored and when it exceeds a preset threshold, the sweep rate is decreased 1000 times, giving 1000 times the exposure to cathode areas with sensitivity greater than the threshold. The threshold is set at the cathode current corresponding to the lowest sensitivity in the active cathode area so that sensitivity of the entire cathode is reduced to this level. This process reduces tube gain by between 10% and 30% in the high gain areas while gain reduction in low gain areas is negligible.

  5. Simple algorithms for digital pulse-shape discrimination with liquid scintillation detectors

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2015-01-01

    The development of compact, battery-powered digital liquid scintillation neutron detection systems for field applications requires digital pulse processing (DPP) algorithms with minimum computational overhead. To meet this demand, two DPP algorithms for the discrimination of neutrons and γ-rays with liquid scintillation detectors were developed and examined by using an NE213 liquid scintillation detector in a mixed radiation field. The first algorithm is based on the relation between the amplitude of a current pulse at the output of a photomultiplier tube and the amount of charge contained in the pulse. A figure-of-merit (FOM) value of 0.98 with a 450 keVee (electron-equivalent energy) threshold was achieved with this method when pulses were sampled at 250 MSample/s and with 8-bit resolution. Compared to the similar charge-comparison method, this method requires only a single integration window, thereby reducing the amount of computation by approximately 40%. The second approach is a digital version of the trailing-edge constant-fraction discrimination method. A FOM value of 0.84 with an energy threshold of 450 keVee was achieved with this method. In comparison with the similar rise-time discrimination method, this method requires only a single time pick-off, thereby reducing the amount of computation by approximately 50%. The algorithms described in this work are useful for developing portable detection systems for applications such as homeland security, radiation dosimetry and environmental monitoring.
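The figure of merit quoted above is conventionally defined as the separation between the neutron and γ-ray peaks of the discrimination-parameter distribution divided by the sum of their full widths at half maximum; a minimal sketch with invented peak parameters chosen to reproduce FOM ≈ 0.98:

```python
def figure_of_merit(mu_gamma, fwhm_gamma, mu_neutron, fwhm_neutron):
    """Standard pulse-shape-discrimination figure of merit:
    peak separation divided by the sum of the FWHMs of the gamma and
    neutron distributions of the discrimination parameter."""
    return abs(mu_neutron - mu_gamma) / (fwhm_gamma + fwhm_neutron)

# Invented centroids/widths (arbitrary discrimination-parameter units):
print(round(figure_of_merit(0.20, 0.06, 0.32, 0.0624), 2))  # → 0.98
```

Higher FOM means less overlap between the two event populations; values above about 1 are usually taken to indicate clean neutron/γ separation at the stated energy threshold.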

  6. The threshold algorithm: Description of the methodology and new developments

    NASA Astrophysics Data System (ADS)

    Neelamraju, Sridhar; Oligschleger, Christina; Schön, J. Christian

    2017-10-01

    Understanding the dynamics of complex systems requires the investigation of their energy landscape. In particular, the flow of probability on such landscapes is a central feature in visualizing the time evolution of complex systems. To obtain such flows, and the concomitant stable states of the systems and the generalized barriers among them, the threshold algorithm has been developed. Here, we describe the methodology of this approach starting from the fundamental concepts in complex energy landscapes and present recent new developments, the threshold-minimization algorithm and the molecular dynamics threshold algorithm. For applications of these new algorithms, we draw on landscape studies of three disaccharide molecules: lactose, maltose, and sucrose.
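The core move of the threshold algorithm, exploring the landscape while rejecting any state above an energy lid, can be sketched on a one-dimensional double well (the toy landscape, step size, and function names are ours, not the authors' code):

```python
import random

def threshold_run(energy, start, lid, steps=20000, step_size=0.1, seed=1):
    """Random walk restricted to states with energy below the lid: trial
    moves above the lid are rejected, so the walk explores only the
    region accessible below that threshold. Returns the extremes of the
    visited range as a crude picture of the accessible basins."""
    rng = random.Random(seed)
    x = start
    visited_min = visited_max = x
    for _ in range(steps):
        trial = x + rng.uniform(-step_size, step_size)
        if energy(trial) <= lid:   # accept only below-lid states
            x = trial
            visited_min = min(visited_min, x)
            visited_max = max(visited_max, x)
    return visited_min, visited_max

def E(x):
    """Double well with minima at x = +/-1 and a barrier of height 1 at x = 0."""
    return (x * x - 1.0) ** 2

# Below the barrier the walk is confined to one well; above it, both wells
# become mutually accessible:
low = threshold_run(E, start=1.0, lid=0.5)
high = threshold_run(E, start=1.0, lid=2.0)
print(low, high)
```

Repeating such runs over a ladder of lids yields, for each pair of minima, the lowest lid at which they connect, i.e. the generalized barrier, and the visit statistics approximate the probability flow between basins.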

  7. Ground-Water Quality Data in the Central Sierra Study Unit, 2006 - Results from the California GAMA Program

    USGS Publications Warehouse

    Ferrari, Matthew J.; Fram, Miranda S.; Belitz, Kenneth

    2008-01-01

    Ground-water quality in the approximately 950 square kilometer (370 square mile) Central Sierra study unit (CENSIE) was investigated in May 2006 as part of the Priority Basin Assessment project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Assessment project was developed in response to the Ground-Water Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). This study was designed to provide a spatially unbiased assessment of the quality of raw ground water used for drinking-water supplies within CENSIE, and to facilitate statistically consistent comparisons of ground-water quality throughout California. Samples were collected from thirty wells in Madera County. Twenty-seven of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and three were selected to aid in evaluation of specific water-quality issues (understanding wells). Ground-water samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], gasoline oxygenates and degradates, pesticides and pesticide degradates), constituents of special interest (N-nitrosodimethylamine, perchlorate, and 1,2,3-trichloropropane), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14, and stable isotopes of hydrogen, oxygen, nitrogen, and carbon) and dissolved noble gases also were measured to help identify the sources and ages of the sampled ground water. In total, over 250 constituents and water-quality indicators were investigated.
Quality-control samples (blanks, replicates, and samples for matrix spikes) were collected at approximately one-sixth of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Results from field blanks indicated contamination was not a noticeable source of bias in the data for ground-water samples. Differences between replicate samples were within acceptable ranges, indicating acceptably low variability. Matrix spike recoveries were within acceptable ranges for most constituents. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH), and thresholds established for aesthetic concerns (Secondary Maximum Contaminant Levels, SMCL-CA) by CDPH. Therefore, any comparison of the results of this study to drinking-water standards is for illustrative purposes only and is not indicative of compliance or non-compliance with those standards. Most constituents that were detected in ground-water samples were found at concentrations below drinking-water standards or thresholds. Six constituents (fluoride, arsenic, molybdenum, uranium, gross-alpha radioactivity, and radon-222) were detected at concentrations higher than thresholds set for health-based regulatory purposes. Three additional constituents (pH, iron, and manganese) were detected at concentrations above thresholds set for aesthetic concerns.
Volatile organic compounds (VOCs) and pesticides were detected in less than one-third of the samples, and generally at less than one-hundredth of a health-based threshold.

  8. Ground-Water Quality Data in the Owens and Indian Wells Valleys Study Unit, 2006: Results from the California GAMA Program

    USGS Publications Warehouse

    Densmore, Jill N.; Fram, Miranda S.; Belitz, Kenneth

    2009-01-01

    Ground-water quality in the approximately 1,630 square-mile Owens and Indian Wells Valleys study unit (OWENS) was investigated in September-December 2006 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in collaboration with the California State Water Resources Control Board (SWRCB). The Owens and Indian Wells Valleys study was designed to provide a spatially unbiased assessment of raw ground-water quality within the OWENS study unit, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 74 wells in Inyo, Kern, Mono, and San Bernardino Counties. Fifty-three of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and 21 wells were selected to evaluate changes in water chemistry in areas of interest (understanding wells). The ground-water samples were analyzed for a large number of synthetic organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates, pharmaceutical compounds, and potential wastewater-indicator compounds], constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), and 1,2,3-trichloropropane (1,2,3-TCP)], naturally occurring inorganic constituents [nutrients, major and minor ions, and trace elements], radioactive constituents, and microbial indicators. Naturally occurring isotopes [tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water] and dissolved noble gases also were measured to help identify the source and age of the sampled ground water.
This study evaluated the quality of raw ground water in the aquifer in the OWENS study unit and did not attempt to evaluate the quality of treated water delivered to consumers. Water supplied to consumers typically is treated after withdrawal from the ground, disinfected, and blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with regulatory and non-regulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and non-regulatory thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs and pesticides were detected in samples from less than one-third of the grid wells; all detections were below health-based thresholds, and most were less than one-hundredth of threshold values. All detections of perchlorate and nutrients in samples from OWENS were below health-based thresholds. Most detections of trace elements in ground-water samples from OWENS wells were below health-based thresholds. In samples from the 53 grid wells, three constituents were detected at concentrations above USEPA maximum contaminant levels: arsenic in 5 samples, uranium in 4 samples, and fluoride in 1 sample. Two constituents were detected at concentrations above CDPH notification levels (boron in 9 samples and vanadium in 1 sample), and two were above USEPA lifetime health advisory levels (molybdenum in 3 samples and strontium in 1 sample). Most of the samples from OWENS wells had concentrations of major elements, TDS, and trace elements below the non-enforceable standards set for aesthetic concerns. Samples from nine grid wells had concentrations of manganese, iron, or TDS above the SMCL-CAs.
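    The threshold comparisons described above amount to screening each sample's measured concentrations against a table of benchmark values. A minimal sketch of that workflow follows; the MCL values and well results below are illustrative assumptions, not data from this report.

```python
# Sketch: screening raw ground-water sample results against health-based
# thresholds, in the spirit of the GAMA comparisons. The MCL values and
# sample data below are illustrative assumptions, not values from the report.

MCLS_UG_PER_L = {"arsenic": 10.0, "uranium": 30.0}  # assumed MCLs (ug/L)

samples = [  # hypothetical grid-well results (ug/L)
    {"well": "GW-01", "arsenic": 2.1, "uranium": 4.0},
    {"well": "GW-02", "arsenic": 14.5, "uranium": 31.2},
    {"well": "GW-03", "arsenic": 0.8, "uranium": 12.0},
]

def exceedances(samples, mcls):
    """Count, per constituent, how many samples exceed its threshold."""
    counts = {name: 0 for name in mcls}
    for s in samples:
        for name, limit in mcls.items():
            if s.get(name, 0.0) > limit:
                counts[name] += 1
    return counts

print(exceedances(samples, MCLS_UG_PER_L))  # {'arsenic': 1, 'uranium': 1}
```

    The same pattern extends to any number of constituents and to secondary (aesthetic) thresholds by swapping in a different benchmark table.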

  9. Groundwater-Quality Data in the Antelope Valley Study Unit, 2008: Results from the California GAMA Program

    USGS Publications Warehouse

    Schmitt, Stephen J.; Milby Dawson, Barbara J.; Belitz, Kenneth

    2009-01-01

    Groundwater quality in the approximately 1,600 square-mile Antelope Valley study unit (ANT) was investigated from January to April 2008 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within ANT, and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 57 wells in Kern, Los Angeles, and San Bernardino Counties. Fifty-six of the wells were selected using a spatially distributed, randomized, grid-based method to provide statistical representation of the study area (grid wells), and one additional well was selected to aid in evaluation of specific water-quality issues (understanding well). The groundwater samples were analyzed for a large number of organic constituents (volatile organic compounds [VOCs], gasoline additives and degradates, pesticides and pesticide degradates, fumigants, and pharmaceutical compounds), constituents of special interest (perchlorate, N-nitrosodimethylamine [NDMA], and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), and radioactive constituents (gross alpha and gross beta radioactivity, radium isotopes, and radon-222). Naturally occurring isotopes (strontium, tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water) and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, 239 constituents and water-quality indicators (field parameters) were investigated. 
Quality-control samples (blanks, replicates, and samples for matrix spikes) were collected at 12 percent of the wells, and the results for these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a noticeable source of bias in the data for the groundwater samples. Differences between replicate samples generally were within acceptable ranges, indicating acceptably low variability. Matrix spike recoveries were within acceptable ranges for most compounds. This study did not evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared with regulatory and non-regulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only, and are not indicative of compliance or non-compliance with drinking water standards. Most constituents that were detected in groundwater samples were found at concentrations below drinking-water thresholds. Volatile organic compounds (VOCs) were detected in about one-half of the samples and pesticides were detected in about one-third of the samples; all detections of these constituents were below health-based thresholds. Most detections of trace elements and nutrients in samples from ANT wells were below health-based thresholds. 
Exceptions include: one detection of nitrite plus nitr

  10. Assessing and optimizing infrasound network performance: application to remote volcano monitoring

    NASA Astrophysics Data System (ADS)

    Tailpied, D.; LE Pichon, A.; Marchetti, E.; Kallel, M.; Ceranna, L.

    2014-12-01

    Infrasound is an efficient monitoring technique to remotely detect and characterize explosive sources such as volcanoes. Simulation methods incorporating realistic source and propagation effects have been developed to quantify the detection capability of any network. These methods can also be used to optimize the network configuration (number of stations, geographical location) in order to reduce the detection thresholds, taking into account seasonal effects in infrasound propagation. Recent studies have shown that remote infrasound observations can provide useful information about the eruption chronology and the released acoustic energy. Comparisons with near-field recordings allow evaluation of the potential of these observations to better constrain source parameters when other monitoring techniques (satellite, seismic, gas) are not available or cannot be made. Because of its regular activity, the well-instrumented Mount Etna is a unique repetitive natural source in Europe for testing and optimizing detection and simulation methods. The closest infrasound station of the International Monitoring System is located in Tunisia (IS48). In summer, during the downwind season, it allows unambiguous identification of signals associated with Etna eruptions. Under the European ARISE project (Atmospheric dynamics InfraStructure in Europe, FP7/2007-2013), experimental arrays have been installed in order to characterize infrasound propagation at different ranges of distance and direction. In addition, a small-aperture array, set up on the flank by the University of Firenze, has been operating since 2007. Such an experimental setting offers an opportunity to address the societal benefits that can be achieved through routine infrasound monitoring.

  11. Cloud-based privacy-preserving remote ECG monitoring and surveillance.

    PubMed

    Page, Alex; Kocabas, Ovunc; Soyata, Tolga; Aktas, Mehmet; Couderc, Jean-Philippe

    2015-07-01

    The number of technical solutions for monitoring patients in their daily activities is expected to increase significantly in the near future. Blood pressure, heart rate, temperature, BMI, oxygen saturation, and electrolytes are a few of the physiologic factors that will soon be available to patients and their physicians almost continuously. The availability and transfer of this information from the patient to the health provider raises privacy concerns. Moreover, current data encryption approaches expose patient data during processing, restricting their utility in applications requiring data analysis. We propose a system that couples health monitoring techniques with analytic methods to permit the extraction of relevant information from patient data without compromising privacy. This proposal is based on the concept of fully homomorphic encryption (FHE). Since this technique is known to be resource-heavy, we develop a proof-of-concept to assess its practicality. Results are presented from our prototype system, which mimics live QT monitoring and detection of drug-induced QT prolongation. Transferring FHE-encrypted QT and RR samples requires about 2 Mbps of network bandwidth per patient. Comparing FHE-encrypted values (for example, comparing QTc to a given threshold) runs quickly enough on modest hardware to alert the doctor of important results in real time. We demonstrate that FHE could be used to securely transfer and analyze ambulatory health monitoring data. We present a unique concept that could represent a disruptive type of technology with broad applications to multiple monitoring devices. Future work will focus on performance optimizations to accelerate expansion to these other applications. © 2014 Wiley Periodicals, Inc.
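    The core analytic step described, comparing a rate-corrected QT interval against an alert threshold, can be sketched in plaintext (the paper performs it on FHE-encrypted samples). Bazett's correction is a standard formula; the 500 ms alert threshold and the sample values here are illustrative assumptions.

```python
import math

# Plaintext sketch of the QT-monitoring check the paper performs under FHE:
# correct each QT interval for heart rate (Bazett's formula) and flag values
# above an alert threshold. The 500 ms threshold is an illustrative assumption.

ALERT_MS = 500.0

def qtc_bazett(qt_ms, rr_ms):
    """Heart-rate-corrected QT (ms): QTc = QT / sqrt(RR in seconds)."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

def flag_prolongation(pairs, threshold_ms=ALERT_MS):
    """Return indices of (QT, RR) samples whose QTc exceeds the threshold."""
    return [i for i, (qt, rr) in enumerate(pairs)
            if qtc_bazett(qt, rr) > threshold_ms]

samples = [(400, 800), (480, 700), (380, 1000)]  # hypothetical (QT, RR) in ms
print(flag_prolongation(samples))  # [1]
```

    In the actual system this comparison would operate on ciphertexts, so the server flagging the sample never sees the underlying QT or RR values.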

  12. Bank supervision using the Threshold-Minimum Dominating Set

    NASA Astrophysics Data System (ADS)

    Gogas, Periklis; Papadimitriou, Theophilos; Matthaiou, Maria-Artemis

    2016-06-01

    An optimized, healthy and stable banking system resilient to financial crises is a prerequisite for sustainable growth. Minimization of (a) the associated systemic risk and (b) the propagation of contagion in the case of a banking crisis are necessary conditions to achieve this goal. Central Banks are in charge of this significant undertaking via close and detailed monitoring of the banking network. In this paper, we propose the use of an auxiliary supervision/monitoring system that is efficient with respect to the required resources and can promptly identify a set of banks that are in distress so that immediate and appropriate action can be taken by the supervising authority. We use the network defined by the interrelations between banking institutions, employing tools from Complex Networks theory for an efficient management of the entire banking network. In doing so, we introduce the Threshold Minimum Dominating Set (T-MDS). The T-MDS is used to identify the smallest and most efficient subset of banks that can be used as (a) sensors of distress of a manifesting banking crisis and (b) provide a path of possible contagion. We propose the use of this method as a supplementary monitoring tool in the arsenal of a Central Bank. Our dataset includes the 122 largest American banks in terms of their interbank loans. The empirical results show that when the T-MDS methodology is applied, we can have an efficient supervision of the whole banking network by monitoring just a subset of 47 banks.
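    The abstract does not spell out the T-MDS construction, but its two ingredients, thresholding the weighted interbank network and then finding a small dominating set, can be sketched with a greedy approximation (minimum dominating set is NP-hard). The toy network below is an assumption, not the paper's 122-bank dataset.

```python
# Sketch of the T-MDS idea: keep only interbank links at or above a weight
# threshold, then greedily pick a small dominating set of "sensor" banks.
# Greedy selection only approximates the (NP-hard) minimum dominating set;
# the toy network below is an assumption, not the paper's dataset.

def threshold_graph(weighted_edges, threshold):
    """Adjacency dict over edges whose weight meets the threshold."""
    adj = {}
    for u, v, w in weighted_edges:
        adj.setdefault(u, set())
        adj.setdefault(v, set())
        if w >= threshold:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def greedy_dominating_set(adj):
    """Repeatedly pick the node covering the most uncovered nodes."""
    uncovered = set(adj)
    chosen = []
    while uncovered:
        best = max(adj, key=lambda n: len(({n} | adj[n]) & uncovered))
        chosen.append(best)
        uncovered -= {best} | adj[best]
    return chosen

edges = [("A", "B", 5), ("A", "C", 4), ("A", "D", 3), ("D", "E", 6), ("B", "E", 1)]
adj = threshold_graph(edges, threshold=3)
print(greedy_dominating_set(adj))  # ['A', 'D']
```

    Every bank is then either a chosen sensor or directly linked (above the threshold) to one, which is what makes the small subset usable for distress monitoring.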

  13. ECG signal performance de-noising assessment based on threshold tuning of dual-tree wavelet transform.

    PubMed

    El B'charri, Oussama; Latif, Rachid; Elmansouri, Khalifa; Abenaou, Abdenbi; Jenkal, Wissam

    2017-02-07

    Since the electrocardiogram (ECG) signal has a low frequency and a weak amplitude, it is sensitive to miscellaneous mixed noises, which may reduce diagnostic accuracy and hinder the physician's correct decision on patients. The dual-tree wavelet transform (DT-WT) is one of the most recent enhanced versions of the discrete wavelet transform. However, threshold tuning of this method for noise removal from the ECG signal has not yet been investigated. In this work, we provide a comprehensive study on the impact of the choice of threshold algorithm, threshold value, and the appropriate wavelet decomposition level on ECG signal de-noising performance. A set of simulations is performed on both synthetic and real ECG signals. First, the synthetic ECG signal is used to observe the algorithm's response. The evaluation on synthetic ECG signals corrupted by various types of noise showed that the modified unified threshold and the wavelet hyperbolic threshold de-noising methods perform better for realistic and colored noises. The tuned threshold is then used on real ECG signals from the MIT-BIH database. The results show that the proposed method achieves higher performance than the ordinary dual-tree wavelet transform for all kinds of noise removal from the ECG signal. The simulation results indicate that the algorithm is robust to all kinds of noise with varying degrees of input noise, providing a high-quality clean signal. Moreover, the algorithm is quite simple and can be used in real-time ECG monitoring.
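    The threshold rules being tuned above act on wavelet coefficients. As a minimal sketch of the two classic rules such studies start from, hard and soft thresholding, applied to a plain list standing in for DT-WT detail coefficients (the paper's modified unified and hyperbolic rules are more elaborate variants):

```python
# Sketch of the two classic coefficient-thresholding rules used in wavelet
# de-noising. A real pipeline would apply them to dual-tree wavelet detail
# coefficients; a plain list of numbers stands in for those here.

def hard_threshold(coeffs, t):
    """Zero out coefficients with magnitude below t; keep the rest as-is."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Zero out small coefficients and shrink the rest toward zero by t."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) >= t else 0.0
            for c in coeffs]

coeffs = [0.2, -0.9, 1.5, -0.1, 0.6]
print(hard_threshold(coeffs, 0.5))  # [0.0, -0.9, 1.5, 0.0, 0.6]
print(soft_threshold(coeffs, 0.5))  # approx. [0.0, -0.4, 1.0, 0.0, 0.1]
```

    Hard thresholding preserves large coefficients exactly but can leave noise spikes; soft thresholding shrinks everything, trading a small bias for a smoother reconstruction. Choosing t per decomposition level is exactly the tuning problem the paper studies.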

  14. Implications of lower risk thresholds for statin treatment in primary prevention: analysis of CPRD and simulation modelling of annual cholesterol monitoring.

    PubMed

    McFadden, Emily; Stevens, Richard; Glasziou, Paul; Perera, Rafael

    2015-01-01

    To estimate numbers affected by a recent change in UK guidelines for statin use in primary prevention of cardiovascular disease, we modelled cholesterol ratio over time using a sample of 45,151 men (≥40 years) and 36,168 women (≥55 years) in 2006, without statin treatment or previous cardiovascular disease, from the Clinical Practice Research Datalink. Using simulation methods, we estimated numbers indicated for new statin treatment, if cholesterol was measured annually and used in the QRISK2 CVD risk calculator, using the previous 20% and newly recommended 10% thresholds. We estimate that 58% of men and 55% of women would be indicated for treatment by five years and 71% of men and 73% of women by ten years using the 20% threshold. Using the proposed threshold of 10%, 84% of men and 90% of women would be indicated for treatment by five years and 92% of men and 98% of women by ten years. The proposed change of risk threshold from 20% to 10% would result in the substantial majority of those recommended for cholesterol testing being indicated for statin treatment. Implications depend on the value of statins in those at low to medium risk, and whether there are harms. Copyright © 2014. Published by Elsevier Inc.
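    The threshold comparison at the heart of this analysis is simple to illustrate: lowering the cut-point expands the treated fraction. A toy version with synthetic risk scores standing in for QRISK2 output (the values are invented, not CPRD data):

```python
# Toy illustration of how lowering a treatment threshold expands the treated
# group. The risk values are synthetic stand-ins for QRISK2 scores, not CPRD
# data; the 10% and 20% cut-points are the guideline values discussed above.

def fraction_indicated(risks, threshold):
    """Share of people whose 10-year CVD risk meets the treatment threshold."""
    return sum(r >= threshold for r in risks) / len(risks)

risks = [0.04, 0.08, 0.11, 0.13, 0.18, 0.22, 0.27, 0.31, 0.09, 0.15]
print(fraction_indicated(risks, 0.20))  # 0.3
print(fraction_indicated(risks, 0.10))  # 0.7
```

    The paper's simulation additionally models risk drifting upward year by year, which is why the indicated fractions grow between five and ten years of annual monitoring.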

  15. Thresholds of sea-level rise rate and sea-level acceleration rate in a vulnerable coastal wetland

    NASA Astrophysics Data System (ADS)

    Wu, W.; Biber, P.; Bethel, M.

    2017-12-01

    Feedbacks among inundation, sediment trapping, and vegetation productivity help maintain coastal wetlands facing sea-level rise (SLR). However, when the SLR rate exceeds a threshold, coastal wetlands can collapse. Understanding this threshold helps address a key challenge in ecology, the nonlinear response of ecosystems to environmental change, and promotes communication between ecologists and policy makers. We studied the threshold of SLR rate and developed a new threshold of SLR acceleration rate for the sustainability of coastal wetlands, as SLR is likely to accelerate due to enhanced anthropogenic forcing. We developed a mechanistic model to simulate wetland change and derived the SLR thresholds for Grand Bay, MS, a micro-tidal estuary with limited upland freshwater and sediment input in the northern Gulf of Mexico. The new SLR acceleration-rate threshold complements the threshold of SLR rate and can help explain the temporal lag before the rapid decline of wetland area becomes evident after the SLR rate threshold is exceeded. Deriving these two thresholds depends on the temporal scale, the interaction of SLR with other environmental factors, and landscape metrics, which had not been fully accounted for before this study. The derived SLR rate thresholds range from 7.3 mm/yr to 11.9 mm/yr. The thresholds of SLR acceleration rate are 3.02×10⁻⁴ m/yr² and 9.62×10⁻⁵ m/yr² for 2050 and 2100, respectively. Based on these thresholds, predicted SLR that will adversely impact the coastal wetlands in Grand Bay by 2100 falls within the likely range of SLR under a high warming scenario (RCP8.5) and beyond the very likely range under a low warming scenario (RCP2.6 or 3), highlighting the need to avoid the high warming scenario if these marshes are to be preserved.
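    One way to relate the two kinds of threshold: under a constant acceleration a, the SLR rate r(t) = r0 + a·t eventually crosses the rate threshold. A back-of-envelope sketch follows, where the starting rate r0 is an assumed present-day value, not a number from the study; the 7.3 mm/yr threshold is the study's lower bound and 0.302 mm/yr² is its 2050 acceleration threshold converted from m/yr².

```python
# Back-of-envelope link between the two thresholds in the abstract: with a
# constant acceleration a, the SLR rate r(t) = r0 + a*t eventually crosses a
# rate threshold. r0 below is an assumed present-day rate, not a study value.

def years_until_rate_threshold(r0_mm_yr, accel_mm_yr2, threshold_mm_yr):
    """Years until a linearly accelerating SLR rate reaches the threshold."""
    if r0_mm_yr >= threshold_mm_yr:
        return 0.0
    return (threshold_mm_yr - r0_mm_yr) / accel_mm_yr2

# 3.02e-4 m/yr^2 (the study's 2050 acceleration threshold) = 0.302 mm/yr^2
t = years_until_rate_threshold(r0_mm_yr=3.5, accel_mm_yr2=0.302,
                               threshold_mm_yr=7.3)
print(round(t, 1))  # about 12.6 years under these assumed numbers
```

    This is only the crossing time of the rate itself; the study's point is that wetland area declines with a further lag after that crossing, which is what the acceleration-rate threshold helps capture.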

  16. Threshold Concepts in Finance: Student Perspectives

    ERIC Educational Resources Information Center

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  17. THRESHOLD LOGIC.

    DTIC Science & Technology

    synthesis procedures; a ’best’ method is definitely established. (2) ’Symmetry Types for Threshold Logic’ is a tutorial exposition including a careful...development of the Goto-Takahasi self-dual type ideas. (3) ’Best Threshold Gate Decisions’ reports a comparison, on the 2470 7-argument threshold ...interpretation is shown best. (4) ’Threshold Gate Networks’ reviews the previously discussed 2-algorithm in geometric terms, describes our FORTRAN

  18. Using C-Band Dual-Polarization Radar Signatures to Improve Convective Wind Forecasting at Cape Canaveral Air Force Station and NASA Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Amiot, Corey G.; Carey, Lawrence D.; Roeder, William P.; McNamara, Todd M.; Blakeslee, Richard J.

    2017-01-01

    The United States Air Force's 45th Weather Squadron (45WS) is the organization responsible for monitoring atmospheric conditions at Cape Canaveral Air Force Station and NASA Kennedy Space Center (CCAFS/KSC) and issuing warnings for hazardous weather conditions when the need arises. One such warning is issued for convective wind events, for which lead times of 30 and 60 minutes are desired for events with peak wind gusts of 35 knots or greater (i.e., Threshold-1) and 50 knots or greater (i.e., Threshold-2), respectively (Roeder et al. 2014).
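    The two warning tiers reduce to a simple mapping from forecast peak gust to tier and desired lead time; a minimal sketch follows (the function name and return format are illustrative, not 45WS conventions).

```python
# Minimal sketch of the 45WS warning tiers described above: Threshold-1
# (gusts >= 35 kt, 30-minute desired lead time) and Threshold-2 (>= 50 kt,
# 60 minutes). Function name and return format are illustrative.

def convective_wind_tier(peak_gust_kt):
    """Map a forecast peak gust (knots) to a warning tier and lead time."""
    if peak_gust_kt >= 50:
        return ("Threshold-2", 60)   # desired lead time in minutes
    if peak_gust_kt >= 35:
        return ("Threshold-1", 30)
    return (None, 0)                 # below warning criteria

print(convective_wind_tier(42))  # ('Threshold-1', 30)
print(convective_wind_tier(55))  # ('Threshold-2', 60)
```

    The hard part, of course, is not the mapping but producing the gust forecast far enough in advance, which is where the dual-polarization radar signatures come in.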

  19. Quality assurance for the clinical implementation of kilovoltage intrafraction monitoring for prostate cancer VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, J. A.; Booth, J. T.; O’Brien, R. T.

    2014-11-01

    Purpose: Kilovoltage intrafraction monitoring (KIM) is a real-time 3D tumor monitoring system for cancer radiotherapy. KIM uses the commonly available gantry-mounted x-ray imager as input, making this method potentially more widely available than dedicated real-time 3D tumor monitoring systems. KIM is being piloted in a clinical trial for prostate cancer patients treated with VMAT (NCT01742403). The purpose of this work was to develop clinical process and quality assurance (QA) practices for the clinical implementation of KIM. Methods: Informed by and adapting existing guideline documents from other real-time monitoring systems, KIM-specific QA practices were developed. The following five KIM-specific QA tests were included: (1) static localization accuracy, (2) dynamic localization accuracy, (3) treatment interruption accuracy, (4) latency measurement, and (5) clinical conditions accuracy. Tests (1)–(4) were performed using KIM to measure static and representative patient-derived prostate motion trajectories using a 3D programmable motion stage supporting an anthropomorphic phantom with implanted gold markers to represent the clinical treatment scenario. The threshold for system tolerable latency is <1 s. The tolerances for all other tests are that both the mean and standard deviation of the difference between the programmed trajectory and the measured data are <1 mm. The (5) clinical conditions accuracy test compared the KIM measured positions with those measured by kV/megavoltage (MV) triangulation from five treatment fractions acquired in a previous pilot study. Results: For the (1) static localization, (2) dynamic localization, and (3) treatment interruption accuracy tests, the mean and standard deviation of the difference are <1.0 mm. (4) The measured latency is 350 ms. (5) For the tests with previously acquired patient data, the mean and standard deviation of the difference between KIM and kV/MV triangulation are <1.0 mm. 
Conclusions: Clinical process and QA practices for the safe clinical implementation of KIM, a novel real-time monitoring system using commonly available equipment, have been developed and implemented for prostate cancer VMAT.

  20. Identifying multiple stressor controls on phytoplankton dynamics in the River Thames (UK) using high-frequency water quality data.

    PubMed

    Bowes, M J; Loewenthal, M; Read, D S; Hutchins, M G; Prudhomme, C; Armstrong, L K; Harman, S A; Wickham, H D; Gozzard, E; Carvalho, L

    2016-11-01

    River phytoplankton blooms can pose a serious risk to water quality and the structure and function of aquatic ecosystems. Developing a greater understanding of the physical and chemical controls on the timing, magnitude and duration of blooms is essential for the effective management of phytoplankton development. Five years of weekly water quality monitoring data along the River Thames, southern England, were combined with hourly chlorophyll concentration (a proxy for phytoplankton biomass), flow, temperature and daily sunlight data from the mid-Thames. Weekly chlorophyll data were of insufficient temporal resolution to identify the causes of short-term variations in phytoplankton biomass. However, hourly chlorophyll data enabled identification of thresholds in water temperature (between 9 and 19°C) and flow (<30 m³ s⁻¹) that explained the development of phytoplankton populations. Analysis showed that periods of high phytoplankton biomass and growth rate occurred only when flow and temperature were within these thresholds and coincided with periods of long sunshine duration, indicating multiple stressor controls. Nutrient concentrations appeared to have no impact on the timing or magnitude of phytoplankton bloom development, but severe depletion of dissolved phosphorus and silicon during periods of high phytoplankton biomass may have contributed to some bloom collapses through nutrient limitation. This study indicates that for nutrient-enriched rivers such as the Thames, manipulating residence time (through removing impoundments) and light/temperature (by increasing riparian tree shading) may offer more realistic solutions than reducing phosphorus concentrations for controlling excessive phytoplankton biomass. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
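    The flow and temperature thresholds identified above amount to a simple window test on each hourly record; a sketch with invented readings follows (sunshine duration, the third control, is omitted for brevity).

```python
# Sketch of the multiple-stressor bloom window identified above: high growth
# was observed only when water temperature sat between its two thresholds and
# flow was below its threshold. The hourly records below are invented.

TEMP_RANGE_C = (9.0, 19.0)   # water-temperature thresholds from the study
FLOW_MAX = 30.0              # flow threshold, m^3 s^-1

def bloom_favorable(temp_c, flow_m3s):
    """True when both the temperature and flow conditions permit growth."""
    lo, hi = TEMP_RANGE_C
    return lo <= temp_c <= hi and flow_m3s < FLOW_MAX

records = [(7.5, 12.0), (14.0, 22.0), (16.5, 45.0), (18.0, 28.0)]
favorable = [bloom_favorable(t, q) for t, q in records]
print(favorable)  # [False, True, False, True]
```

    Run over a full hourly series, a test like this marks the windows in which long sunshine duration could then tip the river into a bloom.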

  1. A single CD4 test with 250 cells/mm3 threshold predicts viral suppression in HIV-infected adults failing first-line therapy by clinical criteria.

    PubMed

    Gilks, Charles F; Walker, A Sarah; Munderi, Paula; Kityo, Cissy; Reid, Andrew; Katabira, Elly; Goodall, Ruth L; Grosskurth, Heiner; Mugyenyi, Peter; Hakim, James; Gibb, Diana M

    2013-01-01

    In low-income countries, viral load (VL) monitoring of antiretroviral therapy (ART) is rarely available in the public sector for HIV-infected adults or children. Using clinical failure alone to identify first-line ART failure and trigger regimen switch may result in unnecessary use of costly second-line therapy. Our objective was to identify CD4 threshold values to confirm clinically-determined ART failure when VL is unavailable. 3316 HIV-infected Ugandan/Zimbabwean adults were randomised to first-line ART with Clinically-Driven (CDM, CD4s measured but blinded) or routine Laboratory and Clinical Monitoring (LCM, 12-weekly CD4s) in the DART trial. CD4 at switch and ART failure criteria (new/recurrent WHO 4, single/multiple WHO 3 event; LCM: CD4<100 cells/mm3) were reviewed in 361 LCM, 314 CDM participants who switched over median 5 years follow-up. Retrospective VLs were available in 368 (55%) participants. Overall, 265/361 (73%) LCM participants failed with CD4<100 cells/mm3; only 7 (2%) switched with CD4≥250 cells/mm3, four switches triggered by WHO events. Without CD4 monitoring, 207/314 (66%) CDM participants failed with WHO 4 events, and 77(25%)/30(10%) with single/multiple WHO 3 events. Failure/switching with single WHO 3 events was more likely with CD4≥250 cells/mm3 (28/77; 36%) (p = 0.0002). CD4 monitoring reduced switching with viral suppression: 23/187 (12%) LCM versus 49/181 (27%) CDM had VL<400 copies/ml at failure/switch (p<0.0001). Amongst CDM participants with CD4<250 cells/mm3 only 11/133 (8%) had VL<400 copies/ml, compared with 38/48 (79%) with CD4≥250 cells/mm3 (p<0.0001). Multiple, but not single, WHO 3 events predicted first-line ART failure. A CD4 threshold 'tiebreaker' of ≥250 cells/mm3 for clinically-monitored patients failing first-line could identify ∼80% with VL<400 copies/ml, who are unlikely to benefit from second-line. 
Targeting CD4s to single WHO stage 3 'clinical failures' would particularly avoid premature, costly switch to second-line ART.
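    The proposed tiebreaker can be written as a small decision rule. The event labels are illustrative; the CD4 cut-point of 250 cells/mm3 is the one proposed above, applied only to single WHO stage 3 events, the case the data show is ambiguous.

```python
# Sketch of the CD4 "tiebreaker" rule proposed above: when a patient meets a
# clinical failure criterion but VL is unavailable, a single CD4 >= 250
# cells/mm3 argues against switching (viral suppression is likely). The
# event-label strings and return convention are illustrative.

CD4_TIEBREAKER = 250  # cells/mm3

def switch_to_second_line(clinical_event, cd4_cells_mm3):
    """Decide on switching when only clinical criteria and one CD4 exist."""
    if clinical_event in ("WHO4", "multiple_WHO3"):
        return True                            # strong predictors of failure
    if clinical_event == "single_WHO3":
        return cd4_cells_mm3 < CD4_TIEBREAKER  # low CD4 breaks the tie
    return False                               # no failure criterion met

print(switch_to_second_line("single_WHO3", 310))  # False: likely suppressed
print(switch_to_second_line("single_WHO3", 120))  # True
print(switch_to_second_line("WHO4", 400))         # True
```

    Per the abstract, applying the tiebreaker to single WHO 3 events would spare roughly 80% of virally suppressed patients an unnecessary, costly switch.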

  2. A Single CD4 Test with 250 Cells/Mm3 Threshold Predicts Viral Suppression in HIV-Infected Adults Failing First-Line Therapy by Clinical Criteria

    PubMed Central

    Munderi, Paula; Kityo, Cissy; Reid, Andrew; Katabira, Elly; Goodall, Ruth L.; Grosskurth, Heiner; Mugyenyi, Peter; Hakim, James; Gibb, Diana M.

    2013-01-01

    Background In low-income countries, viral load (VL) monitoring of antiretroviral therapy (ART) is rarely available in the public sector for HIV-infected adults or children. Using clinical failure alone to identify first-line ART failure and trigger regimen switch may result in unnecessary use of costly second-line therapy. Our objective was to identify CD4 threshold values to confirm clinically-determined ART failure when VL is unavailable. Methods 3316 HIV-infected Ugandan/Zimbabwean adults were randomised to first-line ART with Clinically-Driven (CDM, CD4s measured but blinded) or routine Laboratory and Clinical Monitoring (LCM, 12-weekly CD4s) in the DART trial. CD4 at switch and ART failure criteria (new/recurrent WHO 4, single/multiple WHO 3 event; LCM: CD4<100 cells/mm3) were reviewed in 361 LCM, 314 CDM participants who switched over median 5 years follow-up. Retrospective VLs were available in 368 (55%) participants. Results Overall, 265/361 (73%) LCM participants failed with CD4<100 cells/mm3; only 7 (2%) switched with CD4≥250 cells/mm3, four switches triggered by WHO events. Without CD4 monitoring, 207/314 (66%) CDM participants failed with WHO 4 events, and 77(25%)/30(10%) with single/multiple WHO 3 events. Failure/switching with single WHO 3 events was more likely with CD4≥250 cells/mm3 (28/77; 36%) (p = 0.0002). CD4 monitoring reduced switching with viral suppression: 23/187 (12%) LCM versus 49/181 (27%) CDM had VL<400 copies/ml at failure/switch (p<0.0001). Amongst CDM participants with CD4<250 cells/mm3 only 11/133 (8%) had VL<400 copies/ml, compared with 38/48 (79%) with CD4≥250 cells/mm3 (p<0.0001). Conclusion Multiple, but not single, WHO 3 events predicted first-line ART failure. A CD4 threshold ‘tiebreaker’ of ≥250 cells/mm3 for clinically-monitored patients failing first-line could identify ∼80% with VL<400 copies/ml, who are unlikely to benefit from second-line. 
Targeting CD4s to single WHO stage 3 ‘clinical failures’ would particularly avoid premature, costly switch to second-line ART. PMID:23437399

  3. Development and Validation of a Portable Hearing Self-Testing System Based on a Notebook Personal Computer.

    PubMed

    Liu, Yan; Yang, Dong; Xiong, Fen; Yu, Lan; Ji, Fei; Wang, Qiu-Ju

    2015-09-01

    Hearing loss affects more than 27 million people in mainland China. It would be helpful to develop a portable and self-testing audiometer for the timely detection of hearing loss so that the optimal clinical therapeutic schedule can be determined. The objective of this study was to develop a software-based hearing self-testing system. The software-based self-testing system consisted of a notebook computer, an external sound card, and a pair of 10-Ω insert earphones. The system could be used to test the hearing thresholds by individuals themselves in an interactive manner using software. The reliability and validity of the system at octave frequencies of 0.25 to 8.0 kHz were analyzed in three series of experiments. Thirty-seven normal-hearing participants (74 ears) were enrolled in experiment 1. Forty individuals (80 ears) with sensorineural hearing loss (SNHL) participated in experiment 2. Thirteen normal-hearing participants (26 ears) and 37 participants (74 ears) with SNHL were enrolled in experiment 3. Each participant was enrolled in only one of the three experiments. In all experiments, pure-tone audiometry in a sound insulation room (standard test) was regarded as the gold standard. SPSS for Windows, version 17.0, was used for statistical analysis. The paired t-test was used to compare the hearing thresholds between the standard test and software-based self-testing (self-test) in experiments 1 and 2. In experiment 3 (main study), one-way analysis of variance and post hoc comparisons were used to compare the hearing thresholds among the standard test and two rounds of the self-test. Linear correlation analysis was carried out for the self-tests performed twice. The concordance between the standard test and the self-test was analyzed using the kappa method. p < 0.05 was considered statistically significant. 
Experiments 1 and 2: The hearing thresholds determined by the two methods were not significantly different at frequencies of 250, 500, or 8000 Hz (p > 0.05) but were significantly different at frequencies of 1000, 2000, and 4000 Hz (p < 0.05), except for 1000 Hz in the right ear in experiment 2. Experiment 3: The hearing thresholds determined by the standard test and self-tests repeated twice were not significantly different at any frequency (p > 0.05). The overall sensitivity of the self-test method was 97.6%, and the specificity was 98.3%. The sensitivity was 97.6% and the specificity was 97% for the patients with SNHL. The self-test had significant concordance with the standard test (kappa value = 0.848, p < 0.001). This portable hearing self-testing system based on a notebook personal computer is a reliable and sensitive method for hearing threshold assessment and monitoring. American Academy of Audiology.
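    The concordance statistic reported above (kappa = 0.848) is computed from a 2×2 table of paired pass/fail outcomes. A sketch of Cohen's kappa with invented counts, not the study's data:

```python
# Sketch of the concordance statistic reported above. Cohen's kappa compares
# observed agreement between two pass/fail classifications (standard test vs
# self-test) with the agreement expected by chance. The 2x2 counts are made up.

def cohens_kappa(both_pos, pos_neg, neg_pos, both_neg):
    """Kappa from a 2x2 table of paired classifications."""
    n = both_pos + pos_neg + neg_pos + both_neg
    p_observed = (both_pos + both_neg) / n
    p_yes = ((both_pos + pos_neg) / n) * ((both_pos + neg_pos) / n)
    p_no = ((neg_pos + both_neg) / n) * ((pos_neg + both_neg) / n)
    p_chance = p_yes + p_no
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical: 40 ears impaired on both tests, 2 discordant each way,
# 56 normal on both.
print(round(cohens_kappa(40, 2, 2, 56), 3))  # 0.918
```

    Values above about 0.8 are conventionally read as near-perfect agreement, which is how the study's 0.848 supports using the self-test for threshold monitoring.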

  4. Turbidity Threshold sampling in watershed research

    Treesearch

    Rand Eads; Jack Lewis

    2003-01-01

    Abstract - When monitoring suspended sediment for watershed research, reliable and accurate results may be a higher priority than in other settings. Timing and frequency of data collection are the most important factors influencing the accuracy of suspended sediment load estimates, and, in most watersheds, suspended sediment transport is dominated by a few, large...

  5. Using Johnson Distribution for Automatic Threshold Setting in Wind Turbine Condition Monitoring System

    DTIC Science & Technology

    2014-12-23

    what the data indicate the most appropriate family is, such as in the work done by (Marhadi, Venkataraman, & Pai, 2012). However, having the data...diagnostic engineering management (COMADEM). Helsinki, Finland. Marhadi, K., Venkataraman, S., & Pai, S. S. (2012). Quantifying uncertainty in

  6. Perspiration Thresholds and Secure Suspension for Lower Limb Amputees in Demanding Environments

    DTIC Science & Technology

    2015-10-01

    VAPSHCS) site, fabricating custom, moisture-wicking textile sock with a proximal elastomeric seal, fabricating prosthetic sockets, fabricating electronic...Aims: This research has two specific aims: (1) determine if lower limb amputees are willing to use smart activity monitors as part of their daily life

  7. Vortex Advisory System Safety Analysis : Volume III, Summary of Laser Data Collection and Analysis

    DOT National Transportation Integrated Search

    1979-08-01

    A Laser-Doppler velocimeter (LDV) was used to monitor the wake vortices shed by 5300 landing aircraft at a point 10,000 feet from the runway threshold. The data were collected to verify the analysis in Volume I of the safety of decreasing interarriva...

  8. Using generalized additive modeling to empirically identify thresholds within the ITERS in relation to toddlers' cognitive development.

    PubMed

    Setodji, Claude Messan; Le, Vi-Nhuan; Schaack, Diana

    2013-04-01

    Research linking high-quality child care programs and children's cognitive development has contributed to the growing popularity of child care quality benchmarking efforts such as quality rating and improvement systems (QRIS). Consequently, there has been an increased interest in and a need for approaches to identifying thresholds, or cutpoints, in the child care quality measures used in these benchmarking efforts that differentiate between different levels of children's cognitive functioning. To date, research has provided little guidance to policymakers as to where these thresholds should be set. Using the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B) data set, this study explores the use of generalized additive modeling (GAM) as a method of identifying thresholds on the Infant/Toddler Environment Rating Scale (ITERS) in relation to toddlers' performance on the Mental Development subscale of the Bayley Scales of Infant Development (the Bayley Mental Development Scale Short Form-Research Edition, or BMDSF-R). The present findings suggest that simple linear models do not always correctly depict the relationships between ITERS scores and BMDSF-R scores and that GAM-derived thresholds were more effective at differentiating among children's performance levels on the BMDSF-R. Additionally, the present findings suggest that there is a minimum threshold on the ITERS that must be exceeded before significant improvements in children's cognitive development can be expected. There may also be a ceiling threshold on the ITERS, such that beyond a certain level, only marginal increases in children's BMDSF-R scores are observed. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  9. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Chan, F; Newman, B

    2014-06-15

    Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface that allows analysis of the scanning protocols together with the actual dose estimates, and comparison of the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to the series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols, and series. The key functions of the tool include: statistics of CTDI, DLP, and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam's dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists, and administrators first-hand, near-real-time, enterprise-wide knowledge of CT dose levels for different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation and knowledge of important dose baselines and thresholds.
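    The user-set threshold monitoring described above amounts to flagging any exam whose dose metrics exceed configured limits. A minimal sketch, assuming hypothetical field names and threshold values (not the tool's actual schema or ACR/AAPM reference values):

```python
# Hypothetical dose-metric limits; real values would come from
# national (ACR, AAPM) or internal reference data.
THRESHOLDS = {"CTDIvol_mGy": 25.0, "DLP_mGycm": 1000.0}

def flag_exams(exams, thresholds=THRESHOLDS):
    """Return (exam_id, exceeded-metric list) for exams over any limit."""
    flagged = []
    for exam in exams:
        exceeded = [name for name, limit in thresholds.items()
                    if exam.get(name, 0.0) > limit]
        if exceeded:
            flagged.append((exam["exam_id"], exceeded))
    return flagged

exams = [
    {"exam_id": "A1", "CTDIvol_mGy": 18.2, "DLP_mGycm": 640.0},
    {"exam_id": "A2", "CTDIvol_mGy": 31.5, "DLP_mGycm": 1210.0},
]
flags = flag_exams(exams)  # only A2 exceeds its limits
```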

  10. Cleaning Hospital Room Surfaces to Prevent Health Care-Associated Infections: A Technical Brief.

    PubMed

    Han, Jennifer H; Sullivan, Nancy; Leas, Brian F; Pegues, David A; Kaczmarek, Janice L; Umscheid, Craig A

    2015-10-20

    The cleaning of hard surfaces in hospital rooms is critical for reducing health care-associated infections. This review describes the evidence examining current methods of cleaning, disinfecting, and monitoring cleanliness of patient rooms, as well as contextual factors that may affect implementation and effectiveness. Key informants were interviewed, and a systematic search for publications since 1990 was done with the use of several bibliographic and gray literature resources. Studies examining surface contamination, colonization, or infection with Clostridium difficile, methicillin-resistant Staphylococcus aureus, or vancomycin-resistant enterococci were included. Eighty studies were identified: 76 primary studies and 4 systematic reviews. Forty-nine studies examined cleaning methods, 14 evaluated monitoring strategies, and 17 addressed challenges or facilitators to implementation. Only 5 studies were randomized, controlled trials, and surface contamination was the most commonly assessed outcome. Comparative effectiveness studies of disinfecting methods and monitoring strategies were uncommon. Future research should evaluate and compare newly emerging strategies, such as self-disinfecting coatings for disinfecting and adenosine triphosphate and ultraviolet/fluorescent surface markers for monitoring. Studies should also assess patient-centered outcomes, such as infection, when possible. Other challenges include identifying high-touch surfaces that confer the greatest risk for pathogen transmission; developing standard thresholds for defining cleanliness; and using methods to adjust for confounders, such as hand hygiene, when examining the effect of disinfecting methods.

  11. Cleaning Hospital Room Surfaces to Prevent Health Care–Associated Infections

    PubMed Central

    Han, Jennifer H.; Sullivan, Nancy; Leas, Brian F.; Pegues, David A.; Kaczmarek, Janice L.; Umscheid, Craig A.

    2015-01-01

    The cleaning of hard surfaces in hospital rooms is critical for reducing health care-associated infections. This review describes the evidence examining current methods of cleaning, disinfecting, and monitoring cleanliness of patient rooms, as well as contextual factors that may affect implementation and effectiveness. Key informants were interviewed, and a systematic search for publications since 1990 was done with the use of several bibliographic and gray literature resources. Studies examining surface contamination, colonization, or infection with Clostridium difficile, methicillin-resistant Staphylococcus aureus, or vancomycin-resistant enterococci were included. Eighty studies were identified: 76 primary studies and 4 systematic reviews. Forty-nine studies examined cleaning methods, 14 evaluated monitoring strategies, and 17 addressed challenges or facilitators to implementation. Only 5 studies were randomized, controlled trials, and surface contamination was the most commonly assessed outcome. Comparative effectiveness studies of disinfecting methods and monitoring strategies were uncommon. Future research should evaluate and compare newly emerging strategies, such as self-disinfecting coatings for disinfecting and adenosine triphosphate and ultraviolet/fluorescent surface markers for monitoring. Studies should also assess patient-centered outcomes, such as infection, when possible. Other challenges include identifying high-touch surfaces that confer the greatest risk for pathogen transmission; developing standard thresholds for defining cleanliness; and using methods to adjust for confounders, such as hand hygiene, when examining the effect of disinfecting methods. PMID:26258903

  12. Ground-water vulnerability to nitrate contamination in the mid-atlantic region

    USGS Publications Warehouse

    Greene, Earl A.; LaMotte, Andrew E.; Cullinan, Kerri-Ann; Smith, Elizabeth R.

    2005-01-01

    The U.S. Environmental Protection Agency's (USEPA) Regional Vulnerability Assessment (ReVA) Program has developed a set of statistical tools to support regional-scale, integrated ecological risk-assessment studies. One of these tools, developed by the U.S. Geological Survey (USGS), is used with available water-quality data obtained from USGS National Water-Quality Assessment (NAWQA) and other studies in association with land cover, geology, soils, and other geographic data to develop logistic-regression equations that predict the vulnerability of ground water to nitrate concentrations exceeding specified thresholds in the Mid-Atlantic Region. The models were developed and applied to produce spatial probability maps showing the likelihood of elevated concentrations of nitrate in the region. These maps can be used to identify areas that currently are at risk and help identify areas where ground water has been affected by human activities. This information can be used by regional and local water managers to protect water supplies and identify land-use planning solutions and monitoring programs in these vulnerable areas.
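    A fitted logistic-regression equation of this kind maps predictor values to a threshold-exceedance probability through the logistic link. A minimal sketch with hypothetical coefficients and predictor names (not the USGS model):

```python
import math

# Hypothetical fitted coefficients; the real model's predictors and
# values would come from the NAWQA-based regression.
COEFFS = {"intercept": -2.0, "pct_agriculture": 0.04, "well_depth_m": -0.01}

def exceedance_probability(pct_agriculture, well_depth_m, coeffs=COEFFS):
    """P(nitrate > threshold) via the logistic link on the linear predictor."""
    z = (coeffs["intercept"]
         + coeffs["pct_agriculture"] * pct_agriculture
         + coeffs["well_depth_m"] * well_depth_m)
    return 1.0 / (1.0 + math.exp(-z))

# A heavily agricultural area with a shallow well scores a higher probability.
p = exceedance_probability(pct_agriculture=70.0, well_depth_m=30.0)
```

    Mapping this probability over a grid of predictor values is what produces the spatial probability maps described above.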

  13. International Monitoring System Correlation Detection at the North Korean Nuclear Test Site at Punggye-ri with Insights from the Source Physics Experiment

    DOE PAGES

    Ford, Sean R.; Walter, William R.

    2015-05-06

    Seismic waveform correlation offers the prospect of greatly reducing event detection thresholds when compared with more conventional processing methods. Correlation is applicable for seismic events that in some sense repeat, that is, they have very similar waveforms. A number of recent studies have shown that correlated seismic signals may form a significant fraction of seismicity at regional distances. For the particular case of multiple nuclear explosions at the same test site, regional distance correlation also allows very precise relative location measurements and could offer the potential to lower thresholds when multiple events exist. Using the Comprehensive Nuclear-Test-Ban Treaty (CTBT) International Monitoring System (IMS) seismic array at Matsushiro, Japan (MJAR), Gibbons and Ringdal (2012) were able to create a multichannel correlation detector with a very low false alarm rate and a threshold below magnitude 3.0. They did this using the 2006 or 2009 Democratic People’s Republic of Korea (DPRK) nuclear explosion as a template to search through a data stream from the same station to find a match via waveform correlation. In this paper, we extend the work of Gibbons and Ringdal (2012) and measure the correlation detection threshold at several other IMS arrays. We use this to address three main points. First, we show the IMS array station at Mina, Nevada (NVAR), which is closest to the Nevada National Security Site (NNSS), is able to detect a chemical explosion that is well under 1 ton with the right template. Second, we examine the two IMS arrays closest to the North Korean (DPRK) test site (at Ussuriysk, Russian Federation [USRK] and Wonju, Republic of Korea [KSRS]) to show that similarly low thresholds are possible when the right templates exist. We also extend the work of Schaff et al. (2012) and measure the correlation detection threshold at the nearest Global Seismic Network (GSN) three-component station (MDJ) at Mudanjiang, Heilongjiang Province, China, from the New China Digital Seismograph Network (IC). To conclude, we use these results to explore the recent claim by Zhang and Wen (2015) that the DPRK conducted “…a low-yield nuclear test…” on 12 May 2010.

  14. Improving predictive capabilities of environmental change with GLOBE data

    NASA Astrophysics Data System (ADS)

    Robin, Jessica Hill

    This dissertation addresses two applications of the Normalized Difference Vegetation Index (NDVI) essential for predicting environmental changes. The first study focuses on whether NDVI can improve model simulations of evapotranspiration for temperate Northern (>35°) regions. The second study focuses on whether NDVI can detect phenological changes in start of season (SOS) for high Northern (>60°) environments. The overall objectives of this research were to (1) develop a methodology for utilizing GLOBE data in NDVI research; and (2) provide a critical analysis of NDVI as a long-term monitoring tool for environmental change. GLOBE is an international partnership network of K-12 students, teachers, and scientists working together to study and understand the global environment. The first study utilized data collected by one GLOBE school in Greenville, Pennsylvania and the second utilized phenology observations made by GLOBE students in Alaska. Results from the first study showed NDVI could predict transpiration periods for environments like Greenville, Pennsylvania. In phenological terms, these environments have three distinct periods (QI, QII, and QIII). QI reflects onset of the growing season (mid March--mid May) when vegetation is greening up (NDVI < 0.60) and transpiration is less than 2mm/day. QII reflects end of the growing season (mid September--October) when vegetation is greening down and transpiration is decreasing. QIII reflects height of the growing season (mid May--mid September) when transpiration rates average between 2 and 5 mm per day and NDVI is at its maximum (>0.60). Results from the second study showed that a climate threshold of 153 +/- 22 growing degree days was a better predictor of SOS for Fairbanks than an NDVI threshold applied to temporal AVHRR and MODIS datasets.
Accumulated growing degree days captured the interannual variability of SOS better than the NDVI threshold and most closely resembled actual SOS observations made by GLOBE students. Overall, biweekly composites and effects of clouds, snow, and conifers limit the ability of NDVI to monitor phenological changes in Alaska. Both studies did show that GLOBE data provides an important source of input and validation information for NDVI research.

  15. Performance evaluation of the national early warning system for shallow landslides in Norway

    NASA Astrophysics Data System (ADS)

    Dahl, Mads-Peter; Piciullo, Luca; Devoli, Graziella; Colleuille, Hervé; Calvello, Michele

    2017-04-01

    As a consequence of the increased number of rainfall- and snowmelt-induced landslides (debris flows, debris slides, debris avalanches and slush flows) occurring in Norway, a national landslide early warning system (EWS) has been developed for monitoring and forecasting the hydro-meteorological conditions potentially responsible for triggering slope failures. The system, operational since 2013, is managed by the Norwegian Water Resources and Energy Directorate (NVE) and has been designed in cooperation with the Norwegian Public Road Administration (SVV), the Norwegian National Rail Administration (JBV) and the Norwegian Meteorological Institute (MET). Decision-making in the EWS is based upon hazard threshold levels, hydro-meteorological and real-time landslide observations, as well as landslide inventory and susceptibility maps. Hazard threshold levels have been obtained through statistical analyses of historical landslides and modelled hydro-meteorological parameters. Daily hydro-meteorological conditions such as rainfall, snowmelt, runoff, soil saturation, groundwater level and frost depth have been derived from a distributed version of the hydrological HBV model. Two different landslide susceptibility maps are used as supporting data in deciding daily warning levels. Daily alerts are issued throughout the country for variable warning zones. Warnings are issued once per day for the following 3 days, with the possibility of an update later during the day according to the information gathered from the monitored variables. The performance of the EWS has been evaluated by applying the EDuMaP method. In particular, the performance of warnings issued in western Norway in the period 2013-2014 has been evaluated using two different landslide datasets. The best performance is obtained for the smaller and more accurate dataset. Different performance results may be observed as a function of the landslide density criterion Lden(k) (i.e., the thresholds considered to differentiate among classes of landslide events) used as an input parameter within the EDuMaP method. To investigate this issue, a parametric analysis has been conducted; its results show clear differences among the computed performances when absolute or relative landslide density criteria are considered.

  16. The contribution of volunteer-based monitoring data to the assessment of harmful phytoplankton blooms in Brazilian urban streams.

    PubMed

    Cunha, Davi Gasparini Fernandes; Casali, Simone Pereira; de Falco, Patrícia Bortoletto; Thornhill, Ian; Loiselle, Steven Arthur

    2017-04-15

    Urban streams are vulnerable to a range of impacts, leading to the impairment of ecosystem services. However, studies on phytoplankton growth in tropical lotic systems are still limited. Citizen science approaches use trained volunteers to collect environmental data. We combined data on urban streams collected by volunteers with data obtained by professional scientists to identify potential drivers of the phytoplankton community and determine thresholds for Cyanobacteria development. We combined datasets (n=117) on water quality and environmental observations in 64 Brazilian urban streams with paired data on phytoplankton. Sampling activities encompassed dry (July 2013 and July 2015) and warm (February and November 2014) seasons. Volunteers quantified phosphate (PO₄³⁻), nitrate (NO₃⁻) and turbidity in each stream using colorimetric and optical methods and recorded environmental conditions in the immediate surroundings of the sites through visual observations. We used non-parametric statistics to identify correlations among nutrients, turbidity and phytoplankton. We also looked for thresholds with respect to high Cyanobacteria abundance (>50,000 cells/mL). The streams were characterized by relatively high nutrient concentrations (PO₄³⁻: 0.11 mg/L; NO₃⁻: 2.6 mg/L) and turbidity (49 NTU). Phytoplankton densities reached 189,000 cells/mL, mainly potentially toxic Cyanobacteria species. Moderate but significant (p<0.05) correlations were observed between phytoplankton density and turbidity (ρ=0.338, Spearman) and PO₄³⁻ (ρ=0.292), but not with NO₃⁻. Other important variables (river flow, temperature and light) were not assessed. Volunteers' observations covaried with phytoplankton density (p<0.05, Kruskal-Wallis), positively with increasing number of pollution sources and negatively with presence of vegetation in the riparian zone. Our results indicate that a threshold for PO₄³⁻ (0.11 mg/L) can be used to separate systems with high Cyanobacteria density.
The number of pollution sources provided a good indicator of waterbodies with potential cyanobacteria problems. Our findings reinforced the need for nutrient abatement and restoration of local streams and highlighted the benefits of volunteer-based monitoring to support decision-making. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Health diagnosis of arch bridge suspender by acoustic emission technique

    NASA Astrophysics Data System (ADS)

    Li, Dongsheng; Ou, Jinping

    2007-01-01

    Conventional non-destructive methods cannot dynamically monitor the damage levels and damage types of suspenders, so the acoustic emission (AE) technique is proposed to monitor their activity. Valid signals are identified from the relationship between rise time and duration. Ambient noise is eliminated by using a floating threshold value and placing a guard sensor. The damage level of the cement mortar and steel strand is analyzed by the AE parameter method, and damage types are judged by waveform analysis. Based on these methods, all the suspenders of the Sichuan Ebian Dadu River arch bridge have been monitored using AE techniques. The monitoring results show that AE signal amplitude, energy, and counts can visually display the suspenders' damage levels, while differences in waveform and frequency range indicate different damage types. The testing results coincide well with the practical situation.

  18. Assessment of continuous acoustic respiratory rate monitoring as an addition to a pulse oximetry-based patient surveillance system.

    PubMed

    McGrath, Susan P; Pyke, Joshua; Taenzer, Andreas H

    2017-06-01

    Technology advances make it possible to consider continuous acoustic respiratory rate monitoring as an integral component of physiologic surveillance systems. This study explores technical and logistical aspects of augmenting pulse oximetry-based patient surveillance systems with continuous respiratory rate monitoring and offers some insight into the impact on patient deterioration detection that may result. Acoustic respiratory rate sensors were introduced to a general care pulse oximetry-based surveillance system with respiratory rate alarms deactivated. Simulation was used after 4324 patient days to determine appropriate alarm thresholds for respiratory rate, which were then activated. Data were collected for an additional 4382 patient days. Physiologic parameters, alarm data, sensor utilization and patient/staff feedback were collected throughout the study and analyzed. No notable technical or workflow issues were observed. Sensor utilization was 57 %, with patient refusal leading reasons for nonuse (22.7 %). With respiratory rate alarm thresholds set to 6 and 40 breaths/min., the majority of nurse pager clinical notifications were triggered by low oxygen saturation values (43 %), followed by low respiratory rate values (21 %) and low pulse rate values (13 %). Mean respiratory rate collected was 16.6 ± 3.8 breaths/min. The vast majority (82 %) of low oxygen saturation states coincided with normal respiration rates of 12-20 breaths/min. Continuous respiratory rate monitoring can be successfully added to a pulse oximetry-based surveillance system without significant technical, logistical or workflow issues and is moderately well-tolerated by patients. Respiratory rate sensor alarms did not significantly impact overall system alarm burden. Respiratory rate and oxygen saturation distributions suggest adding continuous respiratory rate monitoring to a pulse oximetry-based surveillance system may not significantly improve patient deterioration detection.
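    Threshold-based clinical notification of this kind reduces to checking each monitored parameter against its limits. A minimal sketch; only the respiratory-rate limits (6 and 40 breaths/min) come from the study, while the SpO2 and pulse-rate limits are illustrative assumptions:

```python
# (low, high) alarm limits per parameter. Respiratory-rate limits are the
# study's; the others are hypothetical illustrations.
LIMITS = {
    "spo2_pct": (88, 100),
    "resp_rate_bpm": (6, 40),
    "pulse_bpm": (50, 140),
}

def notifications(sample, limits=LIMITS):
    """Return (parameter, 'low'|'high') pairs that breach their limits."""
    alerts = []
    for name, (low, high) in limits.items():
        value = sample[name]
        if value < low:
            alerts.append((name, "low"))
        elif value > high:
            alerts.append((name, "high"))
    return alerts

# Low oxygen saturation and low pulse rate trigger notifications;
# a respiratory rate of 18 breaths/min is within limits.
alerts = notifications({"spo2_pct": 85, "resp_rate_bpm": 18, "pulse_bpm": 47})
```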

  19. Temperature dataloggers as stove use monitors (SUMs): Field methods and signal analysis

    PubMed Central

    Ruiz-Mercado, Ilse; Canuz, Eduardo; Smith, Kirk R.

    2013-01-01

    We report the field methodology of a 32-month monitoring study with temperature dataloggers as Stove Use Monitors (SUMs) to quantify usage of biomass cookstoves in 80 households of rural Guatemala. The SUMs were deployed in two stove types: a well-operating chimney cookstove and the traditional open cookfire. We recorded a total of 31,112 days from all chimney cookstoves, with a 10% data loss rate. To count meals and determine daily use of the stoves we implemented a peak selection algorithm based on the instantaneous derivatives and the statistical long-term behavior of the stove and ambient temperature signals. Positive peaks with onset and decay slopes exceeding predefined thresholds were identified as “fueling events”, the minimum unit of stove use. Adjacent fueling events detected within a fixed-time window were clustered into single “cooking events” or “meals”. The observed population means were 89.4% of days in use (across all cookstoves and days monitored), 2.44 meals per day, and 2.98 fueling events per day. We found that at this study site a single temperature threshold from the annual distribution of daily ambient temperatures was sufficient to differentiate days of use, with 0.97 sensitivity and 0.95 specificity compared to the peak selection algorithm. With adequate placement, standardized data collection protocols, and careful data management, the SUMs can provide objective stove-use data with resolution, accuracy, and level of detail not possible before. The SUMs enable unobtrusive monitoring of stove-use behavior and its systematic evaluation alongside stove performance parameters of air pollution, fuel consumption, and climate-altering emissions. PMID:25225456
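    The two-stage logic described above (slope-thresholded "fueling events" clustered into "cooking events") can be sketched as follows; the slope threshold, window length, and temperature series are illustrative, not the study's values.

```python
SLOPE_THRESHOLD = 2.0   # degrees C per sample; hypothetical onset threshold
CLUSTER_WINDOW = 3      # samples; adjacent events closer than this merge

def fueling_events(temps, slope_threshold=SLOPE_THRESHOLD):
    """Indices where the instantaneous temperature rise exceeds the threshold."""
    return [i for i in range(1, len(temps))
            if temps[i] - temps[i - 1] > slope_threshold]

def cooking_events(event_indices, window=CLUSTER_WINDOW):
    """Merge fueling events within `window` samples into single meals."""
    meals = []
    for i in event_indices:
        if meals and i - meals[-1][-1] <= window:
            meals[-1].append(i)   # extend the current meal
        else:
            meals.append([i])     # start a new meal
    return meals

# Toy temperature trace: two heating episodes over ambient temperature.
temps = [20, 21, 26, 30, 31, 30, 29, 28, 35, 36, 35]
events = fueling_events(temps)   # sharp rises
meals = cooking_events(events)   # rises clustered into meals
```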

  20. A dense microseismic monitoring network in Korea for uncovering relationship between seismic activity and neotectonic features

    NASA Astrophysics Data System (ADS)

    Kang, T.; Lee, J. M.; Kim, W.; Jo, B. G.; Chung, T.; Choi, S.

    2012-12-01

    A few tens of surface traces indicating Quaternary movements have been found in the southeastern part of the Korean Peninsula. Following both the geological and engineering definitions, these features are classified as "active" (in geology) or "capable" (in engineering) faults. On the other hand, the present-day seismicity of the region over the past couple of thousand years is indistinguishable, on the whole, from that of the rest of the Korean Peninsula. It is therefore of great interest whether the present seismic activity is related to the neotectonic features or not. Neither conclusion can be drawn with confidence given the present state of the seismic monitoring network in the region. Interest in monitoring seismicity with improved observational resolution and a lower event-detection threshold has thus grown with the many observations of Quaternary faults. We installed a remote, wireless seismograph network composed of 20 stations with an average spacing of 10 km. Each station is equipped with a three-component Trillium Compact seismometer and a Taurus digitizer. Instrumentation and analysis advancements are now offering better tools for this monitoring. This network is scheduled to operate for about one and a half years. In spite of the relatively short observation period, we expect the high density of the network to enable monitoring of seismic events with a much lower magnitude threshold than the preexisting seismic network in the region. Following the Gutenberg-Richter relationship, events of low magnitude are exponentially more numerous than those of high magnitude, so many microseismic events may reveal the behavior of their causative faults, if any. We report the results of observations performed over the past year.
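    The Gutenberg-Richter relation log10 N = a - b*M quantifies how much lowering the detection threshold increases the number of observable events. A minimal sketch with generic a and b values (illustrative assumptions, not estimates for the Korean Peninsula):

```python
def expected_count(magnitude, a=4.0, b=1.0):
    """Expected number of events with magnitude >= M per unit time,
    from the Gutenberg-Richter relation log10 N = a - b*M."""
    return 10 ** (a - b * magnitude)

n_m3 = expected_count(3.0)   # events above an older, higher threshold
n_m1 = expected_count(1.0)   # events above the lowered threshold
gain = n_m1 / n_m3           # with b = 1, two magnitude units = 100x events
```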

  1. Thresholds of Principle and Preference: Exploring Procedural Variation in Postgraduate Surgical Education.

    PubMed

    Apramian, Tavis; Cristancho, Sayra; Watling, Chris; Ott, Michael; Lingard, Lorelei

    2015-11-01

    Expert physicians develop their own ways of doing things. The influence of such practice variation in clinical learning is insufficiently understood. Our grounded theory study explored how residents make sense of, and behave in relation to, the procedural variations of faculty surgeons. We sampled senior postgraduate surgical residents to construct a theoretical framework for how residents make sense of procedural variations. Using a constructivist grounded theory approach, we used marginal participant observation in the operating room across 56 surgical cases (146 hours), field interviews (38), and formal interviews (6) to develop a theoretical framework for residents' ways of dealing with procedural variations. Data analysis used constant comparison to iteratively refine the framework and data collection until theoretical saturation was reached. The core category of the constructed theory was called thresholds of principle and preference and it captured how faculty members position some procedural variations as negotiable and others not. The term thresholding was coined to describe residents' daily experiences of spotting, mapping, and negotiating their faculty members' thresholds and defending their own emerging thresholds. Thresholds of principle and preference play a key role in workplace-based medical education. Postgraduate medical learners are occupied on a day-to-day level with thresholding and attempting to make sense of the procedural variations of faculty. Workplace-based teaching and assessment should include an understanding of the integral role of thresholding in shaping learners' development. Future research should explore the nature and impact of thresholding in workplace-based learning beyond the surgical context.

  2. Threshold Monitoring Maps for Under-Water Explosions

    NASA Astrophysics Data System (ADS)

    Arora, N. S.

    2014-12-01

    Hydro-acoustic energy in the 1-100 Hz range from under-water explosions can spread for thousands of miles because of the unique properties of the deep sound channel. This channel, also known as the SOFAR channel, exists almost everywhere in the Earth's oceans where the water is at least 1500 m deep. Once energy is trapped in this channel it spreads out cylindrically, and hence experiences very little loss, as long as there is an unblocked path from source to receiver. Other losses, such as absorption by chemicals in the ocean (mainly boric acid and magnesium sulphate), are also quite minimal at these low frequencies. It is not surprising, then, that the International Monitoring System (IMS) maintains a global network of hydrophone stations listening on this frequency range. The overall objective of our work is to build a probabilistic model to detect and locate under-water explosions using the IMS network. A number of critical pieces for this model, such as travel-time predictions, are already well known. We are extending the existing knowledge base by building the remaining pieces, most crucially the models for transmission losses and detection probabilities. With a complete model for detecting under-water explosions, we are able to combine it with our existing model for seismic events, NET-VISA. At the conference we will present threshold monitoring maps for explosions in the Earth's oceans. Our premise is that explosive sources release an unknown fraction of their total energy into the SOFAR channel, and this trapped energy determines their detection probability at each of the IMS hydrophone stations. Our threshold monitoring maps compute the minimum amount of energy at each location that must be released into the deep sound channel such that there is a ninety percent probability that at least two of the IMS stations detect the event. We will also present results of our effort to detect and locate hydro-acoustic events. In particular, we will show results from a recent under-water volcanic eruption at the Ahyi Seamount (April-May 2014), and compare our work with the current processing, both automated and human, at the IDC.
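The threshold-map computation described above can be sketched as follows. This is a minimal illustration, not the NET-VISA model: it assumes independent stations and a hypothetical logistic detection model, and the transmission-loss and noise values in the usage note are invented.

```python
import math

def p_at_least_two(probs):
    """P(at least 2 detections) given independent per-station probabilities."""
    p_none = 1.0
    for p in probs:
        p_none *= 1.0 - p
    p_exactly_one = 0.0
    for i, p in enumerate(probs):
        rest = 1.0
        for j, q in enumerate(probs):
            if j != i:
                rest *= 1.0 - q
        p_exactly_one += p * rest
    return 1.0 - p_none - p_exactly_one

def detect_prob(source_db, trans_loss_db, noise_db, slope=0.5):
    """Hypothetical logistic detection model on received level (dB)."""
    received = source_db - trans_loss_db
    return 1.0 / (1.0 + math.exp(-slope * (received - noise_db)))

def min_detectable_energy(trans_losses, noises, target=0.9):
    """Smallest source level (dB) with P(>=2 stations detect) >= target."""
    for source_db in range(0, 300):
        probs = [detect_prob(source_db, tl, n)
                 for tl, n in zip(trans_losses, noises)]
        if p_at_least_two(probs) >= target:
            return source_db
    return None
```

Evaluating `min_detectable_energy` on a grid of candidate source locations, each with its own per-station transmission losses, would yield one threshold-monitoring map cell per grid point.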

  3. 24 CFR 599.301 - Initial determination of threshold requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 3 2014-04-01 2013-04-01 true Initial determination of threshold requirements. 599.301 Section 599.301 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT...

  4. 24 CFR 594.7 - Other threshold requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 3 2013-04-01 2013-04-01 false Other threshold requirements. 594.7 Section 594.7 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT COMMUNITY FACILITIES JOH...

  5. 24 CFR 594.7 - Other threshold requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 3 2012-04-01 2012-04-01 false Other threshold requirements. 594.7 Section 594.7 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT COMMUNITY FACILITIES JOH...

  6. 24 CFR 599.301 - Initial determination of threshold requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 3 2013-04-01 2013-04-01 false Initial determination of threshold requirements. 599.301 Section 599.301 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT...

  7. 24 CFR 599.301 - Initial determination of threshold requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 3 2010-04-01 2010-04-01 false Initial determination of threshold requirements. 599.301 Section 599.301 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT...

  8. 24 CFR 594.7 - Other threshold requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 3 2010-04-01 2010-04-01 false Other threshold requirements. 594.7 Section 594.7 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT COMMUNITY FACILITIES JOH...

  9. 24 CFR 599.301 - Initial determination of threshold requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 3 2012-04-01 2012-04-01 false Initial determination of threshold requirements. 599.301 Section 599.301 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT...

  10. 24 CFR 594.7 - Other threshold requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 3 2014-04-01 2013-04-01 true Other threshold requirements. 594.7 Section 594.7 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT COMMUNITY FACILITIES JOHN...

  11. Cytomegalovirus (CMV) DNA quantification in bronchoalveolar lavage fluid of immunocompromised patients with CMV pneumonia.

    PubMed

    Beam, Elena; Germer, Jeffrey J; Lahr, Brian; Yao, Joseph D C; Limper, Andrew Harold; Binnicker, Matthew J; Razonable, Raymund R

    2018-01-01

    Cytomegalovirus (CMV) pneumonia causes major morbidity and mortality. Its diagnosis requires demonstration of viral cytopathic changes in tissue, entailing the risks of lung biopsy. This study aimed to determine CMV viral load (VL) thresholds in bronchoalveolar lavage fluid (BALF) for diagnosis of CMV pneumonia in immunocompromised patients. CMV VL in BALF was studied in 17 patients (83% transplant recipients) and 21 control subjects with and without CMV pneumonia, respectively, using an FDA-approved PCR assay (Cobas® AmpliPrep/Cobas TaqMan® CMV Test, Roche Molecular Systems, Inc.) calibrated to the WHO International Standard for CMV DNA (NIBSC: 09/162). Receiver operating characteristic curve analysis produced a BALF CMV VL threshold of 34,800 IU/mL with 91.7% sensitivity and 100.0% specificity for diagnosis of possible, probable, and proven CMV pneumonia in transplant patients, while a threshold of 656,000 IU/mL yielded 100% sensitivity and specificity among biopsy-proven cases. For all immunocompromised patients, a VL threshold of 274 IU/mL was selected. VL thresholds were also normalized to BALF cell count, yielding a threshold of 0.32 IU/10⁶ cells with 91.7% sensitivity and 90.5% specificity for possible, probable, and proven CMV pneumonia in transplant recipients. Monitoring CMV VL in BALF may be a less invasive method for diagnosing CMV pneumonia in immunocompromised patients. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
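Selecting a viral-load cutoff from a receiver operating characteristic analysis can be sketched with a simple Youden-index sweep. This is an assumption about the criterion (the study may have optimized differently), and the viral-load values in the test are invented for illustration.

```python
def youden_threshold(values, labels):
    """Pick the cutoff maximizing sensitivity + specificity - 1.

    values: viral loads (IU/mL); labels: 1 = CMV pneumonia, 0 = control.
    A value is called positive when it is >= the candidate cutoff.
    """
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if y == 1 and v >= t)
        fn = sum(1 for v, y in zip(values, labels) if y == 1 and v < t)
        tn = sum(1 for v, y in zip(values, labels) if y == 0 and v < t)
        fp = sum(1 for v, y in zip(values, labels) if y == 0 and v >= t)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t
```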

  12. Stress and strain analysis of contractions during ramp distension in partially obstructed guinea pig jejunal segments

    PubMed Central

    Zhao, Jingbo; Liao, Donghua; Yang, Jian; Gregersen, Hans

    2011-01-01

    Previous studies have demonstrated morphological and biomechanical remodeling in the intestine proximal to an obstruction. The present study aimed to obtain stress and strain thresholds to initiate contraction and the maximal contraction stress and strain in partially obstructed guinea pig jejunal segments. Partial obstruction and sham operations were surgically created in mid-jejunum of male guinea pigs. The animals survived 2, 4, 7, and 14 days, respectively. Animals not being operated on served as normal controls. The segments were used for no-load state, zero-stress state and distension analyses. The segment was inflated to 10 cmH2O pressure in an organ bath containing 37°C Krebs solution and the outer diameter change was monitored. The stress and strain at the contraction threshold and at maximum contraction were computed from the diameter, pressure and the zero-stress state data. Young’s modulus was determined at the contraction threshold. The muscle layer thickness in obstructed intestinal segments increased up to 300%. Compared with sham-obstructed and normal groups, the contraction stress threshold, the maximum contraction stress and the Young’s modulus at the contraction threshold increased whereas the strain threshold and maximum contraction strain decreased after 7 days obstruction (P<0.05 and 0.01). In conclusion, in the partially obstructed intestinal segments, a larger distension force was needed to evoke contraction likely due to tissue remodeling. Higher contraction stresses were produced and the contraction deformation (strain) became smaller. PMID:21632056
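The abstract computes stress and strain from diameter, pressure, and zero-stress-state data but does not give the formulas. A simplified thin-wall (Laplace) sketch is shown below; it is not the authors' full zero-stress-state formulation, and all input values in the test are illustrative (note 10 cmH2O is roughly 0.98 kPa).

```python
import math

def circumferential_stress_strain(pressure_kpa, outer_diameter_mm,
                                  wall_thickness_mm, zero_stress_circ_mm):
    """Thin-wall (Laplace) estimate of circumferential stress and strain.

    stress = P * r_mid / h, strain = (C - C0) / C0 (engineering strain),
    where C0 is the mid-wall circumference at the zero-stress state.
    """
    r_mid = (outer_diameter_mm - wall_thickness_mm) / 2.0  # mid-wall radius
    stress_kpa = pressure_kpa * r_mid / wall_thickness_mm
    circumference = 2.0 * math.pi * r_mid
    strain = (circumference - zero_stress_circ_mm) / zero_stress_circ_mm
    return stress_kpa, strain
```

Evaluating this at the outer diameter where contraction begins would give the contraction-threshold stress and strain; the ratio of a small stress increment to the corresponding strain increment at that point approximates the Young's modulus reported in the study.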

  13. Long-term ambient concentrations of total suspended particulates and oxidants as related to incidence of chronic disease in California Seventh-Day Adventists.

    PubMed

    Abbey, D E; Mills, P K; Petersen, F F; Beeson, W L

    1991-08-01

    Cancer incidence and mortality in a cohort of 6000 nonsmoking California Seventh-Day Adventists were monitored for a 6-year period, and relationships with long-term cumulative ambient air pollution were observed. Total suspended particulates (TSP) and ozone were measured in terms of numbers of hours in excess of several threshold levels corresponding to national standards as well as mean concentration. For all malignant neoplasms among females, risk increased with increasing exceedance frequencies of all thresholds of TSP except the lowest one, and those increased risks were highly statistically significant. For respiratory cancers, increased risk was associated with only one threshold of ozone, and this result was of borderline significance. Respiratory disease symptoms were assessed in 1977 and again in 1987 using the National Heart, Lung and Blood Institute respiratory symptoms questionnaire on a subcohort of 3914 individuals. Multivariate analyses which adjusted for past and passive smoking and occupational exposures indicated statistically significantly (p less than 0.05) elevated relative risks ranging up to 1.7 for incidence of asthma, definite symptoms of airway obstructive disease, and chronic bronchitis with TSP in excess of all thresholds except the lowest one but not for any thresholds of ozone. A trend association (p = 0.056) was noted between the threshold of 10 pphm ozone and incidence of asthma. These results are presented within the context of standards setting for these constituents of air pollution.
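The exposure metric described above, hours in excess of each threshold level, is straightforward to compute. A minimal sketch follows; the threshold values in the test are illustrative, not the national standards used in the study.

```python
def exceedance_hours(hourly_conc, thresholds):
    """Count hours with concentration above each threshold level.

    hourly_conc: sequence of hourly mean concentrations;
    thresholds: cutoff levels (e.g. TSP levels tied to standards).
    Returns a dict mapping each threshold to its exceedance count.
    """
    return {t: sum(1 for c in hourly_conc if c > t) for t in thresholds}
```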

  14. Long-term ambient concentrations of total suspended particulates and oxidants as related to incidence of chronic disease in California Seventh-Day Adventists.

    PubMed Central

    Abbey, D E; Mills, P K; Petersen, F F; Beeson, W L

    1991-01-01

    Cancer incidence and mortality in a cohort of 6000 nonsmoking California Seventh-Day Adventists were monitored for a 6-year period, and relationships with long-term cumulative ambient air pollution were observed. Total suspended particulates (TSP) and ozone were measured in terms of numbers of hours in excess of several threshold levels corresponding to national standards as well as mean concentration. For all malignant neoplasms among females, risk increased with increasing exceedance frequencies of all thresholds of TSP except the lowest one, and those increased risks were highly statistically significant. For respiratory cancers, increased risk was associated with only one threshold of ozone, and this result was of borderline significance. Respiratory disease symptoms were assessed in 1977 and again in 1987 using the National Heart, Lung and Blood Institute respiratory symptoms questionnaire on a subcohort of 3914 individuals. Multivariate analyses which adjusted for past and passive smoking and occupational exposures indicated statistically significantly (p less than 0.05) elevated relative risks ranging up to 1.7 for incidence of asthma, definite symptoms of airway obstructive disease, and chronic bronchitis with TSP in excess of all thresholds except the lowest one but not for any thresholds of ozone. A trend association (p = 0.056) was noted between the threshold of 10 pphm ozone and incidence of asthma. These results are presented within the context of standards setting for these constituents of air pollution. PMID:1954938

  15. Measurement of Photoelectron Emission Using Vacuum Ultraviolet Ray Irradiation

    NASA Astrophysics Data System (ADS)

    Okamura, Shugo; Iwao, Toru; Yumoto, Motoshige; Miyake, Hiroaki; Nitta, Kumi

    2009-01-01

    Satellites now play many roles, including communication, weather observation, astronomy observation, and space development, and in these applications a satellite requires long life and high reliability. However, at altitudes of several hundred kilometers, atomic oxygen (AO) is a destructive factor. With a density of about 10¹⁵ atoms/m³, AO also has high reactivity. As the satellite collides with AO, the surface materials of the satellite are degraded, engendering surface roughness and oxidation. Accordingly, it is necessary to monitor the surface conditions. In this study, the photoemission characteristics of several materials, such as metals, glasses, and polymers, are measured using a deuterium lamp and band-pass filters. The threshold energy for photoemission and the quantum efficiency were evaluated from those measurements. Consequently, for the investigated materials the threshold energies for photoelectron emission were found to be 4.9-5.7 eV. The quantum efficiency of metals is about 100 times higher than that of the other samples. The quantum efficiency of PS, which includes a benzene ring, is several times higher than that of either PP or PTFE, suggesting that deteriorated materials emit large amounts of photoelectrons.
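The threshold-energy measurement above amounts to converting each filtered wavelength to a photon energy and checking whether electrons are ejected. A sketch of that bookkeeping follows; the conversion constant hc ≈ 1239.84 eV·nm is standard physics, while the wavelengths and thresholds in the test are illustrative, not the paper's data.

```python
PLANCK_HC_EV_NM = 1239.84  # h*c in eV·nm

def photon_energy_ev(wavelength_nm):
    """Photon energy in eV for a given vacuum-ultraviolet wavelength."""
    return PLANCK_HC_EV_NM / wavelength_nm

def quantum_efficiency(emitted_electrons, incident_photons):
    """Photoelectrons emitted per incident photon."""
    return emitted_electrons / incident_photons

def above_threshold(wavelength_nm, threshold_ev):
    """True if photons at this wavelength can eject photoelectrons
    from a material with the given threshold energy."""
    return photon_energy_ev(wavelength_nm) >= threshold_ev
```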

  16. An Integrated 0-1 Hour First-Flash Lightning Nowcasting, Lightning Amount and Lightning Jump Warning Capability

    NASA Technical Reports Server (NTRS)

    Mecikalski, John; Jewett, Chris; Carey, Larry; Zavodsky, Brad; Stano, Geoffrey; Chronis, Themis

    2015-01-01

    This project uses satellite-based methods that provide accurate 0-1 hour convective initiation (CI) nowcasts, and relies on the proven coupling of satellite and radar fields in the Corridor Integrated Weather System (CIWS; operated and developed at MIT Lincoln Laboratory), to monitor for first-flash lightning initiation (LI) and later-period lightning trends as storms evolve. The work enhances IR-based methods within the GOES-R CI Algorithm (a given cumulus cloud must meet specific thresholds before it is considered to have an increased likelihood of producing lightning in the next 90 min) to forecast LI; integrates GOES-R CI and LI fields with radar thresholds (e.g., the first echo of 40 dBZ or greater at the -10 C altitude) and NWP model data within the WDSS-II system for LI events from new convective storms; tracks ongoing lightning using Lightning Mapping Array (LMA) and pseudo-Geostationary Lightning Mapper (GLM) data to assess per-storm lightning trends (e.g., as tied to lightning jumps) and outline threat regions; and evaluates the ability to produce LI nowcasts through a "lightning threat" product, with feedback from National Weather Service forecasters on its value as a decision support tool.

  17. An Explosion Aftershock Model with Application to On-Site Inspection

    NASA Astrophysics Data System (ADS)

    Ford, Sean R.; Labak, Peter

    2016-01-01

    An estimate of aftershock activity due to a theoretical underground nuclear explosion is produced using an aftershock rate model. The model is developed with data from the Nevada National Security Site, formerly known as the Nevada Test Site, and the Semipalatinsk Test Site, which we take to represent soft-rock and hard-rock testing environments, respectively. Estimates of expected magnitude and number of aftershocks are calculated using the models for different testing and inspection scenarios. These estimates can help inform the Seismic Aftershock Monitoring System (SAMS) deployment in a potential Comprehensive Test Ban Treaty On-Site Inspection (OSI), by giving the OSI team a probabilistic assessment of potential aftershocks in the Inspection Area (IA). The aftershock assessment, combined with an estimate of the background seismicity in the IA and an empirically derived map of threshold magnitude for the SAMS network, could aid the OSI team in reporting. We apply the hard-rock model to a M5 event and combine it with the very sensitive detection threshold for OSI sensors to show that tens of events per day are expected up to a month after an explosion measured several kilometers away.
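The abstract does not give the functional form of the aftershock rate model; a common choice for this kind of estimate is the modified Omori law, sketched here. The parameters k, c, and p in the test are illustrative, not the fits derived from the Nevada or Semipalatinsk data.

```python
def omori_rate(t_days, k, c, p):
    """Modified Omori law: aftershock rate k / (t + c)**p at time t (days)."""
    return k / (t_days + c) ** p

def expected_count(t1, t2, k, c, p, dt=0.01):
    """Expected number of aftershocks in [t1, t2] days, by numerically
    integrating the rate with the midpoint rule."""
    n, t = 0.0, t1
    while t < t2:
        step = min(dt, t2 - t)
        n += omori_rate(t + step / 2.0, k, c, p) * step
        t += step
    return n
```

In a monitoring context, `expected_count` would be evaluated only for events above the SAMS network's detection-threshold magnitude, which is what connects the rate model to the reported "tens of events per day".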

  18. Decline of human tactile angle discrimination in patients with mild cognitive impairment and Alzheimer's disease.

    PubMed

    Yang, Jiajia; Ogasa, Takashi; Ohta, Yasuyuki; Abe, Koji; Wu, Jinglong

    2010-01-01

    There is a need to differentiate patients with mild cognitive impairment (MCI) and Alzheimer's disease (AD) from normal-aged controls (NC) in the field of clinical drug discovery. In this study, we developed a tactile angle discrimination system and examined whether the ability to discriminate tactile angles differed between patients with MCI or AD and the NC group. Thirty-seven subjects were divided into three groups: NC individuals (n=14); MCI patients (n=10); and probable AD patients (n=13). All subjects were asked to differentiate the relative sizes of the reference angle (60°) and one of eight comparison angles by passive touch. The accuracy of angle discrimination was measured and the discrimination threshold was calculated. We found significant differences in the angle discrimination thresholds of AD patients compared to the NC group. Interestingly, we also found that the ability of MCI patients to discriminate tactile angles was significantly lower than that of the NC group. This is the first study to report that patients with MCI and AD have substantial performance deficits in tactile angle discrimination compared to NC individuals. This finding may provide a monitoring and therapeutic approach for AD diagnosis and treatment.
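The abstract does not specify how the discrimination threshold was calculated from the accuracy data. One simple approach, sketched below, is to interpolate the accuracy-versus-angle-difference curve at a criterion level; the 75%-correct criterion and the data in the test are assumptions for illustration, not the study's method or results.

```python
def discrimination_threshold(angle_diffs, accuracies, criterion=0.75):
    """Angle difference (degrees) at which accuracy first crosses the
    criterion, by linear interpolation between tested comparison angles.

    Returns None if accuracy never rises through the criterion."""
    pairs = sorted(zip(angle_diffs, accuracies))
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if y0 < criterion <= y1:
            return x0 + (criterion - y0) * (x1 - x0) / (y1 - y0)
    return None
```

A larger returned value means a coarser sense of angle, which is the direction of the deficit reported for the MCI and AD groups.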

  19. An explosion aftershock model with application to on-site inspection

    DOE PAGES

    Ford, Sean R.; Labak, Peter

    2015-02-14

    An estimate of aftershock activity due to a theoretical underground nuclear explosion is produced using an aftershock rate model. The model is developed with data from the Nevada National Security Site, formerly known as the Nevada Test Site, and the Semipalatinsk Test Site, which we take to represent soft-rock and hard-rock testing environments, respectively. Estimates of expected magnitude and number of aftershocks are calculated using the models for different testing and inspection scenarios. These estimates can help inform the Seismic Aftershock Monitoring System (SAMS) deployment in a potential Comprehensive Test Ban Treaty On-Site Inspection (OSI), by giving the OSI team a probabilistic assessment of potential aftershocks in the Inspection Area (IA). The aftershock assessment, combined with an estimate of the background seismicity in the IA and an empirically derived map of threshold magnitude for the SAMS network, could aid the OSI team in reporting. Here, we apply the hard-rock model to a M5 event and combine it with the very sensitive detection threshold for OSI sensors to show that tens of events per day are expected up to a month after an explosion measured several kilometers away.

  20. Weight monitoring system for newborn incubator application

    NASA Astrophysics Data System (ADS)

    Widianto, Arif; Nurfitri, Intan; Mahatidana, Pradipta; Abuzairi, Tomy; Poespawati, N. R.; Purnamaningsih, Retno W.

    2018-02-01

    We propose a weight monitoring system using a load cell sensor for newborn incubator applications. The weight sensing system consists of a load cell, a signal conditioning circuit, and an Arduino Uno R3 microcontroller. The performance of the sensor was investigated using various weights from 0 up to 3000 g. Experimental results showed that the system has a small error of 4.313%, with a threshold and resolution of 12.5 g. Compared to a typical baby scale available in the local market, the proposed system has lower error and hysteresis values.
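A load-cell pipeline like the one described typically needs a linear calibration from raw ADC counts to grams, plus an error check against reference masses. A minimal sketch follows; the raw counts and reference masses in the test are invented for illustration, not values from the paper.

```python
def calibrate_two_point(raw_zero, raw_full, full_scale_g):
    """Two-point linear calibration for a load-cell ADC reading.

    raw_zero: ADC count with no load; raw_full: ADC count at full scale;
    full_scale_g: known full-scale mass in grams. Returns a converter."""
    scale = full_scale_g / (raw_full - raw_zero)
    return lambda raw: (raw - raw_zero) * scale

def percent_error(measured_g, true_g):
    """Relative error (%) of a reading against a reference mass."""
    return abs(measured_g - true_g) / true_g * 100.0
```

The reported 4.313% figure would come from applying `percent_error` across the 0-3000 g test weights and summarizing, and the 12.5 g threshold/resolution is the smallest mass change the calibrated ADC output resolves.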
